Tag Archives: SEO Expert Montreal

8 Steps to Optimize Your Blog Post

18 May

If you’re writing and publishing blog posts but not taking the few extra steps to optimize them and align them with an overall keyword strategy, you’re not leveraging the full potential of that content, and you’re not making your website pages visible to search engines.

Blog Optimization: Back to the Basics


Content is a form of online currency that is crucial to any business’ online marketing. With consumers relying on search engines for product research and reviews, content is key for ranking among those search results because search engines largely determine the quality and relevancy of the Internet’s countless web pages by looking at the text on those pages.

Just having content, even great content, on your company’s website isn’t enough to grab the attention of search engines. Businesses must leverage this content using search engine optimization (SEO) tactics. Maintaining a corporate blog is a good SEO tactic that allows for rapid content creation without the constraints of website architecture and web development teams.

Here’s how you can optimize your blog post in eight steps.

1. Find a Compelling Subject

One method for differentiating your content from all the other writing available across the web is to offer a fresh perspective and a unique angle on a given subject matter. If you haven’t spent time working through this step, don’t bother with the rest of the optimization process.

2. Conduct Keyword Research

This step is the perfect litmus test for determining whether your blog post topic is aligned with what people are looking for. When developing your focused keyword list around the blog post topic, make sure to do a sanity check and confirm that consumers are actually using these keywords to search for your product/service.

Save yourself time in the long run and filter out visitors who are unlikely to buy your product by ensuring your keywords align with the purchasing intent of your target audience.

3. Select Keywords

To rank well for a given keyword phrase, designate no more than two or three keywords per website page. Limit your blog post to one primary keyword, plus two or three variations of that keyword (e.g., optimize blog post, optimize blog, blog post optimize, blog optimize).
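The variation list above can be brainstormed mechanically before you validate each candidate against real search-volume data. As a rough illustration (the function below is my own sketch, not a tool from this post), you can enumerate word-order variations of a primary phrase:

```python
from itertools import permutations

def keyword_variations(phrase, min_words=2):
    """Generate word-order variations of a keyword phrase,
    from partial combinations up to the full phrase."""
    words = phrase.split()
    variations = set()
    for n in range(min_words, len(words) + 1):
        for combo in permutations(words, n):
            variations.add(" ".join(combo))
    return sorted(variations)

print(keyword_variations("optimize blog post"))
```

Treat the output only as a starting point: keyword research (step 2) should still confirm which variations people actually search for.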

4. Track Keyword Ranking Trends

Make sure your focus keyword is worth optimizing for. If there are only 10 searches for a given keyword per month, it might not be worth your while.

Look at how your target keyword phrase is trending, in terms of global monthly searches, how competitive the search term is, and whether any of your competitors or one of your pages are already ranking for it.

5. Optimize the Page

Page optimization is crucial for boosting the visibility of your blog post in search engines. After you create the content, insert your keyword phrase throughout the blog post in the specific locations where search engines look for information about your page: the URL, the title tag, H1 and H2 headings, emphasized text in the body of the post, and alt attributes on images.

From here on out, every time you mention this specific keyword phrase elsewhere on your website, add an internal link to its corresponding blog page. SEO plugins are also available for certain blog platforms, like WordPress’s popular “All in One SEO Pack,” to help you control these SEO elements.
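As a quick sanity check of those placements, a small script can report which locations actually contain your phrase. This is only an illustrative sketch built on Python’s standard library (the sample page and URL are made up), not a tool mentioned in this post:

```python
from html.parser import HTMLParser

class PlacementParser(HTMLParser):
    """Collects the text of SEO-relevant elements (title, h1, h2)
    and the alt text of images from an HTML page."""
    def __init__(self):
        super().__init__()
        self._open = []  # stack of tags we are currently inside
        self.fields = {"title": "", "h1": "", "h2": "", "alt": ""}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2"):
            self._open.append(tag)
        elif tag == "img":
            self.fields["alt"] += " " + (dict(attrs).get("alt") or "")

    def handle_endtag(self, tag):
        if self._open and self._open[-1] == tag:
            self._open.pop()

    def handle_data(self, data):
        if self._open:
            self.fields[self._open[-1]] += data

def keyword_placement(html, url, phrase):
    """Report which on-page locations contain the keyword phrase."""
    parser = PlacementParser()
    parser.feed(html)
    phrase = phrase.lower()
    report = {loc: phrase in text.lower() for loc, text in parser.fields.items()}
    report["url"] = phrase.replace(" ", "-") in url.lower()
    return report

page = """<html><head><title>8 Steps to Optimize Blog Post Performance</title></head>
<body><h1>Optimize Blog Post Content</h1>
<img src="chart.png" alt="how to optimize blog post"></body></html>"""
print(keyword_placement(page, "https://example.com/optimize-blog-post", "optimize blog post"))
# → {'title': True, 'h1': True, 'h2': False, 'alt': True, 'url': True}
```

A `False` in the report points you straight at the location that still needs the phrase.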

6. Syndicate via Social Channels

Syndicate your blog post externally by sharing it across your social networks like Twitter and Facebook. Additionally, post comments with your blog post link on relevant, external articles to attract clicks through to your site.

Make sure to use the blog post’s target keywords in your syndication via tweets and Facebook status updates. Help your audience share your content as quickly and easily as possible by including social sharing buttons on your blog post pages like the tweet, Facebook Like, LinkedIn Share, and AddThis buttons.

Consider adding Facebook’s new comments plugin to drive engagement and sharing. Also, make your content available via RSS feed, so subscribers can regularly view your latest content on their news reader of choice.

7. Find Top Links

Inbound links are essential for boosting the search engine rank of a website page, and even a handful of relevant links will help you rank better. Use a link suggestion tool to identify and track high-quality, relevant websites that you can reach out to with your blog post and ask for a link back to your page.

8. Track Keyword Performance

Monitor your blog post on a regular basis, in terms of rank, visits, and leads from its given keyword phrase over time. By checking back on your progress, you can understand what about your content is resonating with your audience and what to improve upon. Evaluate what worked and what didn’t, then repeat the successful tactics with your next piece of content.

Summary

SEO is a gradual process, but by just setting aside an hour a week, you can make a lot of progress over time.

While many view paid search as a quick and easy way to drive traffic without a large time investment, once you switch it off, you lose that traffic. SEO, on the other hand, when done well, can have a long-lasting, sustainable impact for your website.


The top 20 SEO myths you should be aware of

17 May

In the SEO business, nothing can be taken for granted. Every year new techniques and methods are introduced, new tools become available, and several algorithmic updates take place. The circumstances of this uncertain and fast-changing environment give birth to several myths and misconceptions. From time to time a few of those myths get confirmed (such as the effect of social media on SEO), while most of them get debunked. In this article we will discuss the 20 most popular SEO myths and explain why they are nothing more than misconceptions.

1. Keep a High Keyword Density

Several SEO professionals suggest that having a high keyword density for the main keywords of the page will help the rankings. They treat this as an important rule or target and, as a result, try to stuff these words into the text. Of course, by doing so they produce really unattractive SEO copy; not only do they fail to help their rankings, but they also irritate the readers of their websites.

What one should do in order to improve rankings is use different combinations of the main keywords in the text. This increases the odds of ranking for other similar terms or combinations without affecting the quality of the text. Note that this technique will increase the keyword density of the important terms in a natural way. Nevertheless, its primary goal is not to increase the density but to incorporate into the text the most common keyword combinations that users are likely to search for. A more accurate metric for determining whether a given keyword is optimized is the KeywordRank, which takes into account several different parameters such as the position of the keyword, its usage, its relevancy, and more. You can check the KeywordRank of your targeted terms by using the Keyword Analyzer tool.
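For reference, plain keyword density is easy to compute yourself. The snippet below is a minimal illustration of the basic metric only (it is not the KeywordRank formula of the Keyword Analyzer tool mentioned above, which also weighs position, usage, and relevancy):

```python
import re

def keyword_density(text, phrase):
    """Density of a keyword phrase as a percentage of total words:
    (occurrences of the phrase * words in the phrase) / total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

text = ("Optimizing a blog post starts with the blog post title. "
        "A well optimized blog post also repeats its topic naturally.")
print(round(keyword_density(text, "blog post"), 1))  # → 30.0
```

A figure that high in real copy is exactly the kind of stuffing the myth encourages; use the metric to spot over-optimization, not as a target to hit.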

2. PageRank is everything

For years, several SEO professionals considered PageRank the most important factor affecting the search results. In many cases some of them confused the real PageRank values with the toolbar values, and they focused primarily on increasing PageRank in order to improve their rankings. Nevertheless, as we mentioned in a previous article, PageRank is not the only signal that Google uses. It is just one of the metrics, and in some types of search it carries very little weight (news search, local search, real-time search, etc.).

3. PageRank is useless/irrelevant

In the last couple of years, more and more SEOs have started to question whether PageRank affects SEO at all, mainly because it does not appear to be highly correlated with high rankings. Of course, as we discussed in the article “Is Google PageRank still important in Search Engine Optimization?”, PageRank is a signal: it is a metric that measures the quality/authority of the page, and it affects indexing. PageRank should be neither worshipped nor ignored.

4. Submit every page on Google & Bing

Submitting every page of your website to Google and Bing by using their submission forms will neither speed up indexing nor improve your rankings. If you want to reduce the indexing time, add links from high-traffic/authority pages, use XML and HTML sitemaps, and improve your internal link structure. Submitting all your pages one by one will neither help nor hurt your rankings.
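An XML sitemap of the kind recommended above is simple enough to generate with the standard library. A minimal sketch (the URLs are placeholders; a production sitemap would usually also carry `lastmod` dates and be referenced from robots.txt):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/",
                     "https://example.com/blog/optimize-blog-post"]))
```

Once generated, the file lets crawlers discover every page in one fetch, which is what actually speeds up indexing, rather than one-by-one form submissions.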

5. Meta Keywords help the rankings

The keywords meta tag was important for the first meta-search engines, which did not have the computing power to analyze and store the entire page. Since then, search engines have evolved and are able to extract the important keywords of a page without using the meta keywords. Another reason search engines stopped using this tag is that many people were adding too many irrelevant terms to it. Google has made it clear many times in the past that they do not use meta keywords at all, so this tag will not help you improve your rankings.

6. Duplicate Content leads to Bans

Several people suggest that having a lot of duplicate content on a website can lead to bans. Fortunately, this is not true. Duplicate content can cause serious problems: it can affect the number of pages that get indexed, the PageRank distribution within the website, and consequently the rankings; nevertheless, Google will not ban your website for it. You can find more in our previous article “Duplicate Content: the effects on Search Engine Rankings”.

7. Nofollowing links improves the PageRank distribution

In the past, by using the rel=nofollow attribute, we could manipulate the PageRank distribution of a website and perform PageRank sculpting. Nevertheless, an algorithmic update of Google changed the way rel=nofollow operates: the PageRank that would have flowed through nofollowed links now simply evaporates instead of being redistributed to the followed links. Thus, as we discussed in the article “The PageRank sculpting techniques and the nofollow issue”, the rel=nofollow attribute leads to the evaporation of link juice. If you want to retain control over your PageRank and avoid the evaporation, you can use the PageRank sculpting technique that we have proposed in the past.

8. All links carry the same weight

In the original PageRank formula published by Page and Brin, all the links on a webpage carried the same amount of weight. This has changed over the years: all the major search engines now take into account not only the position of a link on the page, but also its relevancy and other characteristics that affect its CTR (font size, color, etc.). As a result, footer links do not carry as much weight as links that appear at the top of the page or inside the main text.

9. HTML Validation helps SEO

Lots of webmasters used to think that validating their HTML code would improve their SEO campaigns. Fortunately or unfortunately, this is not true. HTML validation does not affect search engine rankings and is not used as a signal. Nevertheless, if your HTML code is so bad that parts of the page do not appear in the browser, then search engines might have problems extracting your text. So keep in mind that producing valid HTML code is good practice, but in general minor mistakes will not hurt your SEO.

10. All the Nofollowed links do not help

Google typically says that they drop from their link graph all the links marked with nofollow, and thus such links do not carry any weight. Nevertheless, not all of those links are irrelevant for SEO. For example, Twitter and Facebook links are nofollowed; nevertheless, as we discussed in the article “Twitter & Facebook links affect SEO on Google and Bing”, Google and Bing use that data as a signal. So it makes sense to say that not all nofollowed links are irrelevant for SEO and that the major search engines might in some cases consider them during their analysis.

11. Link all pages to all pages

Some people have suggested that linking all pages to all pages can improve indexing or rankings, so they use too many secondary menus or footer links. Nevertheless, by doing so, you dramatically increase the number of outgoing links per page and you do not pass enough PageRank to the important webpages of your site. Typically, websites should use a tree-like structure that enables them to focus on the most important pages. More information on this topic can be found in the article “Link Structure: Analyzing the most important methods”.

12. Robots.txt can help solve Duplicate Content issues

The robots.txt file can be used to prevent search engines from parsing particular pages or segments of a website. As a result, some SEOs have tried to use it as a way to reduce the amount of duplicate content on their websites. Nevertheless, by blocking these pages you prevent Google from crawling them, but you do not improve the link structure that causes the problem. Since the problem remains unsolved, the negative effects on the rankings continue to exist.
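Python’s standard library makes the distinction easy to demonstrate. In this sketch (the `/print/` path is a hypothetical duplicate-content section, not from this post), the robots.txt rule stops compliant crawlers from fetching the duplicate page, but the internal links that point at it still exist:

```python
import urllib.robotparser

# Hypothetical robots.txt that tries to hide duplicate "print view" pages.
rules = """User-agent: *
Disallow: /print/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Compliant crawlers will skip the duplicate URL, but any links pointing
# at it remain in the site's link graph; the structural problem is untouched.
print(rp.can_fetch("*", "https://example.com/print/article-1"))  # False
print(rp.can_fetch("*", "https://example.com/article-1"))        # True
```

In other words, robots.txt changes what gets crawled, not how PageRank flows through your internal links, which is why it cannot fix a duplicate-content problem at its source.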

13. Low Quality Links can hurt the rankings

Several SEOs have stated in the past that acquiring low-quality links from link farms can actually hurt a website’s SEO campaign. If this were true, people would be able to negatively influence their competitors’ websites simply by pointing low-quality links at them. Fortunately, Google will not ban a website for getting low-quality links. Nevertheless, in extremely rare cases, Google has taken measures against websites that tried systematically to manipulate search engine results by artificially increasing the number of their backlinks.

14. Low Quality Links can help the rankings

Major search engines use several methods to detect paid or low-quality links, and they exclude them from their link graphs. The recent Panda update (or Farmer update) made it even clearer that acquiring links from low-quality websites and link farms that contain a lot of duplicate or scraped content will not help you achieve high rankings.

15. The Page Title/description will certainly appear in the snippet

Several webmasters believe that the titles and META descriptions they use on their pages are always the ones that will appear in the snippet of the search engine results. This is not always true, since search engines can replace the snippet title and description with something more relevant to the user’s query. At other times, search engines can even use text that does not exist on the landing page; usually this text has been retrieved from external sources such as the DMOZ directory or the anchor text of incoming links.


16. Pages blocked with Robots.txt will not appear in SERPs

Another common mistake that many SEOs make is using robots.txt to ensure that a particular page will not appear in the SERPs. Nevertheless, such a page can still appear in the search results if it is linked from other pages. As we discussed in the article “The robots.txt, META-robots & rel=nofollow and their impact on SEO”, the proper way to ensure that a page will not appear in the search results is to use the “noindex” meta-robots directive.
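To make the distinction concrete: the directive that keeps a page out of the results is a `<meta name="robots" content="noindex">` tag in the page’s HTML, not a robots.txt rule. As an illustrative sketch (the sample pages are made up), a small detector for that tag:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags pages carrying a <meta name="robots" content="...noindex..."> tag,
    which (unlike a robots.txt block) tells search engines to drop the page
    from their index even if other sites link to it."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = {k.lower(): (v or "").lower() for k, v in attrs}
            if a.get("name") == "robots" and "noindex" in a.get("content", ""):
                self.noindex = True

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
detector = NoindexDetector()
detector.feed(page)
print(detector.noindex)  # → True
```

Note the subtlety: for the crawler to see this tag, the page must not be blocked in robots.txt, so combining a Disallow rule with a noindex tag is self-defeating.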


17. SEO is all about Google

Google might still be the market leader in search; nevertheless, we should not forget that Bing and Yahoo hold more than 30% of the total market. Search engine optimization techniques do not focus on optimizing websites only for Google: they aim to increase organic traffic from all the search engines and to develop websites that are attractive both to users and to search engines. Some methods might work better for Google, but a solid SEO campaign should be effective across all the major search engines.

18. SEO requires a long time to return positive results

SEO is neither a process that delivers results overnight nor a one-time activity; achieving good results requires effort and time. Nevertheless, positive results can become visible relatively fast. Of course, a new website will not immediately achieve good rankings for highly competitive terms, but it should be able to rank for the more targeted, long-tail keywords.

19. SEO is a spammy/unethical technique

SEO is an online marketing technique/process that can help websites increase their organic traffic, their exposure, and their sales. To achieve this, SEO professionals focus not only on the technical characteristics of the website but also on the content, the design, and external factors. SEO is a marketing tool just like advertising; if you consider SEO unethical, you should feel the same about advertising.

20. “SEO is dead”

Every year a major update takes place in the search engine business, and several bloggers or journalists suggest that SEO is dead. Nevertheless, as you can see, SEO is alive and kicking, and it is constantly evolving along with the search engines. Certainly the techniques have changed a lot: new tools and methods become available while others are no longer used. SEO is a relatively new form of marketing, and it will exist for as long as search engines exist. As Tad Szewczyk wrote a year ago, “SEO is dead” is dead.

What other SEO myths have you heard? Leave your comment below!
