Dispelling the Myths of SEO
I don’t blame webmasters, marketers, or even casual bloggers for having such a hard time grasping the one true testament of SEO (if that really even exists), because things change on a daily, or even hourly, basis. Plus, you have so-called experts touting tall tales of black-hat SEO, and how “in my day I could stuff keywords until the cows came home and rank 1st overall in Google.” The basics are relatively straightforward, and fundamentally unchanged over the past few years; it’s the tactics that have changed. So here is a run-through of the latest resources I could find to dispel the myths and set the record straight.
And sayeth the Matt Cutts: “Thou Shalt have Good SEO”
Basic tenets of SEO that have held up to this day are simple. Good search engine optimization is based on three things only:
- Site mechanics, that is, the actual server based technologies, software configuration, back-end workings of your site/page.
- Content, the organization of content, the relevance of content, and the popularity of that content.
- Link building strategies.
That’s it. You keep those three things in mind when you’re creating or modifying your sites and pages, and you should be good for an eternity. However, if in doubt regarding a change or tactic, weigh your decision against the mission statements that representatives from the top search engines have repeated in popular talks:
Google’s Corporate Mission: To organize the world’s information and make it universally accessible and useful.
Yahoo’s Corporate Mission: To connect people to their passions, communities, and the world’s knowledge.
If anything you do to your website goes against those corporate missions, you’re probably not going to land in an engine’s good books, or high rankings for that matter. You see how words like “community” and “accessible” and “useful” map right back to the three tenets of “links” and “mechanics” and “content”? That is not a happy coincidence.
Deep Dive into the Tenets of SEO
Site mechanics should be relatively straightforward to understand from a “working” or “not working” perspective. If your site fails to load half the time, and you recognize that fact, it’s obvious you need a better web hosting company, or IT Director. So here are some of the good and bad decisions when it comes to site mechanics:
- Good: real server up-time as close to perfect as possible, although the search engines do recognize that servers go down every once in a while, and are well-trained to come back and check for content. You see, downtime isn’t always bad, for instance when a webmaster has to take down their site to recover from a malicious hacker, or for routine maintenance. Your SEO can recover from downtime.
- Good: front-end to back-end reliability, because the last thing you need is for your front-end to make calls to a database that is slow or down most of the time.
- Bad: back-end problems that, as is often the case, generate error pages delivering status code “200 – OK” (more on status codes) when a database query times out. That status basically tells a search engine: this page came out just fine, and it’s all about database errors and timeouts. Your page is probably not about database errors or timeouts; failures should return a 5xx status instead.
- Bad: Caching issues for high-traffic sites may not always allow visitors or search engines onto a page, or they display old versions of a page sometimes hours after they’re updated. If search engines are sending you traffic for keywords, make sure you can deliver a page they deem is worthy of that traffic.
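To make the status-code point concrete, here is a minimal Python sketch (the `render_page` helper and its signature are hypothetical, not something from the post): when the database is down, answer with 503 Service Unavailable rather than 200, so crawlers retry later instead of indexing the error text as your page’s content.

```python
from http import HTTPStatus

def render_page(db_ok: bool) -> tuple[int, str]:
    """Hypothetical handler for a database-backed page."""
    if not db_ok:
        # A 5xx status tells crawlers "come back later" instead of
        # "index this error message as the page's real content".
        return HTTPStatus.SERVICE_UNAVAILABLE, "Database temporarily unavailable"
    return HTTPStatus.OK, "<html>...the real content...</html>"
```

Returning 503 (rather than a generic 500) also hints that the outage is temporary, which matches the point above that engines are trained to come back and re-check.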
We’ve all heard the mantra “content is king”, but have you heard the newer mantra that adds context? It goes something like this: “relevant content is godly.” Yes, indeed, good content is king, but if it isn’t relevant, compelling or structured well, you’re probably going to lose out in the long-term race for good search results.
Now that we’re on the subject: what exactly is relevance? No one really knows, because it’s one of those things that depends on a lot of moving parts in a search engine’s ranking algorithm. Although no one can know for sure, we have some ideas of what makes up good content and bad content:
- Good: logical page structure, navigation, and content presentation go a long way in establishing good search rankings. This includes the use of descriptive yet to-the-point title tags, sectional divisions using heading tags (h1, h2, h3), and paragraphed sections of text that relate to both.
- Good: unique perspectives, explanations, and compelling copy are used to enhance the subject, while the use of other media can add stickiness from an engagement perspective. Tasteful use of illustrations, photos, audio and video help rankings because search engines use web analytics too. Engines know when people bounce back and click the next search. Did your page cause visitors to bounce? Yep, there’s probably something unappealing about your page, so engines are less likely to rank it high.
- Mediocre: meta keywords are not used anymore by top tier search engines (smaller, 3rd tier engines may), so you can use them, but don’t sweat it if you forget or can’t come to a consensus on what they should be. Meta descriptions, however, are being used more often by Google, as webmasters become more savvy at creating compelling copy to include. In the end, it’s still up to Google to decide whether they use your meta description, or whether they compile something using their algorithm.
- Bad: keyword stuffing. Ignore what people tell you about magical keyword densities of 2–5% and using synonyms and differently worded phrases. If you have to check density, you’re not being honest about your content. That being said, if you have greater than 5% keyword density, your page will probably get flagged. Instead of worrying about density, give your page to someone who has no problem calling it like it is; they’ll tell you whether the copy is written well, or whether it sounds stuffed or forced.
- Bad: duplicate content. Obviously, you shouldn’t be copying content verbatim from somewhere else, although there are a few exceptions, such as newswire sites that aggregate content. Dupes are bad for three reasons. One, the content may be copyrighted, or carry a Creative Commons license of some sort. Two, it’s just not appealing to have hundreds of the same page in a search engine. Three, duplicate pages can exhaust the valuable crawl time search engine spiders allocate to your site. They’re there to crawl new content, so don’t bog them down with duplicates.
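The advice above is to not chase a magic density number, but if you do want a quick sanity check, the arithmetic is simple: matching words divided by total words. A rough sketch (the `keyword_density` helper is illustrative, not a tool any engine publishes):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` equal to `keyword`, case-insensitive."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# "seo" appears 2 times out of 6 words -> about 0.33, i.e. 33%
density = keyword_density("seo tips and more seo tips", "seo")
```

If a number like this is the only way you can tell your copy reads naturally, that is the real warning sign.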
And finally, the last but certainly not least of the three tenets: link building. Search engines could theoretically guess which content is best given search user behavior, keyword analytics, and so on, but that would leave little motivation for webmasters to innovate. Links are a game-changer because they act as votes cast by other webmasters, users, and organizations for your site and its pages.
Here’s where it gets complicated and mathematical. To solve the problem of link farms and link spam, search engines attribute greater voting power to sites they deem trustworthy. Many of us have heard the term PageRank, and although Matt Cutts would prefer everyone forget that term, the basic principle stands. A link from a site with PageRank of 8 is better than 100 links from sites with PageRank 2. PageRank is logarithmic, and therefore PageRank 2 is 10 times better than PageRank 1, PR3 is 100 times better than PR1, etc.
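The logarithmic claim can be stated as a bit of arithmetic. Assuming, as the paragraph above does, that toolbar PageRank is roughly the base-10 logarithm of some underlying raw score (Google has never confirmed the base, so treat it as an illustration), the raw-score ratio between two PR values falls out directly:

```python
def raw_score_ratio(pr_a: float, pr_b: float, base: float = 10.0) -> float:
    """If PR = log_base(raw score), then raw_a / raw_b = base ** (pr_a - pr_b).

    The base-10 value is this post's working assumption, not a published fact.
    """
    return base ** (pr_a - pr_b)

# Under that assumption, a PR3 page's raw score is 100x a PR1 page's,
# and a PR2 page's is 10x a PR1 page's.
```

This is why one link from a highly trusted page can outweigh piles of links from low-trust pages.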
OMG my head hurts from math…
Without getting into logarithmic scales and calculus or what makes up the myriad of factors pundits believe contribute to PageRank, let’s review a few good and bad policies of link building:
- Good: if you’re a small business and have happy customers that have somewhat large and well-known websites, ask for a testimonial that links back to your site. When doing so, be sure to ask politely whether you can specify anchor text and deep-site landing pages. That way you can start building relevant high-value links from trustworthy sources to pages in your site. This is the story of great link juice.
- Good: specify anchor text that you want to start ranking for in search queries. Do you really want a link to your site to start competing for keywords such as “learn more” or “here” or “visit their site”? Last I checked, competition on those keywords is pretty heated. My bet is, you want your pages to rank for “toronto kitchen renovation” or “canadian insurance broker”.
- Good: build links within your vertical, or with organizations that are somewhat related to your website. Good neighbors give good link juice.
- Good: find opportunities such as sponsorships, engagements, press releases, and good-will gestures to build wholesome back links through older communication mediums.
- Bad: stay away from paid links. The probability that Google knows about a webmaster selling links on a high-PR site is relatively high, because sellers have to advertise those deals somewhere on the web.
- Bad: don’t hide links on your site. You can do all sorts of cool stuff with CSS, such as “display:none”, that will effectively hide content and links from users, but not from search engines. If you have legitimate reasons for using those techniques, use them sparingly and uniformly across similar pages within your site. Search engines can tell the difference between “a design thing” and spam.
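As a rough illustration of how detectable inline hiding is, here is a small Python sketch using the standard-library HTML parser that flags links carrying an inline `display:none` style (the `HiddenLinkFinder` class is hypothetical, and a real crawler would also evaluate external stylesheets, which this toy does not):

```python
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Collect hrefs of <a> tags hidden via an inline display:none style."""
    def __init__(self):
        super().__init__()
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # Normalize "display : none" spacing variants before matching.
        style = attrs.get("style", "").replace(" ", "").lower()
        if "display:none" in style:
            self.hidden_links.append(attrs.get("href"))

finder = HiddenLinkFinder()
finder.feed('<p><a href="/ok">fine</a>'
            '<a href="/spam" style="display: none">hidden</a></p>')
# finder.hidden_links is now ["/spam"]
```

If a few lines of standard-library Python can spot the crude cases, assume the engines catch far more.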
Although this post is hardly comprehensive, it should go a long way in dispelling many myths that seem to be lingering in the online marketing community as a whole. I hope it goes a long way in helping others achieve their SEO goals, or at least allow a few more people to become that much more conversational in SEO and best practices when trying to attract search engine attention.