Oct 31, 2012

Top 5 Scariest SEO Mistakes

Here we are again at Halloween, that magical time of year when kids get to eat candy until they puke.  In the spirit of Halloween we’re counting down the top 5 most ghastly SEO mistakes known to man.

Warning: what you are about to read is not for the faint of heart.  The elderly, sick and the very young should proceed with caution.

5) Lack of on-page content – Perhaps not as scary as some of the others on this list, but enough to make your hair stand on end.  Content may seem like a simple issue to resolve, but it is very important for providing valuable information to users and signaling relevance to the search engines.  In fact, it’s been said that “relevance is the new PageRank.”  Content is a major component of building relevance.  So write something good for at least the primary category and subcategory pages of your site.

Marketers love brains…er…ROI

4) Maniacal focus on measurable ROI – As a zombie hungers for brains, so goes the pursuit of measurable ROI.  Don’t misunderstand: we love to gather data and measure effectiveness as much as the next undead digital marketing agency, but it’s not the be-all and end-all of running an effective campaign.  Undue focus on directly measurable returns can lead to missed opportunities in branding, thought leadership, and more.  In this regard we need to think more like marketers.

3) Duplication, Duplication, Duplication – Scary, scary, scary.  Whether this condition stems from mismanagement of pagination, faceted navigation, product pages in multiple sub-categories, or just general CMS BS, it must be cured.  The most typical treatments include rel="canonical", meta robots ‘noindex, follow’, rel="next" and rel="prev" in the case of pagination, and parameter handling in Google Webmaster Tools.  If none of these works, try a stake through the heart of the worthless bloodsucker that designed the site. *
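As a quick reference, here is roughly what those treatments look like in a page’s head section (the example.com URLs and paths are placeholders, not from any real site):

```html
<!-- Consolidate duplicate URLs: point each variant at the preferred version -->
<link rel="canonical" href="http://www.example.com/widgets/blue-widget/" />

<!-- Keep a duplicate-prone page out of the index while still letting
     crawlers follow the links on it -->
<meta name="robots" content="noindex, follow" />

<!-- On page 2 of a paginated series, declare its neighbors -->
<link rel="prev" href="http://www.example.com/widgets/" />
<link rel="next" href="http://www.example.com/widgets/?page=3" />
```

Which treatment fits depends on the cause: canonical tags for true duplicates, noindex for pages that shouldn’t rank at all, and prev/next for legitimate pagination.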

Buffy may be able to help you slay those duplication demons.

2) Killer content on a subdomain – This might not be “horse head in your bed” scary; it’s more like when you’ve already poured your morning bowl of Frosted Flakes and then frantically search the fridge, only to find you have no milk.  The difference is this time your Mom won’t save you.  The most common incarnation of this particular horror is putting a blog on a subdomain.  Unfortunately, subdomains don’t pass all of their positive ranking signals to the root domain.

Our two rules of thumb for when a subdomain makes sense are:

1) The subdomain should be different enough from the primary product or service offering on the root domain to be a stand-alone offering.

2) The subdomain should be robust enough, i.e., have enough content, to be a stand-alone website.

The best example of properly using a subdomain comes from none other than Google.  Look at the way they break out offerings other than search.

  • play.google.com
  • maps.google.com
  • news.google.com
  • translate.google.com

Finance is the one glaring exception, but what are you going to do?  They’re Google, the King Kong of the search world.

“Disallow: /”…as scary as this guy?

1) Disallow all in robots.txt – Possibly the most horrifying specter in all of SEO.  Believe it or not, we’ve seen this in the wild, and it wasn’t pretty.  The fix is easy enough, of course, provided you’re able to quickly figure out why your count of indexed pages has suddenly dropped to zero.  As with all things, moderation is key when it comes to disallow statements, so be careful.  If you need a refresher on how to properly use the robots.txt file, check out “Robots.txt Best Practices for SEO”.
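For the record, the offending file is only two lines long, which is part of what makes it so easy to ship by accident (for instance, carried over from a staging server):

```
# The nightmare: this robots.txt blocks every crawler from the entire site
User-agent: *
Disallow: /
```

The cure is to remove the slash (an empty Disallow allows everything) or to block only the specific paths you actually want kept out; the /search/ path below is just a hypothetical example:

```
# The cure: allow everything except, say, internal search result pages
User-agent: *
Disallow: /search/
```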





So there you have it, 5 of the most downright evil SEO ghouls we see over and over again.   If you need more ammunition in your battle against the terrors of declining organic traffic and revenue, give us a ring and we’ll come to your aid.

Happy Halloween!


* Just kidding…obviously.  We would never advocate such a thing.  Do I really have to tell you that?


7 Responses to "Top 5 Scariest SEO Mistakes"
Brian Park says:
I love #1..disallow all in robots txt file..but I am sure so many websites don't even have robots dot txt file at all.
Brian - unfortunately that is all too true...I just found another site missing robots.txt yesterday.
cara says:
Ha, great article Paul! What about the ghosts of URLs past, or non-canonicals in your xml sitemap!! ;) Muhahahahaha (evil Halloween laugh)
Thanks Cara! Terrifying specters for sure :)
David Zee says:
Thanks for enumerating the No no's of SEO. SEO now should focus on quality, unique content and never black. White hat SEO will surely take patience but we all know how rewarding the results will be.
Nick Simard says:
These are not scary but yes the common 5 mistakes of SEO. Among these 5 is not using Robot.txt properly. Because a small mistake may bring one of your product page down.


