Farmer – Panda Thoughts from SMX West
The update Google released recently, initially called "Farmer" and now called "Panda," is the big topic of discussion here at SMX West. There was a great panel discussion yesterday titled "'Content Farms' or the Smartest SEOs in the World?"
Smart folks gave their impressions on which sites seemed to suffer (ezine) and which didn't (eHow), and what the differences were. Most concluded that these 'content farms' all contain some quality content and some garbage. Some opined that the punishment was meted out based on the ratio of quality content to garbage. Perhaps so.
However, how can Google algorithmically determine relative 'quality'? Have they really figured out how to parse language well enough to differentiate a great article on motorcycle repair from a poor one written by some schlub who's never worked on a motorcycle? Not a chance.
College professors do not need to worry that Google can grade papers better than the professors can. The fact that two professors will not necessarily assign the same grade to a given paper makes the point — heck, the same professor looking at a paper at the beginning of a grading session may assign a different grade than they would if that paper came up at the end of a long night of grading (and a nice bottle of Chianti). Assessing quality is hard, and doesn’t lend itself to algorithmic determination.
Anyone managing AdSense ads knows that Google still has trouble telling whether an article that references ‘heavy metal’ is about nuclear physics or rock & roll. Figuring out whether an article on nuclear physics is good or bad is a much, much, much harder problem, and neither Google nor Watson will figure that out any time soon.
The signals Google looks for have to be more easily measured: inbound links and the quality thereof, domain authority, domain age, and attributes of the page itself (the fraction and prominence of page real estate given over to ads, number of links, reciprocal linking patterns, etc.). Social media may help as 'likes' come to factor into the results. User behavior may help as well: the bounce rate off the page, time on page, and the fraction of visitors who click deeper into the site versus away from it.
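To make the idea concrete, here is a minimal sketch of how such easily measured signals might be combined into a single page-quality score. Everything here is an assumption for illustration: the signal names, the weights, and the linear weighting scheme are invented, and bear no claimed resemblance to Google's actual (non-public) algorithm.

```python
# Hypothetical sketch: signal names and weights are invented for
# illustration. Google's real signals and weighting are not public.

def page_quality_score(signals, weights=None):
    """Combine measurable page signals into one score in [0, 1].

    `signals` maps signal names to values normalized to [0, 1],
    where higher always means "better" (e.g. pass engagement as
    1 - bounce_rate so a low bounce rate scores high).
    """
    if weights is None:
        weights = {
            "inbound_link_quality": 0.30,
            "domain_authority":     0.25,
            "ad_real_estate":       0.15,  # 1.0 = little ad clutter
            "engagement":           0.20,  # time on page, low bounce
            "social_signals":       0.10,  # likes, shares
        }
    total = sum(weights.values())
    # Missing signals default to 0.0 (worst case) rather than erroring.
    return sum(w * signals.get(k, 0.0) for k, w in weights.items()) / total

# Example: strong inbound links, but the page is cluttered with ads.
score = page_quality_score({
    "inbound_link_quality": 0.8,
    "domain_authority": 0.7,
    "ad_real_estate": 0.2,
    "engagement": 0.6,
    "social_signals": 0.4,
})
```

The point of the sketch is not the particular weights but the shape of the problem: every input is something a crawler or toolbar can actually measure, which is exactly why every input can be gamed.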
That said, all of these signals can and will be gamed. The dominant role search plays in people's lives is a double-edged sword for the engines. It has made them hugely successful businesses, but the stakes of ranking well versus poorly are so high for businesses that gaming the system is inevitable.
To my thinking, the key is to raise the stakes for the truly bad behavior. The companies that made the news by being caught red-handed in link-buying rings were effectively rewarded for their bad behavior: while their rankings were 'artificially' high they made huge money, and when the plug was pulled by Google their rankings simply fell back to where they 'belonged' all along. They have no incentive not to dabble in black-hat tactics again.
When companies or their agencies break the rules, they should be banned from the listings entirely for as long as they were engaged in that behavior. Advertisers could require SEO firms to accept liability for any penalties imposed by the engines. If the rules are clear and strictly enforced, with real penalties attached, the most grievous offenses might be greatly reduced.
This may not help clean up poor quality content, but it may encourage more folks to focus on generating higher quality content, and punishing those who attempt to grossly/mechanically manipulate the algorithmic signals used by the engines might curb that behavior.
Cleaning up this space would benefit not just white hat SEOs but every clean business that depends on search traffic: local businesses, advertisers, paid search firms, and the engines themselves. Better results mean more usage, and more search engine use helps us all.