Does This Change My Entire SEO Strategy?
What does this mean for us? At RKG, we don’t believe it changes our best practices strategically. We will still concentrate on the Crawl->Index->Rank cycle of SEO, addressing the fundamental, blocking-and-tackling aspects of a website’s technical health while improving its content, presentation, and marketing.
What Does Google Say?
In their blog post, Google called out a few things to verify on your current site and what to watch for in the coming weeks:
Ensure that CSS and JS resources are not blocked in a way that prevents Googlebot from retrieving them.
This is a standard recommendation that we’ve made to all of our clients since before this announcement. While blocking those assets through a robots.txt disallow or other methods might seem to make the crawl of a website more efficient, letting spiders crawl those resources helps search engines better understand the structure and architecture of the site. This has become even more important with the increasing share of traffic coming from mobile, as it helps search engines understand your optimizations for mobile devices.
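For example, here is a minimal robots.txt sketch that keeps stylesheets and scripts crawlable; the /assets/ paths are hypothetical placeholders for wherever your site actually serves these resources:

```
User-agent: Googlebot
# Keep rendering resources reachable; these paths are illustrative.
Allow: /assets/css/
Allow: /assets/js/

# A common anti-pattern that can prevent rendering:
# Disallow: /assets/
```

If you are unsure what is currently blocked, the Fetch as Googlebot tool discussed below will show you how a page renders with your live robots.txt in effect.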
Ensure that your web server is able to handle the increased volume of crawl requests for resources.
Ensure that your website will degrade gracefully.
Both of these are things to monitor, and we will address them in more detail below. At this time, we do not know what Googlebot will consider too complex or arcane to render, but Google has added new functionality to the Fetch as Googlebot tool so you can test your pages and ensure that they render as intended. The two sketches below show simple ways to start monitoring on your own.
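On the server-load point, one way to establish a baseline is to count how often Googlebot actually requests your CSS and JS. Here is a rough Python sketch, assuming a combined-format access log; the log path is a hypothetical placeholder:

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path to a combined-format access log

# Capture the requested path and the user-agent string from each log line.
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "([^"]*)"')

asset_hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        path, user_agent = match.groups()
        # Count only Googlebot requests for CSS/JS resources.
        if "Googlebot" in user_agent and path.split("?")[0].endswith((".css", ".js")):
            asset_hits[path] += 1

# The most-requested assets show where rendering-related crawl load is landing.
for path, hits in asset_hits.most_common(10):
    print(f"{hits:6d}  {path}")
```

Tracking these counts week over week should make any ramp-up in resource crawling visible long before it strains the server.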
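On graceful degradation, a quick first check is whether your critical content exists in the raw HTML at all, since that is roughly what a non-rendering crawler sees. A minimal Python sketch, where the URL and the content marker are hypothetical placeholders you would replace with a real page and a string that only appears when its core content is present:

```python
from urllib.request import Request, urlopen

URL = "https://www.example.com/some-page"   # hypothetical page to check
CRITICAL_MARKER = "Add to Cart"             # hypothetical must-have content

# Fetch the raw HTML the way a non-rendering crawler would: no JavaScript
# executes, so anything injected client-side is absent from this response.
request = Request(URL, headers={"User-Agent": "degradation-check/0.1"})
html = urlopen(request, timeout=10).read().decode("utf-8", errors="replace")

if CRITICAL_MARKER in html:
    print("Core content is present without JavaScript; the page degrades gracefully.")
else:
    print("Core content appears only after rendering; verify with Fetch as Googlebot.")
```

This does not replace testing in Fetch as Googlebot, but it is a fast way to flag pages whose important content depends entirely on client-side rendering.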
What Can We Expect?
A few scenarios to think about:
- How will Bing respond?
What Happens When This Ramps Up?