THE RKGBLOG

The SEO Impact of Losing Search Query Data, and Some Proposed Solutions

Last week this bomb dropped: Google announced that organic search query data for logged-in users would no longer be shared with websites, unless you’re running paid search campaigns.

There has been a cacophony of chatter in the search world over Google’s decision to deny websites a portion of their SEO referral data. People are stark raving mad at Google, especially over its decision to break a commonly accepted convention of the web community: that referrer data gathered in a secure session (over https) is passed along to other secure sites using the same protocol. The double standard here (and slippery slope for Google) is that, provided you’re paying Google and running paid search campaigns, you can still get the keyword referral data.

Cue bedlam.

I understand the community’s concern here, but my first instinct (rather than start throwing stones at the Big Bad Google machine) is to figure out what the real consequences are, and what it means for search marketing (specifically SEO).

What’s The Impact So Far?

First the (obvious) piece of bad news: we will no longer see organic search query data for Google’s logged-in users. It makes us a little bit blind for a small percentage of queries (Google reports that it’s less than 10%, but most sites are reporting a number closer to 2%. More about that below, including numbers from a sample of our clients.)

Now the good news: searches are more secure. And even more importantly, folks who are using Google’s API can no longer tie a search back to a specific user.

But what does it mean for our clients? Well, so far, the picture is best answered in two parts.

First the traffic picture: the segment sending “not provided” obfuscated data is quite small. Looking at a selection of 12 RKG clients (all very large sites across several industries), the average percentage of traffic “not provided” is just 0.76% (using a date range of October 18, 2011 to October 24, 2011). That’s the percentage of “not provided” traffic relative to total organic search traffic, including navigational (brand) and competitive (non-brand) terms.

That’s good news. At least so far (using a very small window of time, admittedly) there isn’t game-changing traffic to worry about here.

Now for the bad news. This segment appears to convert at a relatively high rate across the board, and more importantly, tends to be responsible for a relatively large portion of SEO revenue. So far at least, it appears to be a valuable segment of traffic.

Looking at the same 12 RKG clients, the average revenue rank of the “not provided” segment in organic search was 8.2. Some sites had the segment as high as the #1 driver of SEO revenues!

[Figure: SEO “not provided” data is a valuable segment of traffic]

A common picture: as a revenue driver, the “not provided” segment tends to be quite valuable.

There are a few possibilities as to why this is. First, these are queries that range widely in length, intent, and competitiveness. They’re all across the board. Navigational queries (branded) are blended with competitive queries (non-branded), broad queries are mixed with long-tail queries, and queries with strong purchase intent are mixed with other types of queries.

Second, this may be a demographic that is more likely to purchase or convert. That is unknown right now, and purely speculation. It would make sense, however: logged-in users are likely to be more web-savvy and sophisticated in general.

SEO and PPC Integration is More Important Than Ever

It’s changes like this that make SEO and PPC integration more important than ever. There are several ways we can leverage PPC analysis to glean insight into the SEO referral data that is now invisible. Here are a few simple examples:

  1. For a given URL, pull all navigational (brand) and competitive (non-brand) terms from PPC campaigns. Generate the same data from SEO traffic. A gap analysis should uncover terms generating traffic and conversions in PPC that are no longer visible in SEO referral data — i.e., terms likely hidden in the “not provided” segment.
  2. Perform the same exercise, but at a global rather than URL level. This should surface the same data, but globally for the domain rather than specific to a URL.
  3. Perform a gap analysis on SEO keyword coverage before and after the Google change. This should reveal which SEO referral terms have “fallen off the map” and are no longer available in analytics.
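The URL-level gap analysis in the first step can be sketched in a few lines of Python. This is a minimal illustration, not a finished tool: it assumes you have already exported your PPC and SEO keyword reports as mappings of landing URL to query sets (the `ppc`/`seo` data here is made up).

```python
# Sketch of a PPC-vs-SEO keyword gap analysis.
# Assumes exported reports shaped as {landing_url: set_of_queries};
# the field layout and example data are hypothetical.

def keyword_gap(ppc_queries, seo_queries):
    """For each landing URL, return PPC queries with no matching SEO referral.

    Queries that drive traffic and conversions in PPC but never appear in
    SEO referral data are good candidates for terms hidden inside the
    "(not provided)" bucket.
    """
    gaps = {}
    for url, ppc_terms in ppc_queries.items():
        seo_terms = seo_queries.get(url, set())
        missing = ppc_terms - seo_terms  # set difference
        if missing:
            gaps[url] = missing
    return gaps

# Example with made-up data:
ppc = {"/widgets": {"blue widgets", "cheap widgets", "acme widgets"}}
seo = {"/widgets": {"acme widgets"}}
gaps = keyword_gap(ppc, seo)
print(sorted(gaps["/widgets"]))  # ['blue widgets', 'cheap widgets']
```

The same function works at the global level (step 2) if you collapse all URLs into a single key before calling it.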

There are other things you can do, too. SEER has written up an excellent piece on segmenting your “not provided” data reliably in Google Analytics. Of course, there is always Google Webmaster Tools data, but we all know that data is not truly accurate.
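The segmentation idea can also be applied outside of Google Analytics. Here is a minimal sketch of one common variant: relabel each “(not provided)” visit with its landing page, so you can at least see where the hidden queries are sending traffic. It assumes an analytics export shaped as (keyword, landing page) rows; that layout, and the `np - ` label prefix, are illustrative choices, not a prescribed format.

```python
# Relabel "(not provided)" rows with their landing page so the opaque
# bucket becomes partially segmentable. Row shape is hypothetical.

def relabel_not_provided(rows):
    out = []
    for keyword, landing_page in rows:
        if keyword == "(not provided)":
            keyword = "np - %s" % landing_page
        out.append((keyword, landing_page))
    return out

rows = [
    ("acme widgets", "/widgets"),
    ("(not provided)", "/widgets"),
    ("(not provided)", "/gadgets"),
]
for kw, lp in relabel_not_provided(rows):
    print(kw)
# acme widgets
# np - /widgets
# np - /gadgets
```

You still don't recover the query, but you can now attribute the hidden traffic and revenue to specific pages, which pairs naturally with the PPC gap analysis above.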

What are some other ideas out there to help solve for this? Looking forward to the discussion.

Here’s the data from this initial analysis.

(date range: 10/18 – 10/23)

Site – SEO Traffic Percentage – SEO Revenue Rank

  1. 0.54% – #1 revenue driver
  2. 0.63% – #6 revenue driver
  3. 0.95% – no revenue
  4. 1.05% – #34 revenue driver
  5. 0.58% – #7 revenue driver
  6. 0.68% – #10 revenue driver
  7. 0.78% – #4 revenue driver
  8. 0.45% – #6 revenue driver
  9. 0.80% – no revenue
  10. 0.88% – #14 revenue driver
  11. 0.80% – #8 revenue driver
  12. 1.04% – #7 revenue driver

Avg. SEO traffic percentage: 0.76%

Avg. SEO revenue position: 8.2

UPDATE: Reports began circulating this week that the (not provided) segment was spiking closer to 10% of Google SEO traffic. We confirmed this and saw, for a sample of clients, a median share of Google organic traffic at 10.9%. However, some sites have that number much higher, as high as 20%! We will be watching this very closely.

  • Adam Audette
    Adam Audette is the Chief Knowledge Officer of RKG.
  • Comments
    14 Responses to “The SEO Impact of Losing Search Query Data, and Some Proposed Solutions”
    1. jebbiii says:

      I was thinking SEOs should start a campaign about how Google is jiggering everyone’s search results, and that Gmail users should always sign out of Google after using their email. This is something I have been doing anyway. People are often really shocked to find out that Google is messing with the search results. Maybe we can get our PR buddies in on this too.

      If I were Microsoft I would make a big deal out of the jiggered results.

    2. Great post, Adam. One point of clarification. It isn’t that the organic data will be passed for those who are advertisers, it’s that it will be passed on the advertisements themselves. Advertisers will be equally blinded in their organic search.

    3. Adam Audette Adam Audette says:

      George, absolutely. That’s exactly what I meant to communicate, but I didn’t do a very good job making that distinction. Thanks for clarifying.

    4. Gopal Shenoy says:

      Adam – I think you have the numbers mixed up – “Google reports that it’s less than 10%, but most sites are reporting a number closer to 2%” – there are many sites reporting upwards of 10% since the changes rolled out. Check out http://searchengineland.com/encrypted-search-terms-hit-google-analytics-99685

    5. Mark Ballard Mark Ballard says:

      Gopal, Adam’s numbers were correct at the time of the post, before the changes rolled out to more users. The roll-out has escalated over the last few days and we are now seeing (not provided) queries make up around 11% of Google organic search traffic.

    6. Watpads says:

      Is there a certain limit to how many times I use a keyword on a page?

    7. Adam Audette Adam Audette says:

      @Watpads – not really. The best way to think about “keyword density” in my experience, is to write for your users. If repeating a term is natural and makes for good writing (and a good user experience), then great. If not, that’s great too. Repeating a term many times in a single article is sometimes very natural, other times it’s not. You need to decide for yourself and your site what makes sense. Just don’t worry too much about it from a search engine’s point of view.

    8. Adam – I’ve read most every data anecdote published as to what % of SEO traffic is now impacted, and it looks like it’s about 12%, which jibes with your own numbers. What I’d love to get your take on is the following:

      1. What % of SEO traffic needs to be affected for this to become a fundamental impediment to your typical SEO practitioner? 20%? 40%? 60%? What’s the tipping point and why?
      2. The two supposed workarounds – deploying SSL on your server, or using WMT 1000-keyword data – both seem to be non-solutions from everything I’ve read. Are there other ways around this?
      3. ISPs currently monitor queries & sell the data; public WiFi snoopers pose security threats; and owners of routers & trunk Internet lines are in position to monetize query data. As an SEO old-timer, surely you have a strong opinion as to whether or not Google is right in saying they’re acting in the best interests of their users?

      To me this move by Google is absolutely fascinating, and I think much, much, much more needs to be written about it, ideally by ROI-focused actors such as you/RKG. I look forward to potentially hearing more on this from you guys!

      -Chris Zaharias
      PS – my own take on this is on my Searchquant blog.

    9. Adam Audette Adam Audette says:

      Chris, great to have you weigh in here. I’m not sure I have any good answers, unfortunately, but will give this a shot:

      1. I think even at 12% this is a fundamental impediment to good SEO work. That said, if a site’s ‘not provided’ data exceeds, say, 30%, there is a significant issue.

      2. The only workaround that we’ve arrived at is to leverage PPC data. It’s not perfect, but it’s something. Basically, all we need to do is pull ‘not provided’ URLs in a report and pair that with PPC landing page URLs. Match those up. Then run search query reports on the URLs to find the actual queries which triggered the ads. The end result should be a set of queries that will, at least, fill out some of the ‘not provided’ data. It’s something, anyway.

      3. Not sure what I think about it. It seems like Google is putting the advantage more in their favor with SEO. Maybe they’re afraid it’s been too transparent? That SEOs always end up gaming their algo, no matter what, and they continually have to raise the bar?

      I do think this will raise the barrier to entry for SEOs. It makes it harder to do well, which is ironically probably good for agencies and consultants. We’ll have to see how it plays out.
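      The pairing exercise in point 2 above can be sketched roughly like this. It is a hedged illustration only: it assumes you have a list of landing URLs from the “(not provided)” report and a PPC search query report keyed by landing page, and the data shapes and names are hypothetical.

      ```python
      # Join "(not provided)" landing URLs against PPC landing pages, then
      # attach the PPC search queries that triggered ads on those pages.
      # Data shapes are hypothetical.

      def candidate_queries(not_provided_urls, ppc_query_report):
          """ppc_query_report maps landing_page -> set of triggering queries."""
          candidates = {}
          for url in not_provided_urls:
              queries = ppc_query_report.get(url)
              if queries:
                  candidates[url] = sorted(queries)
          return candidates

      np_urls = ["/widgets", "/about-us"]
      ppc_report = {"/widgets": {"blue widgets", "widget sale"}}
      print(candidate_queries(np_urls, ppc_report))
      # {'/widgets': ['blue widgets', 'widget sale']}
      ```

      The result is, as Adam says, only a candidate set — queries that *could* be hiding in the ‘not provided’ bucket — not a reconstruction of the lost referral data.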

    10. Just a quick note to say that on my own SEM blog, 50% of Google queries continue to be affected, down from a peak of ~66%. I’d be interested to hear if RKG has any update?

    Trackbacks
    Check out what others are saying...
    1. [...] means a lot of good data simply will not be available anymore. Sigh. The Rimm-Kaufman Group blog explains the problem while proposing potential [...]

    2. [...] move to begin encrypting searches by default for signed-in users.  As a result of the change, a meaningful percentage of search queries were no longer passed in the referring URLs of organic clicks.  There was a [...]

    3. [...] is an update to our previous article on the SEO Impact of Losing Search Query Data back in late [...]