Fading to Black
From around May 2010, a rather ominous-looking ‘not provided’ started appearing in Google Analytics where previously a referring keyword would have been available. It was the result of Google rolling out its secure search capability: full SSL encryption (note the https:// on Google now), with the privacy of users cited as the prime motivation. It was initially only available on a dedicated URL and had to be actively sought out, and only fractions of a percent of referrals were labelled in Analytics as ‘not provided’.
October 2011 brought with it a surge in ‘not provided’ referrals, owing to the fact that Google encrypted every search made by a user who was logged into their Google account. Alarm bells were really starting to ring. Upon being quizzed by Search Engine Land’s Danny Sullivan, Google’s head of web spam Matt Cutts appeared adamant that the ‘not provided’ portion would, even at full roll-out, only affect single-digit percentages of searches. Time now tells a different story.
It was all too obvious where we were heading. Well, now we’ve been put out of our misery. The figure currently stands at over 90% for many sites, and we can fully expect a definitive 100% blackout before long. That’s right: all organic clicks on Google, including those on images and videos, will cease to provide keyword data to webmasters in Analytics.
Google says that this move will protect the privacy of searchers.
“As search becomes an increasingly customized experience, we recognize the growing importance of protecting the personalized search results we deliver.” – Evelyn Kao, Product Manager, Google
This is true, it will. All search queries will remain undisclosed, and marketers will no longer be able to track what is being searched for.
But Google is not withholding all information from everyone. Not surprisingly, referral keyword data will still be available to AdWords subscribers. Some may argue that this is Google’s sole intention, to force people into paying for more ads. Besides the obvious failings were they not to provide data to paying customers, it actually makes sense to look at this from Google’s perspective.
If a company is dominating the organic results, they’re more likely to hold back on their paid ads, figuring that a balance of both will be most profitable, and probably focusing more of their efforts on organic SEO. At the end of the day, Google is a business and makes its money from advertising revenue. We often consider it our right to be listed on Google, when in actual fact it is a privilege; it is free, after all. Should a multi-million-pound company receive vast amounts of traffic by dominating the SERPs, isn’t it natural that Google might feel they should be earning from it too?
Where Do We Go From Here?
There are a number of other options for collating similar data. Since the blackout, webmasters have been busy offering up various alternatives to keyword metrics. Also, an article detailing several in-depth workarounds using Google Analytics, posted back in 2011 by the brilliant analyst Avinash Kaushik, is made even more relevant by recent events.
I’ll briefly outline a couple of options below; though I think we should all be starting to think beyond keyword data from here on in.
- Google Webmaster Tools is still allowing users to view an aggregated list of the top 1,000 search queries that drove traffic to their websites in the past 30 days. Go to Search Traffic > Search Queries in GWT to find yours. If you’re concerned about measuring year-on-year stats, you’ll have to make an effort to visit GWT every 30 days to download your data to CSV. If nothing else, this will keep you more in touch with your stats.
- In August of this year, Google announced their new ‘Paid & Organic’ report in AdWords. The data is pulled from Webmaster Tools and provides users with an insight into their organic performance. The good news is that you don’t need to be paying for ads to get the data; simply signing up for AdWords will provide it.
- Bing has made no noise whatsoever about following in Google’s footsteps. Although Bing is less popular than Google, the statistics are still useful, especially when determining if a keyword is converting or not. If you’re pining for at least some keyword data, conclusions can still be drawn.
- Analysing your internal site search will also provide search data, but only from users who already know your website exists. Large websites can benefit from doing this, but if you’re a small company it might not be such a useful resource.
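If you do settle into the routine of downloading your GWT Search Queries CSV every 30 days, the monthly exports can be stitched together for year-on-year comparison. A minimal Python sketch is below; the filenames (queries-2013-10.csv and so on) and the column headings (Query, Impressions, Clicks) are assumptions about how you might save your downloads, not a fixed GWT format.

```python
import csv
import glob

def merge_monthly_exports(pattern="queries-*.csv", out_path="merged-queries.csv"):
    """Merge monthly GWT 'Search Queries' CSV exports into one file.

    Assumes hypothetical filenames like queries-2013-10.csv and
    columns named Query, Impressions and Clicks.
    """
    rows = []
    for path in sorted(glob.glob(pattern)):
        # Pull "2013-10" out of "queries-2013-10.csv" to tag each row.
        month = path[len("queries-"):-len(".csv")]
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                rows.append({
                    "month": month,
                    "query": row["Query"],
                    "impressions": row["Impressions"],
                    "clicks": row["Clicks"],
                })
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["month", "query", "impressions", "clicks"])
        writer.writeheader()
        writer.writerows(rows)
```

Run it once a year (or once a month) over the folder of exports and you have a single file you can pivot by month, which is about as close to year-on-year keyword trends as the 30-day window allows.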
Besides using GWT, I think to rely on these methods (plus a couple of others that I’m not taking the time to list) would be grasping at straws. It’s time we moved on.
A New Day
“Losing keyword data, especially when it helped to improve conversions, is certainly a blow. But there are also positives to take from it. Whereas our focus has always been on keywords, we can now take a more holistic approach and work on overall improvements to the entire user-experience.” – Stuart Pollington, Head of SEO, Smart Traffic
Despite our (over)reliance on keyword metrics, there is a plethora of data still available to us. Page-based metrics should now become our main focus.
Let’s say we want to optimise a web page for a particular keyword. We produce informative and useful content and ensure that our chosen keyword or phrase appears naturally. We also include related words and synonyms in the hope of boosting the page’s relevancy. We make the page the best it can be in relation to our chosen keyword.
Over the next couple of months we see:
- A rise in organic traffic to the page
- A drop in bounce rate
- An increase in conversions from the page
Can we safely assume that our campaign for that page is becoming successful? Yes, of course we can! We no longer need to attribute a page’s success to one particular keyword; we’re well aware of what the page is attempting to optimise for, so if the page then starts producing results, what more do we need to know?
Similarly, if our changes to the page result in negative changes to the stats, we can quickly see that the page is under-performing and we should work to improve it. This will actually force site owners to understand their users better and to think about how best to serve them, and not be so reliant on keyword statistics. It will encourage businesses to engage their customers on a personal or social level.
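The page-level judgement described above boils down to a simple before-and-after comparison. The sketch below illustrates it in Python; the metric names and sample figures are purely illustrative assumptions, not anything Analytics hands you in this shape.

```python
# Judge a page's optimisation by page-based metrics rather than keywords.
# Metric names and sample figures are illustrative assumptions.

def page_improved(before, after):
    """Return True if organic traffic rose, bounce rate fell and
    conversions increased over the measurement period."""
    return (
        after["organic_visits"] > before["organic_visits"]
        and after["bounce_rate"] < before["bounce_rate"]
        and after["conversions"] > before["conversions"]
    )

before = {"organic_visits": 800, "bounce_rate": 0.62, "conversions": 12}
after = {"organic_visits": 1100, "bounce_rate": 0.48, "conversions": 21}

print(page_improved(before, after))  # True for this sample data
```

If the check fails, that is the signal to revisit the page, exactly as the paragraph above argues; no referring keyword is needed to reach either conclusion.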
This, in turn, will lead to much greater quality of websites, and a much better user-experience. Sites will have to produce wholly relevant content and nothing else, and it will deal another huge blow to thin content and unnatural keyword insertion.
The particular keyword that drove a user to our web page would, in any case, be given too much credit even if the data were still available. The whole point of building a campaign in the modern age of SEO is not to focus all efforts on one particular key phrase, but to build a page around a particular concept, with user-experience firmly at the top of our priorities.