Oct 03rd, 2013
From around May 2010, a rather ominous-looking ‘not provided’ started appearing in Google Analytics where a referring keyword would previously have been shown. It was the result of Google rolling out its secure search capability: full encryption using SSL (note the HTTPS:// on Google now), with the privacy of users cited as the prime factor. It was only available on a dedicated URL, and had to be searched for to be found. Initially, only fractions of a percent of referrals were labelled in Analytics as ‘not provided’.
October 2011 brought with it a surge in ‘not provided’ referrals, owing to the fact that Google encrypted every search made by a user who was logged into their Google account. Alarm bells were really starting to ring. Upon being quizzed by Search Engine Land’s Danny Sullivan, Google’s head of web spam Matt Cutts appeared adamant that the ‘not provided’ portion would, even at full roll-out, only affect single-digit percentages of searches. Time now tells a different story.
It was all too obvious where we were heading. Well, now we’ve been put out of our misery. The figure currently stands at over 90% for many sites, and we can fully expect a definitive 100% blackout before long. That’s right: all organic clicks on Google, including those on images and videos, will cease to provide keyword data to webmasters in Analytics.
Google says that this move will protect the privacy of searchers.
“As search becomes an increasingly customized experience, we recognize the growing importance of protecting the personalized search results we deliver.” – Evelyn Kao, Product Manager, Google
This is true, up to a point: all organic searches will remain undisclosed. But it will also stop marketers from seeing what is being searched for.
But Google is not withholding all information from everyone. Not surprisingly, referral keyword data will still be available to AdWords subscribers. Some may argue that this is Google’s sole intention, to force people into paying for more ads. Besides the obvious failings were they not to provide data to paying customers, it actually makes sense to look at this from Google’s perspective.
If a company is dominating the organic results, they’re more likely to hold back on their paid-ads, figuring that a balance of both will be most profitable; probably focusing more of their efforts on organic SEO. At the end of the day, Google is a business and makes its money from advertising revenue. We often consider it our right to be listed on Google, when in actual fact it is a privilege; it is free, after all. Should a multi-million pound company receive vast amounts of traffic by dominating the SERPs, isn’t it natural that Google might feel they should be earning from it, too?
There are a number of other options for collating similar data. Since the blackout, webmasters have been busy offering up various alternatives to keyword metrics. Also, an article detailing several in-depth workarounds using Google Analytics, posted back in 2011 by the brilliant analyst Avinash Kaushik, is made even more relevant by recent events.
I’ll briefly outline a couple of options below; though I think we should all be starting to think beyond keyword data from here on in.
Besides using Google Webmaster Tools (GWT), I think relying on these methods (plus a couple of others that I’m not taking the time to list) would be grasping at straws. It’s time we moved on.
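To illustrate the kind of proxy analysis GWT still permits, here is a minimal sketch (all rows, field names and figures are hypothetical, standing in for a Webmaster Tools search-queries export) that groups query data by landing page, giving a rough per-page keyword picture:

```python
# Hypothetical rows, shaped like a GWT "Search Queries" export;
# the queries, pages and click counts are purely illustrative.
gwt_rows = [
    {"query": "red widgets", "landing_page": "/widgets/red", "clicks": 120},
    {"query": "buy red widgets", "landing_page": "/widgets/red", "clicks": 45},
    {"query": "blue widgets", "landing_page": "/widgets/blue", "clicks": 80},
]

def top_queries_by_page(rows):
    """Group exported query rows by landing page, strongest query first."""
    pages = {}
    for row in rows:
        pages.setdefault(row["landing_page"], []).append(
            (row["query"], row["clicks"])
        )
    for page in pages:
        pages[page].sort(key=lambda pair: pair[1], reverse=True)
    return pages

report = top_queries_by_page(gwt_rows)
print(report["/widgets/red"][0])  # the page's strongest query
```

It is a crude substitute for referral data, not a replacement: GWT samples and rounds its figures, which is partly why I'd call this grasping at straws.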
“Losing keyword data, especially when it helped to improve conversions, is certainly a blow. But there are also positives to take from it. Whereas our focus has always been on keywords, we can now take a more holistic approach and work on overall improvements to the entire user-experience.”- Stuart Pollington, Head of SEO, Smart Traffic
Despite our (over)reliance on keyword metrics, there is a plethora of data still available to us. Page-based metrics should now become our main focus.
Let’s say we want to optimise a web page for a particular keyword. We produce informative and useful content and ensure that our chosen keyword or phrase appears naturally. We also include related words and synonyms in the hope of boosting the page’s relevancy. We make the page the best it can be in relation to our chosen keyword.
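The on-page check described above can be sketched in a few lines. This is a rough illustration only; the copy, the target keyword and the synonym list are made up, and real relevancy is judged by far more than raw counts:

```python
import re

def keyword_coverage(text, keyword, synonyms):
    """Count whole-word occurrences of a target keyword and its synonyms,
    plus the keyword's density, as a rough on-page sanity check."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)

    def count(term):
        return len(re.findall(r"\b" + re.escape(term.lower()) + r"\b", lowered))

    counts = {keyword: count(keyword)}
    counts.update({s: count(s) for s in synonyms})
    density = counts[keyword] / len(words) if words else 0.0
    return counts, density

# Hypothetical page copy for a furniture retailer.
copy = ("Our handmade oak tables are built to last. Each table is "
        "finished by hand, and every desk in the range uses the same oak.")
counts, density = keyword_coverage(copy, "table", ["desk", "oak"])
```

A very low count suggests the keyword doesn't appear naturally; a very high density suggests the unnatural keyword insertion discussed later.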
Over the next couple of months we see the page’s metrics improve: more organic landing-page traffic, better engagement, and more conversions.
Can we safely assume that our campaign for that page is becoming successful? Yes, of course we can! We no longer need to attribute a page’s success to one particular keyword; we’re well aware of what the page is attempting to optimise for, so if the page then starts producing results, what more do we need to know?
Similarly, if our changes to the page result in negative changes to the stats, we can quickly see that the page is under-performing and we should work to improve it. This will actually force site owners to understand their users better and to think about how best to serve them, and not be so reliant on keyword statistics. It will encourage businesses to engage their customers on a personal or social level.
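The before/after comparison this implies is simple to express. A minimal sketch, with entirely hypothetical figures and metric names, of judging a page's optimisation by the change in its own metrics rather than by any keyword:

```python
# Illustrative figures for one optimised page, before and after the
# changes; all numbers and metric names are hypothetical.
before = {"visits": 1400, "bounce_rate": 0.62, "conversions": 21}
after = {"visits": 1900, "bounce_rate": 0.48, "conversions": 38}

def metric_changes(before, after):
    """Percentage change per metric (a fall in bounce rate is a win)."""
    return {
        name: round((after[name] - before[name]) / before[name] * 100, 1)
        for name in before
    }

changes = metric_changes(before, after)
```

If the changes are broadly positive the page is working; if they are negative, the page is under-performing and needs further work, exactly as argued above.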
This, in turn, will lead to much greater quality of websites, and a much better user-experience. Sites will have to produce wholly relevant content and nothing else, and it will deal another huge blow to thin content and unnatural keyword insertion.
Even if the data were still available, the particular keyword that drove a user to our web page would be given too much credit. The whole point of building a campaign, in the modern age of SEO, is not to focus all efforts on one particular key phrase, but to build a page around a particular concept, with user experience firmly at the top of our priorities.