We all know that SEO is evolving, right? The SEO world is prone to earthquakes, and the ground has been moving a lot lately. One of the latest temblors is a development known not-so-affectionately as “Not Provided.”
In the good old days, a useful thing you could do was to log into your Google Analytics setup and see exactly what words and phrases people were using to find your site.
Then Google started to hide that information for some searches. And in September, Google decided to make 100% of that search term data “not provided.”
So we web publishers have had a major piece of data — the precise strings of words that folks are using to find us — taken away. Which should be a major bummer for those of us who want to give web searchers a great experience with our wonderful content.
But it’s not worrying us here at Copyblogger, and it doesn’t need to worry you either.
First, Google isn’t the only search engine
OK, it’s the important one. According to comScore’s July 2013 search report, Google’s market share was 67%.
But that leaves 33% of searches coming from other sources … and as of now, search engines like Bing and Yahoo continue to pass keyword information to analytics applications. So your analytics program still sees a sample of the keywords people use to find your content.
33% is a lot better than zero, and it can still give you some strong data about what’s going on with your site.
You also have some keyword data still available in Webmaster Tools. It’s not as robust as what we had before, but it will give you some clues.
But you have other resources, too.
You know your audience better than Google does
Don’t get us wrong. The engineers at Google are pretty damned smart. But the most sophisticated and elegant algorithm on the planet can’t beat the mighty power of your brain.
This (in Sonia’s opinion) is where SEO professionals can go off the rails. They work very hard to think like Google’s algorithms — when they’d be better off thinking like the intended audience.
Smart content marketers start their process (as they always have) by forming a deep understanding of what the audience wants. That is where smart keyword research begins. That is the starting place for headlines that get the click.
It’s too bad that Google won’t confirm your instincts and intelligent understanding of what your audience wants. Too bad, but not fatal.
You may ask, “But what about the data that is (not provided)? Don’t I still need to confirm that I am getting the search traffic for the terms I’m trying to rank for?”
Yes. And you already do.
A real world example
To help show you what we mean, Sean has pulled an example from Copyblogger.com. We’ll be discussing the data for this post.
Take a look at a screenshot of the keyword terms in Google Analytics:
The first thing you may notice is that for the past 30 days, more than 68% of the keywords are (Not Provided).
This is not surprising since a) Google’s market share is about 67% and b) the post would appeal to a broad audience segment that is likely to use the full range of search engines.
An additional 15% are (Not Set). That can result from any number of factors, including referrals from other sites, clicks from email links, and so on.
Starting at line number 3, we’re seeing the data coming in from other search engines. In fact, for that 30-day period, we see more than 500 unique keyword phrases.
Now, as we mentioned, Google Webmaster Tools (as well as Bing Webmaster Tools) provides a sample of the search queries that brought users to your site. For the same period, Google Webmaster Tools recorded 81 unique keywords.
And when we compare the keywords available in Google Analytics (which now reflect the terms people use in other search engines) to the keywords in Google Webmaster Tools (which reflect the terms people use on Google), we see some interesting patterns.
First, 32% of the queries used by people in other search engines exactly match the queries people use on Google. So users of other search engines are not some radically different type of human being.
And 90% of the ten most frequently used words in those search queries match exactly between Analytics and Webmaster Tools, the only exception being the word “timeline.”
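If you want to run this kind of comparison on your own exported query lists, it boils down to simple set arithmetic. Here’s a minimal sketch — the sample queries and variable names below are invented for illustration, not Copyblogger’s actual data:

```python
# Hedged sketch: compare keyword phrases exported from two sources
# (e.g. Google Analytics vs. Google Webmaster Tools).
# All sample phrases are invented for illustration.
from collections import Counter

analytics_queries = {            # queries from non-Google engines (Analytics)
    "facebook cover photo size",
    "how to make a facebook cover photo",
    "facebook timeline cover",
}
webmaster_queries = {            # queries from Google (Webmaster Tools)
    "facebook cover photo size",
    "facebook cover photo dimensions",
    "facebook timeline cover",
}

# Share of queries from other engines that exactly match Google queries
exact_matches = analytics_queries & webmaster_queries
exact_match_rate = len(exact_matches) / len(analytics_queries)

def top_words(queries, n=10):
    """Return the n most frequent individual words across the queries."""
    counts = Counter(word for q in queries for word in q.split())
    return {word for word, _ in counts.most_common(n)}

# Overlap between each source's most frequently used words
shared_top_words = top_words(analytics_queries) & top_words(webmaster_queries)

print(f"{exact_match_rate:.0%} of queries match exactly")
print("shared top words:", sorted(shared_top_words))
```

Swap in your own exported lists and the same two numbers — exact-match rate and top-word overlap — fall out directly.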
TL;DR, what does that mean?
All this just means you’re still getting data on how people are searching for the content you create. And while it may not be as complete as it once was, it’s a lot better than zero.
But there is something else we can discover.
Even if you didn’t read the post in this example, can you guess what the title is? We’ll save you the click … it’s:
Even without reading the post, you can tell (because of a clear headline that spells out the benefit to the reader) that the author was trying to target people who were looking for ways to create a cover photo in Facebook for the timeline.
Non-surprising fact of the day: the words that people used to find the content aligned with the actual content.
So how do you know people will find this content on search engines? In other words, is there a way to predict the type of search terms people will be using to find this particular piece of content?
Before this post was ever published, we knew the type of keywords and terms people would use to find it.
How? We used Scribe.
Predictive keyword analysis
Analyzing this post in Scribe (our content optimization software) allowed us to know how the search engines would see the content before it was ever published.
First, Scribe gave this page a score of 100 out of 100, showing how well this content aligned to our recommended best practices for SEO copywriting.
Second, Scribe showed a site score of 63 out of 100. That means that this page was a good fit for the Copyblogger site — we weren’t trying to rank for something completely outside of what we’re known for.
In other words, if this had been an article about six-pack abs or natural flu remedies, we would have had essentially zero chance of ranking for it. But because it was related to what we already write about, year in and year out, Google figured we probably had something intelligent to say on this topic.
Maybe most important, the two Primary Keywords found in the page were Facebook and Cover Photo. In other words, we were confident that the search engines would see the article the same way we saw it.
If Scribe had discovered different Primary Keywords, that would have meant that the search engines would likely become confused as to what the article was about. Which would have meant a little intelligent tweaking until we were on the same page again.
Not all keywords are created equal
When we created Scribe, we knew that not all words found on a page were equal. So we created a way to rank keywords (and filed a patent on the process).
When a keyword is ranked as Primary, it indicates, with a 95% degree of statistical certainty, that search engines will index the page for that term.
So even before this page was published on the web, Scribe showed that the search traffic to the page would include one or more of the terms Facebook and Cover Photo.
Now, look back at the analysis of the search queries people used.
Notice how many queries contain the Primary Keywords? Every one contains either one or both of the terms.
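You can sanity-check that same claim against your own query data with a few lines of code. A minimal sketch, assuming you have your query list and Primary Keywords at hand (the sample values here are invented):

```python
# Hedged sketch: verify that every search query contains at least one
# Primary Keyword. Sample keywords and queries are invented for illustration.
primary_keywords = ("facebook", "cover photo")

queries = [
    "facebook cover photo size",
    "how to make a cover photo",
    "facebook timeline banner",
]

# A query "hits" if it contains one or both Primary Keywords
hits = [q for q in queries if any(kw in q.lower() for kw in primary_keywords)]
coverage = len(hits) / len(queries)

print(f"{coverage:.0%} of queries contain a Primary Keyword")
```

If coverage comes back well under 100%, that’s a signal the search engines may be seeing your page differently than you intended.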
In other words, Scribe predicted what terms the content would rank for before the content was published.
There’s no substitute for your judgment
Sure, it’s still a good idea to do some keyword research to discover the terms people in general are using to find content like yours.
Sure, it’s still a good idea to check the partial data we do receive, and see if the traffic is coming in for the terms you think it should come in for.
But at the end of the day, while mechanistically reverse-engineering the data has its place, it isn’t even close to the whole story. Abstract analysis should be an enhancement to your judgment as a business owner. Because you know your audience better than anyone else can.
SEO expert Jenny Halasz was recently quoted on Search Engine Land as saying,
There’s no doubt that not having keywords provided will make it a little harder to discover customer intent, but there are a lot of other ways to get clues about that, including actively engaging with your customers on social media and such.
And we’ll leave you with the words of our own Executive VP of Operations, Jess Commins:
We still have general data in Webmaster Tools, Bing, and various data-stitching methods for analysis. But really, it’s not about the keywords bringing people in … it’s about what you do with them when you get them, and whether you deliver on why they came to you in the first place.
Great content will not lose in this battle. And tools like Scribe are even more important now, because they can help writers fine-tune their message for readers, not engines. They’re the ones who matter.
If this year is the year of the writer, changes like Hummingbird signal that next year will probably be the year of the reader. Writers run the show, but shows won’t matter if you can’t keep an audience in their seats.
Editor’s Note: Stay tuned on Copyblogger for more thoughts on the Hummingbird algorithm in the coming weeks …