Why You Don’t Need to Freak Out over Google’s (Not Provided)

We all know that SEO is evolving, right? The SEO world is prone to earthquakes, and the ground has been moving a lot lately. One of the latest temblors is a development known not-so-affectionately as “Not Provided.”

In the good old days, a useful thing you could do was to log into your Google Analytics setup and see exactly what words and phrases people were using to find your site.

Then Google started to hide that information for some searches. And in September 2013, Google decided to make 100% of that search term data “not provided.”

So we web publishers have had a major piece of data — the precise strings of words that folks are using to find us — taken away. Which should be a major bummer for those of us who want to give web searchers a great experience with our wonderful content.

But it’s not worrying us here at Copyblogger, and it doesn’t need to worry you either.

Here’s why.

First, Google isn’t the only search engine

OK, it’s the important one. According to the July 2013 report from ComScore, Google’s market share was 67%.

But that leaves 33% of search coming from other sources … and as of now, search engines like Bing and Yahoo continue to pass keyword information to analytics applications. So your analytics program still sees a sample of the keywords people are using to find your content.

33% is a lot better than zero, and it can still give you some strong data about what’s going on with your site.
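To make that concrete, here’s a minimal sketch (in Python, with invented rows) of how you might filter an analytics keyword export down to the queries that still carry real data. The keyword, engine, and visit values are all hypothetical:

```python
# Hypothetical keyword rows, loosely modeled on an analytics export.
# All keyword, engine, and visit values here are invented for illustration.
rows = [
    {"keyword": "(not provided)",          "engine": "google", "visits": 680},
    {"keyword": "(not set)",               "engine": "(none)", "visits": 150},
    {"keyword": "facebook cover photo",    "engine": "bing",   "visits": 40},
    {"keyword": "facebook timeline cover", "engine": "yahoo",  "visits": 25},
]

# Keep only the rows that still carry an actual query string.
usable = [r for r in rows
          if r["keyword"] not in ("(not provided)", "(not set)")]

for r in usable:
    print(r["engine"], "-", r["keyword"], "-", r["visits"], "visits")
```

Even on this toy data, the rows that survive the filter are the Bing and Yahoo queries — exactly the sample the article is talking about.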

You also have some keyword data still available in Webmaster Tools. It’s not as robust as what we had before, but it will give you some clues.

But you have other resources, too.

You know your audience better than Google does

Don’t get us wrong. The engineers at Google are pretty damned smart. But the most sophisticated and elegant algorithm on the planet can’t beat the mighty power of your brain.

This (in Sonia’s opinion) is where SEO professionals can go off the rails. They work very hard to think like Google’s algorithms — when they’d be better off thinking like the intended audience.

Smart content marketers start their process (as they always have) by forming a deep understanding of what the audience wants. That is where smart keyword research begins. That is the starting place for headlines that get the click.

It’s too bad that Google won’t confirm your instincts and intelligent understanding of what your audience wants. Too bad, but not fatal.

You may ask, “But what about the data that is (not provided)? Don’t I still need to confirm that I am getting the search traffic for the terms I’m trying to rank for?”

Yes. And you already do.

A real world example

To help show you what we mean, Sean has pulled an example from Copyblogger.com. We’ll be discussing the data for this post.

Take a look at a screen shot of the keyword terms in Google Analytics:

The first thing you may notice is that for the past 30 days, more than 68% of the keywords are (Not Provided).

This is not surprising since a) Google’s market share is about 67% and b) the post would appeal to a broad audience segment that is likely to use the full range of search engines.

An additional 15% are (Not Set). That can result from any number of factors, including referrals from other sites, clicks from email links, and so on.

Starting at line 3 of the report, we see the data coming in from other search engines. In fact, for that 30-day period, we see more than 500 unique keyword phrases.

Now, as we mentioned, Google Webmaster Tools (as well as Bing Webmaster Tools) does provide a sample of the search queries that brought users to a site. For the same period, Google Webmaster Tools recorded 81 unique keywords.

And when we compare the keywords available from Google Analytics (which shows the terms people use in other search engines) to the keywords in Google Webmaster Tools (which shows the terms people use in Google), we see some interesting patterns.

First, 32% of the queries used by people in other search engines exactly match the queries people use on Google. So users of other search engines are not some radically different type of human being.

Second, 90% of the ten most frequently used words in those search queries match exactly between Analytics and Webmaster Tools, the only exception being the word “timeline.”
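You can run the same sort of comparison on your own exports. A toy sketch of the exact-match calculation, with invented query lists standing in for the two reports:

```python
# Invented query lists standing in for a Google Analytics export
# (non-Google engines) and a Google Webmaster Tools export.
ga_queries  = {"facebook cover photo", "cover photo size", "facebook timeline cover"}
gwt_queries = {"facebook cover photo", "cover photo size", "create cover photo"}

# Queries used identically on Google and on the other engines.
exact_matches = ga_queries & gwt_queries
overlap_pct = 100 * len(exact_matches) / len(ga_queries)
print(f"{overlap_pct:.0f}% of queries match exactly")
```

Set intersection is all you need for the exact-match figure; comparing individual words (as in the “top ten words” observation) is the same idea run on the split-up queries.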

TL;DR, what does that mean?

All this just means you’re still getting data on how people are searching for the content you create. And while it may not be as complete as it once was, it’s a lot better than zero.

But there is something else we can discover.

Even if you didn’t read the post in this example, can you take a guess on what the title is? We’ll save you the click … it’s:

How to Create a Cover Photo for Your Facebook Timeline

Even without reading the post, you can tell (because of a clear headline that spells out the benefit to the reader) that the author was trying to target people who were looking for ways to create a cover photo in Facebook for the timeline.

Non-surprising fact of the day: the words that people used to find the content aligned with the actual content.

So how do you know people will find this content on search engines? In other words, is there a way to predict the type of search terms people will be using to find this particular piece of content?

Before this post was ever published, we knew the type of keywords and terms people would use to find it.

How? We used Scribe.

Predictive keyword analysis

Analyzing this post in Scribe (our content optimization software) allowed us to know how the search engines would see the content before it was ever published.

First, Scribe gave this page a score of 100 out of 100, showing how well this content aligned to our recommended best practices for SEO copywriting.

Second, Scribe showed a site score of 63 out of 100. That means that this page was a good fit for the Copyblogger site — we weren’t trying to rank for something that was completely outside of what we’re known for.

In other words, if this had been an article about six-pack abs or natural flu remedies, we would have had essentially zero chance of ranking for it. But because it was related to what we already write about, year in and year out, Google figured we probably had something intelligent to say on this topic.

Maybe most important, the two Primary Keywords found in the page were Facebook and Cover Photo. In other words, we were confident that the search engines would see the article the same way we saw it.

If Scribe had discovered different Primary Keywords, that would have meant that the search engines would likely become confused as to what the article was about. Which would have meant a little intelligent tweaking until we were on the same page again.

Not all keywords are created equal

When we created Scribe, we knew that not all words found on a page were equal. So we created a way to rank keywords (and filed a patent on the process).

When a keyword is ranked as Primary, it indicates, with a 95% degree of statistical certainty, that search engines will index the page for the discovered Primary Keywords.

So even before this page was published on the web, Scribe showed that the search traffic to the page would include one or more of the terms Facebook and Cover Photo.

Now, look back at the analysis of the search queries people used.

Notice how many queries contain the Primary Keywords? Every one contains either one or both of the terms.
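That check is easy to reproduce on your own query report. A sketch with invented sample queries (the two Primary Keywords are the ones from the example post):

```python
# The two Primary Keywords from the example post.
primary = ("facebook", "cover photo")

# Invented sample queries standing in for the real report.
queries = [
    "facebook cover photo size",
    "cover photo for facebook timeline",
    "how to make a facebook cover",
]

# True only if every query contains at least one Primary Keyword.
all_match = all(any(term in q for term in primary) for q in queries)
print("every query contains a primary keyword:", all_match)
```

If `all_match` came back False for your real data, that would be the cue for the kind of “intelligent tweaking” described above.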

In other words, Scribe predicted what terms the content would rank for before the content was published.

There’s no substitute for your judgment

Sure, it’s still a good idea to do some keyword research to discover the terms that people in general are using to find content like yours.

Sure, it’s still a good idea to check the partial data we do receive, and see if the traffic is coming in for the terms you think it should come in for.

But at the end of the day, while mechanistically trying to reverse engineer the data has its place, it isn’t even close to the whole story. Abstract analysis should be an enhancement to your judgment as a business owner. Because you know your audience better than anyone else can.

SEO expert Jenny Halasz was recently quoted on Search Engine Land as saying,

There’s no doubt that not having keywords provided will make it a little harder to discover customer intent, but there are a lot of other ways to get clues about that, including actively engaging with your customers on social media and such.

And we’ll leave you with the words of our own Executive VP of Operations, Jess Commins:

We still have general data in Webmaster Tools, Bing, and various data-stitching methods for analysis. But really, it’s not about the keywords bringing people in … it’s about what you do with them when you get them, and whether you deliver on why they came to you in the first place.

Great content will not lose in this battle. And tools like Scribe are even more important now, because they can help writers fine-tune their message for readers, not engines. They’re the ones who matter.

If this year is the year of the writer, changes like Hummingbird signal that next year will probably be the year of the reader. Writers run the show, but shows won’t matter if you can’t keep an audience in their seats.

Editor’s Note: Stay tuned on Copyblogger for more thoughts on the Hummingbird algorithm in the coming weeks …

Reader Comments (56)

  1. says

    Am I the only one who’s always preferred the keyword information provided by Webmaster Tools anyway?

    I like to see both impressions and clicks at the same time. That way, I can get an overview of both people who are clicking through to my site and those who aren’t (though I suspect loads of impressions are just from SEOs doing ranking checks on their sites).

    Maybe it’s because I’ve only been using Google Analytics for a few months and have been used to living with keyword ‘not provided’ from practically day one.

    • says

      Kevin, I would also strongly recommend Bing Webmaster Tools.

      Duane Forrester and his team at Microsoft have developed a strong (and in some ways better) tool system for webmasters with a wealth of keyword data and research features.

  2. says

    My SEO strategy used to have 5 steps:

    1. Research search term “themes”
    2. Create an awesome piece of content that addresses that theme
    3. Publish it, promote it, and see how it ranks
    4. Look at long tail keywords that are driving traffic – I’m always surprised by this!
    5. Optimize the page for the long tail keywords I “accidentally” rank for

    I will no longer be able to do much of steps #4 and #5. The data I will receive will mostly be the “fat head,” not the long tail surprises that I’ve been basing my strategy on.

    • says

      That’s definitely a way to get to audience questions you may not have known about (and you still have around 1/3 of that data), but you can also get those by paying close attention to your comments, and also to the audience questions that come up on social media platforms (on your accounts or others).

      At the end of the day, there’s more than one way to dig deep into your audience and create content that serves their needs. And optimizing for your long tail queries may not even be the strongest way to do that.

      Also, too much long tail research can lead you in the wrong direction. For example, the post Sean used as an example gets a lot of search traffic. But it isn’t a post that does a tremendous amount for our business. What Jess is pointing to in the last paragraph is that a focus on the content that serves your business goals is more important than stumbling across a long tail term that happens to get good traffic.

      The main point is to do strong work with what you do have, because Google is doing what they always do — exactly what they feel like doing. They don’t serve us and they never have. So we have to be resilient enough to bounce back.

      • says

        Couldn’t have said it better myself. I even did a mini fist pump to myself when I read it, well said. Reminds me a lot of how Marcus Sheridan talks about his first steps in content marketing.

        Great piece guys!

    • says

      Hey Hashim!

      Regarding the long-tail assessments, I really loved the way Avinash broke down his methods: http://www.kaushik.net/avinash/google-secure-search-keyword-data-analysis/

      That’s pretty much how I back into them (and view long-tail results) as well. Keyword data hasn’t been reliable for more than “basic trends” for some time — almost to the point where I personally consider them vanity data.

      If you *really* want to know what’s going on, segment, segment, segment. :)

      Chances are good, if you see which pages are getting hit the most by organic traffic, you *know* what you’re trying to deliver on them and can test your variables until it’s converting like a mofo.

      Google *is* getting smarter, and they’re doing a lot more to make searching more effective and efficient. My best advice is segment all the things and use that to analyze the outcomes you previously studied in #4 and #5. Using those reports, your data can tell you stories that are far more intuitive than the keyword samples ever could… and if acted upon, can help you convert far better. :)

    • says

      We do pretty much the same as you. Without knowing where you are accidentally pulling traffic from, it’s harder to market your business!

  3. says

    “They work very hard to think like Google’s algorithms — when they’d be better off thinking like the intended audience.”

    Couldn’t agree more! In my experience, I’ve found that when you think like your audience and put their needs first the SEO work you do is almost always the kind of SEO work the search engines reward. It’s not a game, it’s just about building your brand in real ways.

  4. says


    You’ve said…

    “They work very hard to think like Google’s algorithms — when they’d be better off thinking like the intended audience.”

    Google and their algo want to help us better serve our audience.

    How can you “optimize” your site with content which serves your audience if they hide this data?

    That’s very weird, and it makes no sense.

    I feel like Google has cut off our left hand.

    It’s not that Google confirms our instinct, it’s that they’ve provided (until now) the EXACT keywords people type in to LAND on our site (not just guess, and visit competitors sites).

    You guys don’t get what this means as you definitely didn’t optimize your content @ CopyBlogger on keywords, from day #1 I guess…

    To me, Google (algo and team) is getting either dumber with each day which passes or evil…

    • says

      You guys don’t get what this means as you definitely didn’t optimize your content @ CopyBlogger on keywords, from day #1 I guess…

      Codrut, I’m afraid you’re mistaken. Turn off personalized search and google copywriting, content marketing, email marketing, landing pages, seo copywriting, keyword research and internet marketing. Those are our cornerstone topics, and the ones we intended to rank for.

      You’ll see we do quite well. Not to mention all the traffic we get from related keyword phrases.

      Has it been helpful over the years to see how much traffic those terms bring us? Sure. But it’s not hurting us that we can’t any longer.

      When you do upfront keyword research, you know approximate search volume. And when you see that you’re ranking well for those terms, you know you’re getting traffic.

      Ultimately, traffic to a page doesn’t mean anything. What do you want those visitors to do? How many people are doing that?

      For us, that’s getting the people who visit those ranking pages to register for MyCopyblogger. That’s a meaningful and trackable action, and we track it all day long.

      Too many people have been watching traffic instead of conversion. The thing that’s been truly not provided all along is their bank accounts.

    • says

      You say the Google team is getting dumber, but that seems to assume that they’re there to serve your needs as a publisher. They are not. (They are also getting smarter, not dumber.)

  5. says

    I was wondering what was going on. Sometimes, I’m just clueless! That was a long read, and I’ll have to read it again without the kiddos breathing down my neck to make sure I grabbed everything. But it did answer the question of “what happened?” for me. And, how I can figure out what to do next.

    Honestly, and this may sound stupid, I’ve never put that much stock into the Bing and Yahoo! analytics. Stupid? I think so. I guess my reasoning for this was because Google commands such a large portion of the search traffic that it didn’t matter what the others said.

    Shame on me.

    Thanks again, Sonia and Sean, for the insight.


    • says

      Explain to me why you need this particular data (other than it would be nice)? Isn’t conversion on the page more important than traffic to the page? Yes, you won’t be able to determine conversion rate, but you can track total number of conversions from the page, and tweak and test to raise that total number. That’s what truly matters, but not many are doing it because they’re hung up on otherwise meaningless traffic totals.

      • says

        “First, Google isn’t the only search engine”

        Yes, we have access to queries used on other search engines…for now.

        I agree that measuring conversions is the general goal we should strive for. But I work with government clients whose websites are very informational in nature. Knowing words and phrases the public is using to get to the site can really help to improve content on the site.

        For example, people were searching for “swine flu” and not “H1N1.” The query data helped them adjust their content to be more relevant and useful to the citizens.

        • says

          I get you. Because we use our own Scribe tool, it suggests related keywords based on existing content. So we would get a suggestion of “swine flu” from an article focused on H1N1. Helpful.

        • says

          Ty, you provide a great example. It is important to note that Google and Bing Webmaster Tools will have this data for the site.

          The problem is, in the case of Google, associating the keyword data to a specific page is not easy/possible.

          Bing, however, does have in their Webmaster Tools a Keyword report that shows the page, or pages, served for that keyword.

          Since a government site should have a broad audience reach, the data in Bing should be able to provide a good sample of keywords and pages that Google Analytics does not show.

      • says

        That’s certainly been a speculation. The question is, if you had to pay for it, would it truly be worthwhile? Or are there better ways to get to audience intent?

  6. says

    Thank you! I’ve been waiting for an article like this to come up. The few things to keep in mind are (1) Google is not the only search engine, (2) there are other ways to find keywords, (3) keywords are no longer the new “essential;” rather, the contextual association of those words are, and (4) social media isn’t going anywhere, so interact with your audience more to get what you need.

    The fourth one, I feel, is the most important. Especially if your business depends on how you look online.

    Anyway, thanks for the great post!

  7. says

    Facebook is a better platform than Google, at least for now. It’s cheap, it’s pretty easy to use, and its reach is huge, not to mention the SEO value if you do it properly.

    SEO needs to change the way it thinks and look more towards interest group targeting instead of only keyword targeting if it’s to survive. Matt Cutts has been saying it all along, focus on great content and user experience, it’s the only way to do it. When you do this, the majority of traffic will find you naturally.

    Watch out for Google+; they are the next big opportunity. Google is not going to take FB lying down.

  8. Tim Hardesty says

    Am I missing something in your first screenshot of Google Analytics? I don’t see any quantitative data showing that 68% of the queries are (not provided), or the 33 percent figure. Should there be another column (or two) showing?

  9. says

    I have a semi-related question. Our website is getting ready to launch a major refresh. Included in that is new forum software that lets us tie easily to Google Authorship. Now, we have roughly a decade worth of archived articles (part of our defunct Zine) that we’ll be re-submitting with Authorship and 301 redirects. I’ve been hesitant to update titles because I don’t want to lose the Google juice many of these articles already enjoy, but my God, the titles for the bulk of them are horrid, and at times, downright confusing. Suffice it to say, precious few of them have strong organic keywords in the actual title. (Live and learn, lol)

    Is it better or worse to create new headlines after the fact?

  10. says

    >>>Is it better or worse to create new headlines after the fact?


    >>>I’ve been hesitant to update titles because I don’t want to lose the Google juice many of these articles already enjoy … precious few of them have strong organic keywords in the actual title.

    What do you have to lose in terms of “Google juice” then? 😉

  11. says

    The issue I have is that I was using this data to find the stupid MONEY-WASTING keywords that people used to get to my site through AdWords. I would see my ads were showing up for things I didn’t even sell, then mark those as negative keywords in Google. NOW I have no way of knowing how much money Google is stealing from me, now that they figured out that if they don’t show the keywords, people can’t prevent the AdWords ads from showing up.

    This is just a game by Google to extract as much money as possible through shady ways. Anyone remember when 15-20 sites from the same company would show up in search results, and the only way to get your site seen was to purchase AdWords ads? Another shady practice that made Google billions in revenue.

    Doesn’t matter though, you need google and they can do whatever they want.


      • says

        @ Brian – I like to think I have a firm understanding of Google stuff, but please tell me where I can see which keywords brought people to my site in AdWords.

        I use an outside site to track the stupid keywords that Google would show my ads for, and now it says (not provided), which is what happens when someone is signed into their Google account and searches. There is no way that I know of to see what keywords actually brought people to my site through AdWords. They show me the keywords I bought, but not the additional keywords users added, which leaves me with something that has zero meaning for me.

        Not going to give away what I’m selling, but let’s say I sell a white board. People searching for white cars are clicking on my ads. I used to see that “car” in my stats caused people to click on my ad, so I added “car” as a negative keyword. Then people search for white cat and my ads show up. I add “cat” to the negative list. Etc., etc.

        Now I see people who are signed into Google as (not provided); they click on an ad and bounce right back off in less than five seconds, telling me that the person isn’t in the right place.

        So Google is getting millions of clicks like this, in millions of ads (at $2-6+ a pop), every day. So they figured out that if they hide these keywords so they can’t be added to the “negative” keyword list by advertisers, they all of a sudden don’t have to worry about it…

          • says

            @ Ryan – No, they don’t give you the exact keywords, only what they want you to spend money on. Google doesn’t give you 100% of what people type to get to your site. That is why I used a completely separate monitoring tool, so that I could see what to block. Google isn’t stupid, and when they saw people like me blocking these garbage keywords, they didn’t like it, and now they don’t show them.

            I just compared what my third-party site has on keywords that brought people to my site through Google with what Google provides, and it leaves out a lot of data. I’m not sure you understand completely if you keep trying to tell me I can get the data. Google doesn’t give 100% of the keywords because it would cost them a lot of money. I see, and have seen, both sides of the tracking, because I don’t trust them spending thousands of dollars as quickly as they possibly can.

            There is a reason Google is where they are today, and it isn’t by giving people the info they know makes them a lot of money. Anyone know how their algorithms work?

        • says

          Paul – I think below is what you are looking for, but if I misunderstood, please tell me. I’ve been running PPC campaigns since the Overture days and live and breathe it daily, so hopefully I can help…

          While in your AdWords account, here’s the sequence to find out not just what keywords were “triggered” by searches, but what those searches were:

          “Campaign” tab (big green bar)

          “Keywords” tab

          “Details” drop-down (below the graph that is there by default)

          “Search Terms –> All” (or “selected” if that applies)

          If you already knew about this, my apologies. But this is something that 90% of the people I talk to don’t know about so I wanted to mention it.

          Also, this solution has its own drawbacks…and I know this doesn’t solve the SEO side of things which is the whole idea of this post, but what the heck!


          • says

            @ Scott:
            I did know about this, but unfortunately, it still doesn’t show what I was using the third-party tracking service for. Google still leaves out the additional keywords that were used to get to the ad. It does tell you which keyword brought the person to the site, but not the part I want to see, so that I can either accept it as a good keyword or mark it as a negative keyword. I just compared those again to my third-party info and it isn’t there.

            I do appreciate you trying to help, but I am also older and have been doing this a long time. It isn’t my only job function, so I may miss some things, but this was a huge issue for me when Google started to hide what people are searching for.

            Google is a love-hate relationship. You cannot do business without them, and you love the business they have brought, but you hate that you have to monitor and adjust things on a constant basis to be effective.

            For the people who are looking at Bing tools, I would make sure Bing is bringing people to your site before spending any time with their monitoring or webmaster tools. I spent a ton of time trying to use Yahoo’s tools back in the day, and it was really a waste of time for my products. I’d need more than 5% of my traffic to come from Bing to even spend any time on them. Google is really the only place I needed to monitor after seeing my stats.

  12. says

    (Not Provided) is kind of a drag, since what you do with the social connections you generate on G+, FB, and the other top social networks will impact conversion rates. Google+ Circles lead to G+ Communities, which lead to where the real action is!

  13. says

    Being new to blogging, I have focused on finding my voice and providing quality content. I use Google Analytics, but my main focus is on building a decent amount of quality content. One pitfall I have seen this year is that some bloggers fixate on their analytics so much that it becomes a case of “paralysis by analysis.” It does not happen to everyone, but I think it does play a part in the high attrition rate among bloggers.
    I am trying to drive traffic, but to Brian’s point, traffic without conversion is not a sustainable model if you are using a blog as part of your business model. Once I am established, I would be more worried about conversion rates than just a lot of traffic with no engagement. Just the two cents of a relative beginner.

    • says

      Absolutely agree. Obsession with mechanistic analysis can lead you to completely overlook the larger picture. Which is why the (Not Provided) development is annoying, but in the larger picture, working around it may actually lead to a better business result.

  14. says

    Good content shared in the right context will allow the cream to rise to the top. I like your comments re using your own “good judgment” to connect.

    BTW, best way to get new customers is personal referrals. They are referred. They go to the link provided. They look, read and watch. They like. They call or click and buy. No random fishing type search necessary. Good products or services => referrals => ball’s in your court => done deal. Just saying… :-)

  15. says

    Well, why is Google taking that info away from us?

    You said this would be “a major bummer for those of us who want to give web searchers a great experience with our wonderful content.” Is Google implying they don’t want us to do this?

    Obviously they don’t think we need keywords to create great content, and perhaps they’re even telling us they don’t like that. SEO is just another fancy word for manipulation, after all, and Google doesn’t like that.

    So they’re undermining themselves and their own search function then, right? Well, perhaps, but I’d bet they’re working on stuff we don’t know about. Maybe they should start buying up some headache medicine manufacturers.

    Like the article says, we don’t need to freak out.

    • says

      Remember, you can still do keyword research. That’s where you find the language of your audience for content purposes.

      All that’s happened here is that you can’t tell exactly how much traffic you get per keyword. Annoying, but not as bad as if we had no access to search engine query data in the first place.

  16. says

    The information should also be in your web server logs. Unfortunately, many hosting companies hide this info to force you to pay for access to what should normally be free.

    Your mileage may vary, but if someone clicked a link on a search engine results page to get to one of your pages, the “referring URL” will normally be in your server logs. And since, at least in the case of Google, the search term used will be part of the Google referring URL, you have what you need.
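    For engines that do still send the query in the referrer, extracting it from a logged URL is straightforward. A sketch (the referrer below is made up; Bing and Google historically used the `q` parameter, and older Yahoo URLs used `p`):

```python
from urllib.parse import urlparse, parse_qs

def search_term_from_referrer(referrer):
    """Pull the search query out of a search-engine referrer URL, if present."""
    params = parse_qs(urlparse(referrer).query)
    terms = params.get("q") or params.get("p")  # q: Google/Bing; p: older Yahoo
    return terms[0] if terms else None

# Made-up referrer for illustration
print(search_term_from_referrer("https://www.bing.com/search?q=facebook+cover+photo"))
```

    The function returns None when the referrer carries no query parameter, which is exactly what you will see for traffic from engines that have stopped passing it.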

    • says

      That used to work, but Google is no longer passing the referral data to your server. If it were, we’d be seeing a surge of Google Analytics clones that made use of this data. The search terms aren’t being passed at the most fundamental level.

      It’s not that they just aren’t being displayed in GA, it’s that they aren’t even tracked anymore.

  17. says

    I still believe that if you have amazing content worth sharing, you’re going to be OK. I think that people focus way too much on technology and not on what matters.

    You’re talking to people! Not Google!

    Fantastic post. That explained it in a very down to earth way. Thanks…

  18. says

    I am pretty new to SEO and couldn’t follow everything in the post. However, Google has been making things more and more difficult: first taking away their keyword tool, and now this. I will give the Bing Webmaster Tools a try!

    • says

      Gavriel, we have free SEO and keyword research materials for you (sign up in the box to the right) that will help you focus on the elements that actually create results. :)

  19. says

    Hello Sonia and Sean,

    Google isn’t the only search engine, and there are still keywords provided?
    A lot better than zero?
    Yes, but this is a short-term solution, and you improvise with what you already have. Finding what the customers want becomes more of an art and an educated guess than a science.

    What really matters is to find other ways to discover customers’ intent, their needs and wants. Ways other than keywords. And it’s not only about finding other ways to discover people’s needs and wants, but also about finding other ways to teach small business owners how to do it. This is the real challenge.

    We are at the dawn of a new age, indeed. Online marketing will become more sophisticated than before. Far more sophisticated. I think a new paradigm will emerge.

    Until then we are left on our own, fishing in murky waters trying to find ways to catch the prey.

    Have a wonderful day

  20. says

    Well, you are actually right as far as the US is concerned, but internationally Google’s market share (except for Russia and China) is far above 90%. For example: I generate 30 visitors per day through search engines, and 100% of them come through Google. That’s the difference between Europe and America. Bing and Yahoo haven’t managed to take shares off Google yet.

    So the (not provided) thing is a big concern for me because, as you said, the Webmaster Tools aren’t too useful 😉
