Google Just Quietly Rewrote Your Search Terms Report, and Most Advertisers Haven't Noticed

There's a habit I have. Every Monday morning, I pull search-term reports for my clients and read what real people typed into Google. It's the single most honest piece of data in the whole Google Ads ecosystem. People type what they want. You can see it. You can act on it.

Well — used to be able to.

Last week Google quietly updated how Search Terms are reported for AI Mode, AI Overviews, Lens, and autocomplete queries. They didn't put out a press release. They didn't flag it in the Google Ads UI. The story broke in Search Engine Journal on 13 May and it's been picked up across the PPC community since. If you've felt your search-term reports getting a bit fuzzier lately, you're not imagining it.

What actually changed

In the old world, a search term was a search term. Someone typed something, your ad showed, and you saw exactly what they typed in your report. Simple.

In the new world, a lot of queries don't start as typed strings anymore. Someone asks AI Mode a question. Someone searches visually with Lens. Someone takes an autocomplete suggestion. Google now has to decide what "search term" to record for those events, and the answer is sometimes a paraphrased, interpreted version of what the user actually did.

That's a polite way of saying: the column in your spreadsheet labelled "search term" is no longer the literal thing the user did. It's Google's best guess at the intent.

Why this matters for your business

If you sell physical products through Google Ads, you've probably done one of these things in the last six months:

  • Built a negative keyword list based on irrelevant search terms showing up in your reports
  • Created a new ad group because you spotted unmet demand in the long tail
  • Argued with a supplier about whether a product line was worth keeping based on what people were actually searching for

All of those decisions rely on the search-term report being accurate. If Google is now serving you an interpreted version of the query, your decisions are built on Google's interpretation, not the user's actual words. There's a small but real layer of judgement now sitting between you and your customer.

It's not the end of the world. It's a quiet shift in trust. And these things compound.

This isn't the only Google Ads transparency change this month

The search-term change is one of three reporting moves Google has made in the last fortnight, all pulling in the same direction.

Historical data limits. Google is imposing new restrictions on how far back you can pull reporting data, both in the interface and through the API. The full timeline hasn't been confirmed for the UK yet, but anyone running monthly reports that compare year-over-year should be quietly worried. I am. The fix — for us at least — is BigQuery. We've been archiving daily ads, GA4 and Merchant Center data into BigQuery for over a year for exactly this kind of moment. If you don't have a long-term data archive set up, this month is a good time to start.
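If you're starting an archive like this from scratch, the core of it is unglamorous: take each day's stats and flatten them into rows BigQuery can ingest. A minimal sketch below, with plain dicts standing in for google-ads API result rows; the field names and table schema are illustrative assumptions, not a fixed format. In production you'd feed this the output of a GAQL `search_stream` call and load the rows with the `google-cloud-bigquery` client's `insert_rows_json`.

```python
from datetime import date

# Hypothetical flattener: turns one day's campaign stats (plain dicts,
# standing in for google-ads API rows) into the flat JSON rows that
# BigQuery's insert_rows_json() expects. Field names are illustrative.
def to_bq_rows(snapshot_date: date, campaign_stats: list[dict]) -> list[dict]:
    rows = []
    for stat in campaign_stats:
        rows.append({
            "snapshot_date": snapshot_date.isoformat(),
            "campaign_id": stat["campaign_id"],
            "impressions": stat["impressions"],
            "clicks": stat["clicks"],
            # Google Ads reports cost in micros; store both for convenience.
            "cost_micros": stat["cost_micros"],
            "cost": stat["cost_micros"] / 1_000_000,
        })
    return rows

rows = to_bq_rows(date(2025, 5, 12), [
    {"campaign_id": 1, "impressions": 1200, "clicks": 85, "cost_micros": 42_500_000},
])
print(rows[0]["cost"])  # 42.5
```

The point isn't the code, it's the habit: once this runs daily, Google's retention limits stop being your retention limits.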

GA4 added an AI Assistant channel group. Slightly less concerning, slightly more useful: Google Analytics has rolled out a new default channel group that separates AI assistant traffic from generic referrals. So ChatGPT, Perplexity, Gemini and friends now appear as their own channel. That's a small win. It means we can finally measure how much traffic an AI tool sends you — and more importantly, whether that traffic actually converts.

Search-term reporting interpretation. The one we started with.

Taken together, the message is clear. The Google Ads reporting layer is being rewritten around AI, and the granularity advertisers used to take for granted is going to be in shorter supply.

What I'm doing about it for my clients

A few practical moves over the next few weeks:

1. Pull a long-history check. I'm running a multi-year GAQL query on a sample of accounts to find the exact cut-off where historical data stops being available. We need to know whether it's 24 months, 18 months, or something else, and we need to know it before someone asks for a year-over-year report and discovers the data isn't there.

2. Tighten the BigQuery archive. Some clients have full daily backfills. Others don't yet. Anyone running serious year-over-year analysis needs this. It's not glamorous and it's not exciting, but it's the only durable answer.

3. Use the AI Assistant channel group properly. Now that GA4 separates AI traffic, we can actually start to answer the question I've been asked five times this year: "is ChatGPT sending me any business?" For most clients, the honest answer six months ago was "we can't really tell." Now we can.

4. Be honest with clients when search-term reports look odd. If a campaign suddenly shows weird-looking search terms next week, the first question used to be "is there a broad match leak?" Now there's a second question: "is this Google's interpretation of an AI Mode query?" Worth pausing before pruning.
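The long-history check in step 1 doesn't need anything clever: generate one small GAQL query per month, walk backwards, and note the first month that comes back empty. The GAQL below is the standard campaign report; the probing loop is a sketch of my approach, and actually running each query (via `search_stream` on the google-ads Python client) is left out.

```python
from datetime import date, timedelta

# Build one GAQL query per month, walking backwards from `start`.
# Probe each month separately: the first month that returns no rows
# marks the account's historical cut-off.
def monthly_probe_queries(start: date, months_back: int) -> list[str]:
    queries = []
    year, month = start.year, start.month
    for _ in range(months_back):
        first = date(year, month, 1)
        # Last day of the month: 1st of the next month, minus one day.
        next_first = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        last = next_first - timedelta(days=1)
        queries.append(
            "SELECT campaign.id, metrics.impressions, segments.date "
            "FROM campaign "
            f"WHERE segments.date BETWEEN '{first}' AND '{last}' "
            "LIMIT 1"
        )
        month -= 1
        if month == 0:
            year, month = year - 1, 12
    return queries

qs = monthly_probe_queries(date(2025, 5, 1), 36)
print(len(qs))  # 36
print(qs[0])
```

`LIMIT 1` keeps each probe cheap; you only care whether rows exist, not what they say.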
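For step 3, it's worth being able to sanity-check what GA4 puts in the new channel. This sketch buckets a referrer URL the way I'd eyeball it by hand; the domain list is my own assumption, not GA4's actual rule set, which isn't reproduced here and will change over time.

```python
from urllib.parse import urlparse

# Hand-picked AI assistant referrer domains: an assumption for
# cross-checking, not GA4's real channel definition.
AI_ASSISTANT_DOMAINS = {
    "chatgpt.com", "chat.openai.com",
    "perplexity.ai", "www.perplexity.ai",
    "gemini.google.com", "copilot.microsoft.com",
}

def classify_referrer(referrer_url: str) -> str:
    """Rough channel bucket for a session's referrer URL."""
    host = urlparse(referrer_url).netloc.lower()
    if not host:
        return "Direct"
    if host in AI_ASSISTANT_DOMAINS:
        return "AI Assistant"
    return "Referral"

print(classify_referrer("https://chatgpt.com/"))      # AI Assistant
print(classify_referrer("https://example.com/blog"))  # Referral
print(classify_referrer(""))                          # Direct
```

Run something like this over a raw referrer export and compare the counts against GA4's channel report; big gaps tell you the channel definition and your mental model disagree.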

The bigger picture

A pattern I keep coming back to: the more AI sits between users and your ads, the less raw data you get to look at. Google's whole pitch with Performance Max is "trust us, we'll show your ads to the right people." It works, mostly. But you can't audit it the way you could audit a Search campaign in 2018.

The search-term report has been one of the last places where you could still hear the customer in their own words. If even that's being softened by AI interpretation, the question for the next year isn't going to be "how do I read these reports?" It's going to be "where else can I hear what my customers actually want?"

My honest answer: customer service logs, post-purchase surveys, on-site search data, product reviews, and the Search Console query data you can link into GA4 (when the link is set up). The bit of the funnel where people use their own words is still there. It just isn't in your ads dashboard anymore.

I'll be writing more about all of this over the next few weeks. If you're running an e-commerce account and you want a fresh pair of eyes on what's been happening with your reporting layer, that's exactly the sort of thing I'm here for. Drop me a line.

Peter