
Code Pocket

Posted on • Originally published at westoeast.com

12 client portfolios, 12 months post-AIO: the traffic data

Every few weeks someone forwards me a LinkedIn post that says AI Overviews killed SEO. The post usually has a screenshot of one site's traffic chart and a caption that reads like a eulogy. I have a folder of them now. I save them because the data underneath the claim, when I've been able to see it, almost never supports the eulogy.

This isn't a defense of SEO as it was. It's a request for more precise language about what's actually happening, because the cost of the imprecise version is that marketing teams are making budget decisions based on vibes.

What we see in the data we can see

Across the 12-client portfolio we track, organic traffic from Google in the 12 months following the general rollout of AI Overviews (AIO) looks like this, in rough terms:

  • 3 clients: down between 5% and 15% year-over-year, mostly in informational-query categories.
  • 5 clients: roughly flat (within ±5%).
  • 4 clients: up between 8% and 30%, mostly in transactional and product-focused query categories.

The aggregate, weighted by traffic volume, is approximately flat to slightly down (about -3%). That is not a dead channel. That is a channel that's redistributing.
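The weighting matters here, so it's worth making explicit. A minimal sketch of the traffic-weighted aggregate, using illustrative client volumes and YoY changes that match the buckets above (these are placeholders, not the portfolio's actual figures):

```python
# Hedged sketch: traffic-weighted YoY aggregate across a 12-client portfolio.
# Volumes and changes are illustrative placeholders that match the buckets
# above, not the portfolio's actual data.
clients = [
    # (annual organic sessions, YoY change)
    (300_000, -0.15), (250_000, -0.10), (200_000, -0.06),  # 3 down
    (150_000, 0.02), (120_000, -0.03), (100_000, 0.04),
    (90_000, -0.01), (80_000, 0.00),                       # 5 roughly flat
    (60_000, 0.08), (50_000, 0.15), (40_000, 0.22),
    (30_000, 0.30),                                        # 4 up
]

def weighted_yoy(portfolio):
    """Average YoY change, weighted by each client's traffic volume."""
    total = sum(volume for volume, _ in portfolio)
    return sum(volume * change for volume, change in portfolio) / total

print(f"portfolio-weighted YoY: {weighted_yoy(clients):+.1%}")
```

Note that with these placeholder numbers the unweighted average of the twelve changes is actually positive; a few high-volume, informational-heavy clients drag the weighted figure below zero, which is the mechanism behind "flat to slightly down."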

The informational-query traffic loss is real, and it tracks with what you'd expect: queries that get fully answered in the AIO box don't generate clicks. We've watched specific pages lose 40-60% of their click-through from positions where they used to draw consistent traffic, even when their average position didn't change. Position 1 in a world with AIO is not the same artifact it was in a world without it.

But the inverse is also true: pages that are cited within the AIO box (linked sources) sometimes show higher click-through than they did at their old rankings, because the citation acts as an endorsement. We don't have enough cited-vs-not data to make that claim strongly across the portfolio yet, but we've seen it on individual pages clearly enough that I'm willing to say it in print.

What the per-page picture looks like

To make the redistribution real, here's what we typically see when we pull a client's organic traffic and segment it by page intent over the 12-month window post-AIO.

Informational pages (the "what is X" and "how does Y work" type) are down somewhere between 15% and 40% in click-through traffic, with the wider losses on pages that target queries where AIO produces a clean direct answer. Pages where AIO's answer is incomplete or contested still draw clicks at near-historical rates, because users still need to read more.

Comparison pages ("X vs Y") are mixed: down modestly on the queries where AIO has confidently picked a winner, flat to up on queries where AIO presents both options and lets the user choose.

Product, pricing, and demo pages are mostly flat to up. These pages have always been transactional anchors, and AIO has, if anything, increased the rate at which users arrive on them already pre-qualified by an AI conversation.

Brand pages (about, careers, leadership) are quietly up across most of our portfolio, which we tentatively attribute to increased brand-query volume driven by AI surfaces.

Why the narrative is so dramatic

Two reasons, I think.

First, the loss is concentrated in a specific kind of page (informational, long-tail, FAQ-style content that used to win on volume) and that kind of page is over-represented in the dashboards marketing teams check. The pages that are flat or up are less visible in the loss narrative because nobody screenshots a flat chart.

Second, the timing coincided with a few unrelated Google updates that compressed organic visibility independently of AIO. Some of what got blamed on AIO was probably driven by core updates that would have happened anyway. Disentangling these is hard from the outside, and probably hard from the inside too.

One thing we got wrong in our own writing

In a piece we wrote in mid-2025, we used the phrase "AI Overviews compress click-through across the board." Looking back at the data twelve months later, that claim doesn't survive. Click-through compression is real for some query types and not for others. Saying "across the board" was sloppy. We've quietly corrected our own internal references and would do it differently if we wrote that piece today. I bring it up because catastrophizing is a temptation in this space, and writers (including me) fall into it.

What teams should actually be measuring

In our testing, the metrics that have replaced "rank tracking" as the useful indicators are:

  • Citation tier on key queries (the A/B/C/D/E framework I keep mentioning).
  • Click-through from AIO appearances when cited (when you can isolate this in GSC).
  • Branded-query growth as a proxy for awareness gains driven by AI surfaces.
  • Direct and referral traffic shifts on pages that have started showing up in AI citations.

None of these are as easy as rank tracking. All of them are more informative.
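One of these, branded-query growth, is straightforward to approximate from a Search Console query export. A hedged sketch, assuming a rows-of-dicts export with `query` and `clicks` columns and a hypothetical brand term (all numbers illustrative):

```python
# Hedged sketch: branded-query share from a Search Console query export.
# The column names ("query", "clicks"), the brand term "acme", and the
# numbers are all assumptions for illustration.
def branded_share(rows, brand="acme"):
    """Fraction of clicks arriving via queries that contain the brand name."""
    total = sum(r["clicks"] for r in rows)
    branded = sum(r["clicks"] for r in rows if brand in r["query"].lower())
    return branded / total if total else 0.0

period_1 = [
    {"query": "acme pricing", "clicks": 120},
    {"query": "what is workflow automation", "clicks": 400},
]
period_2 = [
    {"query": "acme pricing", "clicks": 180},
    {"query": "acme vs competitor", "clicks": 90},
    {"query": "what is workflow automation", "clicks": 250},
]

growth = branded_share(period_2) - branded_share(period_1)
print(f"branded share: {branded_share(period_1):.0%} -> {branded_share(period_2):.0%}")
```

Comparing a pre- and post-AIO window with the same classifier gives a rough awareness proxy. Substring matching is crude; a real version would want a curated list of branded terms and misspellings.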

Small n caveats

Twelve clients are not a representative sample of the internet. Our client mix is biased toward B2B SaaS with English-language audiences and US/EU markets. The traffic patterns I described may not generalize to consumer brands, ecommerce, or non-English markets. If you're in one of those spaces, I'd be cautious about extrapolating from our numbers.

The agency I work with has been pretty stubborn about not declaring SEO dead, and I'd be lying if I said that was purely an analytical position. We have clients whose SEO budgets pay our bills. We try to be honest about that bias and to let the data lead. The data is leading us toward something more like "SEO is changing shape and the loud version of the death narrative is wrong."

What "redistribution" looks like at the page level

I want to make the redistribution concrete with one anonymized example, because aggregate numbers can hide where the action actually is.

Take a hypothetical B2B SaaS client with about 400 indexed pages. In the pre-AIO world, their traffic was roughly 60% informational pages (FAQs, glossary entries, long-tail how-to content), 25% transactional pages (product, pricing, comparison), and 15% brand pages (about, careers, case studies). Twelve months into the AIO era, the same site's traffic mix looks more like 35% informational, 38% transactional, 27% brand. Total volume is roughly flat, which means informational lost just over 40% of its absolute traffic while transactional and brand both grew.
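The mix-shift arithmetic can be checked in a few lines. The shares are the hypothetical client's from the example above, with total volume held flat at 100 units:

```python
# Hedged sketch of the mix-shift arithmetic from the hypothetical example.
# Shares are illustrative; total volume is held flat at 100 units.
pre  = {"informational": 60, "transactional": 25, "brand": 15}
post = {"informational": 35, "transactional": 38, "brand": 27}

# Per-segment change in absolute traffic, given a flat total.
deltas = {seg: (post[seg] - pre[seg]) / pre[seg] for seg in pre}
for seg, delta in deltas.items():
    print(f"{seg}: {delta:+.0%}")
```

With a flat total, the informational bucket comes out around -42% while transactional and brand grow by roughly half and four-fifths. The headline number (flat) and the per-segment story (a large informational loss) are both true at once, which is exactly why aggregate charts mislead.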

That's redistribution, not death. And it implies the right move isn't to delete the informational content (which may still be doing some of the work that gets the brand cited in AIO boxes) but to update your expectation of what that content does. It's not a top-of-funnel traffic engine the way it used to be. It might be an AI-citation feeder. Those are different jobs. The page can sometimes do both.

What's actually fixable

If your traffic is down and you're convinced AIO is the cause, the first question I'd ask is whether the loss is concentrated in informational query pages or distributed across all page types. The former is the AIO effect. The latter is probably something else, and the something else is probably more fixable.
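A minimal sketch of that concentration check, assuming you can tag each page with an intent label (the page data here is illustrative):

```python
# Hedged sketch: is the YoY loss confined to informational pages, or spread
# across every page type? Page data is illustrative, and the intent labels
# are assumed to already exist in your analytics export.
pages = [
    {"intent": "informational", "yoy": -0.35},
    {"intent": "informational", "yoy": -0.20},
    {"intent": "transactional", "yoy": 0.05},
    {"intent": "brand", "yoy": 0.10},
]

def yoy_by_intent(rows):
    """Average YoY change per intent bucket."""
    buckets = {}
    for row in rows:
        buckets.setdefault(row["intent"], []).append(row["yoy"])
    return {intent: sum(vals) / len(vals) for intent, vals in buckets.items()}

by_intent = yoy_by_intent(pages)

# Loss confined to the informational bucket -> consistent with an AIO effect.
# Losses in every bucket -> probably something else, and more fixable.
concentrated = (by_intent["informational"] < 0
                and all(v >= 0 for intent, v in by_intent.items()
                        if intent != "informational"))
```

The point of the check isn't precision; it's to stop "AIO did it" from absorbing losses that a routine technical audit would explain.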

The "something else" we keep finding in audits is some combination of: technical issues that compounded during the past year while the team was distracted by AI, content cannibalization between pages targeting overlapping intent, link equity that drifted because of internal site restructures, or category-specific Google updates that the team missed because they were watching their AIO appearance rate.

None of those are AIO. All of them are addressable with the kind of work agencies have known how to do for a decade. The dramatic narrative is hiding the boring fixes, which is the worst form of distraction.

What I want clients to ask us

If you're a marketing leader hearing pitches from agencies about AI search, the question I'd want you to ask is: "Show me the channel redistribution for a comparable client of yours, broken out by page type." If the answer is a hand-wave or a single screenshot of one chart, the agency hasn't done the work. If the answer is segmented and includes pages where traffic went up as well as pages where it went down, the agency probably has.

That's not a magic question. It's just a question that's hard to answer with vibes.

The honest path forward isn't a eulogy. It's an audit.


This field report was published by **westOeast**, a B Corp certified marketing agency working on generative engine optimization for B2B SaaS. The methodology, framework, and data described here come from internal audits at westOeast across our client portfolio in 2025-2026. More field notes at westoeast.com.
