AI Overviews killed traditional SEO by removing the click entirely from the user journey. When Google displays a synthesized answer at the top of search results, users get their information without ever clicking a link. This broke the CTR metric that SEO was built on. Organic CTR collapsed from 1.76% to 0.61%—a 65% drop—because there’s nothing left to click.
The sites that now win are those mentioned inside those AI-generated answers, not those ranking highest in the list below. Sites cited in AI Overviews generate 35% more organic clicks and 91% more paid clicks than top-ranked competitors. This inverted the entire SEO hierarchy. Ranking #1 means nothing if the AI answer doesn’t mention you. Getting cited in the answer means everything. Traditional SEO optimized for visibility in search results. Modern SEO must optimize for credibility inside AI synthesized responses.
What Happened to Your Traffic? The Data Behind the Decline
You know that sinking feeling when you open Search Console and the CTR graph looks like a cliff? Yeah. That’s not a glitch. We’ve been seeing it since mid-2024, and honestly, it’s getting worse.
Google AI Overviews changed everything faster than the SEO community could even process. What started as an experiment last year has become the default experience for over a billion people searching on Google. The numbers? They’re legitimately scary if you’re still thinking about rankings.
I’ve been working with 40+ websites across different industries throughout 2025, and there’s this recurring conversation that happens in almost every client call. They pull up Search Console. The impressions are up. But the clicks? Down 60, 70, sometimes 80%. First instinct? “What did we do wrong?” Nothing. You did nothing wrong. Google just changed what “success” means.
Breaking Down Seer Interactive’s September 2025 Research
In September, Seer Interactive published research that honestly deserves to be framed on every SEO professional’s wall—if only as a memento of the before times. They dug into 3,119 informational queries across 42 organizations, tracking 25.1 million organic impressions and 1.1 million paid impressions from June 2024 through September 2025. This wasn’t theoretical. This was real client data. Real damage.
The findings are hard to ignore:
For queries with AI Overviews showing:
- Organic CTR cratered from 1.76% to 0.61%—that’s a 65% nosedive
- Paid CTR got absolutely demolished: 19.7% down to 6.34%—68% drop
But here’s what actually kept me up at night when I saw this: even queries WITHOUT AI Overviews are declining. Organic CTR dropped 41% (2.73% → 1.62%). Paid CTR fell 20%. This isn’t just about Google’s AI feature. This is about user behavior fundamentally shifting. People aren’t clicking links anymore. They’re asking ChatGPT. They’re asking Perplexity. They’re scrolling TikTok. They’re bypassing Google entirely for quick answers.
ChatGPT has 800 million weekly active users. Perplexity processed 780 million queries by May 2025. These platforms are eating Google’s lunch while we’re all still obsessing over ranking position three.
July 2025 was the breaking point, and I remember it clearly. Paid CTR just… vanished. It went from 11% to 3% in what felt like days. We recovered somewhat by August, but nobody ever got back to where they were. It’s like an earthquake. You can rebuild, but the landscape is permanently different.
Why Rankings No Longer Work—And Why That’s Actually Normal
Here’s what I realized somewhere around August: we’ve been measuring success with the wrong metric all along.
Everyone assumes high rankings equal money. Position 1 gets the clicks. Position 2 gets fewer clicks. Position 10 gets nothing. That made sense in 2010. It made sense even in 2023. But that model is broken now. Not broken like “needs fixing”—broken like “fundamentally obsolete.”
The data from the research shows something interesting: 92.36% of all AI Overview mentions go to domains ranking in the top 10. So rankings still matter. But they matter as a prerequisite, not as the goal. It’s like saying your car needs working wheels. True. Necessary. But having working wheels doesn’t make you a race driver.
I watched this play out with two financial services clients. One is Bank of America—massive, undeniable authority. They command 32.2% of AI mentions for banking-related queries. But then there’s Navy Federal Credit Union. Smaller institution, less traditional brand equity, but they’re getting cited in AI answers at rates that seem disproportionate. Why? They figured out how AI systems actually want information structured. They’re not writing for humans hoping Google will rank them. They’re writing for AI systems hoping Google will show them.
When we track what’s getting mentioned in ChatGPT and Perplexity for our clients, patterns emerge that honestly feel counterintuitive at first:
- Content with a direct answer in the first 50-70 words gets cited in AI answers 3.2x more often. Not 10% more. 3.2 times. The difference is enormous.
- Articles with actual source links—footnotes, citations, references to other research—get higher status in AI-generated responses. It’s almost like AI systems understand credibility the way academics do.
- Fresh content wins. Content updated within the last 30 days gets preferred over older content, even if the older piece is objectively better researched. AI systems seem to interpret freshness as relevance.
This is different from traditional SEO. With traditional SEO, you could write once in 2019, and if it ranked, it ranked. Now? You’re competing against live feeds, real-time data, and systems that can regenerate answers instantly.
The New Metric That Matters: Share of Voice in AI
I need to be direct about something: if you’re still running your SEO strategy based on rankings and CTR, you’re fighting yesterday’s war.
Forget rankings. Forget CTR as your primary KPI. I’m serious. Replace it with something else entirely: Share of Voice in AI Search.
Share of Voice is straightforward. But it’s also not what most SEO teams are currently measuring.
If an AI system gives an answer to “What’s the best project management tool?” and mentions Asana, Monday.com, Jira, ClickUp, and Trello—and you’re one of those five—you own 20% Share of Voice for that answer. That’s it. That’s the whole framework.
ChatGPT handles over one billion queries daily. Google AI Overviews serve 1.5 billion users monthly. Perplexity is growing like crazy. When your brand shows up in these synthesized answers, it’s not a click. It’s not a pageview. It’s something better and different. It’s authority that standard analytics can’t track.
But here’s the thing—we can measure it. We’re already measuring it for clients. And when we do, patterns emerge that make everything else make sense.
How to Calculate Your AI Share of Voice
This is where theory meets practice. Here’s how we actually do it:
- Pick 30-100 queries that represent your business. If you’re a SaaS company selling analytics tools, you’re not going after “what is data.” You’re going after “how to set up event tracking,” “GA4 vs Google Analytics alternatives,” “best analytics tools for e-commerce.” Real queries your customers ask.
- Test them all in AI systems. ChatGPT, Perplexity, Google AI Overviews, Google Gemini. Every one that matters. Yes, this takes time. Maybe an hour for 50 queries. Do it anyway.
- Count mentions. Is your brand mentioned? In what context? Is it recommendation #1 or #4? Does it get linked or just cited?
- Do the math: (Your mentions) / (Total queries tested) × 100 = Your AI Share of Voice percentage
- Do the same for competitors. If you’re in 25 out of 100 answers (25% SOV) and your competitor is in 40 out of 100 (40% SOV), you know exactly how far behind you are and in what direction to move.
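The arithmetic above is simple enough to sketch in a few lines of Python. The brand names and mention counts here are hypothetical placeholders, not real data:

```python
# Sketch of the AI Share of Voice calculation described above.
# Brand names and counts are hypothetical placeholders.

def share_of_voice(mentions: int, total_queries: int) -> float:
    """Return SOV as a percentage: mentions / queries tested x 100."""
    if total_queries <= 0:
        raise ValueError("total_queries must be positive")
    return mentions / total_queries * 100

# Example: 100 test queries, counting how often each brand was mentioned.
results = {"YourBrand": 25, "CompetitorA": 40, "CompetitorB": 12}
total_queries = 100

for brand, hits in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{brand}: {share_of_voice(hits, total_queries):.0f}% SOV")
```

The point of keeping it this simple is that the hard part isn’t the math; it’s running the same query set consistently every month so the numbers are comparable.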
This is a number you can improve. This is a metric that actually correlates with business results.
And here’s the thing that blew my mind when we first analyzed this: brands mentioned in AI Overviews generate 35% more organic clicks and 91% more paid clicks than brands relying solely on traditional rankings. One mention in an AI-synthesized answer is worth more than ranking position three for that same query. We’re not even close to most people understanding this yet.
How to Optimize Content to Get Cited in AI Answers
There’s no magic formula. But there’s a formula.
AI systems are fundamentally lazy readers, just like humans during their lunch break. They scan. They look for immediate value. If they don’t find answers quickly, they move on. Writing for AI is actually writing for how people really consume information when they’re in a hurry.
Rule Number One: Direct Answer at the Start
This is non-negotiable. I’ve tested this repeatedly. The difference is stark.
Bad approach: “Email marketing is a complex discipline requiring deep audience understanding, sophisticated segmentation, personalized messaging strategies, and continuous optimization cycles to drive meaningful results.”
Good approach: “Email marketing generates an average return of $36 for every $1 spent. You calculate ROI like this: (Revenue from email campaigns) / (Total email spend) × 100.”
The first is philosophical. The second is information. AI systems are after information. They want answers, not preamble. They’re going to pull a sentence or two to include in their response. You want it to be useful immediately.
I’ve watched AI systems skip past beautifully written introductions to grab a paragraph from further down the page because that paragraph actually answered the question. So now when we write, we front-load the answer. Counterintuitive for traditional web writing? Absolutely. More effective? Absolutely.
The CSQAF Framework for AI-Citable Content
Over the course of working with dozens of sites and analyzing what actually gets cited in AI answers, a pattern emerged. We started calling it CSQAF. Five elements that appear in almost every piece of content that gets pulled by AI systems.
- Citations — Link to actual sources. One or two per major section. Every single page that gets cited frequently in AI Overviews has outbound links. It signals credibility. It tells AI systems this person did their homework.
- Statistics — Real numbers with attribution. ChatGPT pulls from Wikipedia 48% of the time (it has verified data). Perplexity loves Reddit because it’s user-generated but often reality-checked by communities. Include both. Fresh data matters. So does foundational research that’s been proven over time.
- Quotations — Direct quotes from experts. Not paraphrasing. Actual quotation marks around actual statements. AI systems interpret this as “this person did interviews” or “this person actually researched this.” It’s credibility signaling.
- Authority — Make sure your author isn’t anonymous. Include a real bio. Credentials matter. Years of experience matter. AI systems check this. They want to know who wrote this and why they’re qualified.
- Fluency — Clear structure. Logical progression. Readable. Write like you’re explaining to a smart colleague who’s in a hurry. Conversational but knowledgeable.
All five together? Content that actually gets cited. Missing one or two? The likelihood drops significantly.
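If you want a quick way to audit pages against this checklist, you can approximate each element with a rough heuristic. To be clear: these regex checks are my own illustrative assumptions for a content audit script, not anything AI systems actually run:

```python
import re

def csqaf_check(html_text: str, has_author_bio: bool) -> dict:
    """Rough, illustrative checks for the five CSQAF signals.
    These heuristics are assumptions for auditing purposes only."""
    return {
        # Citations: at least two outbound links in the section
        "citations": len(re.findall(r'href="https?://', html_text)) >= 2,
        # Statistics: a percentage or dollar figure appears somewhere
        "statistics": bool(re.search(r"\d+(\.\d+)?%|\$\d", html_text)),
        # Quotations: a blockquote or typographic quote marks
        "quotations": "<blockquote" in html_text or "\u201c" in html_text,
        # Authority: a named author bio exists on the page
        "authority": has_author_bio,
        # Fluency (proxy): the content uses subheadings for structure
        "fluency": bool(re.search(r"<h2|<h3", html_text)),
    }

sample = (
    '<h2>Email ROI</h2><p>Email returns $36 per $1 spent '
    '(<a href="https://example.com/study">source</a>, '
    '<a href="https://example.com/data">data</a>).</p>'
)
print(csqaf_check(sample, has_author_bio=True))
```

Running a script like this over a content inventory is a fast way to find pages that rank well but are missing two or three of the five signals; those are usually the cheapest restructuring wins.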
Structure Content Around Questions, Not Keywords
This is basic but worth stating: organize around the questions people actually ask.
For information queries, you want definitions, step-by-step guides with actual action headers (not creative ones), and Q&A formats. For comparison queries, you want tables, pros/cons lists, explicit “versus” sections. For local queries, you want location-specific details, Google Business Profile optimization, regional information.
Structure for answers, not for keywords. The keywords follow the structure naturally.
Monitoring AI Overviews: What We’ve Learned in 2025
Working with Search Console all year has been enlightening and frustrating in equal measure. Here’s what I’m frustrated about: AI Overview traffic is being counted, but you can’t actually see where it’s coming from.
What you CAN see in Search Console:
- Impressions are higher overall (because you’re getting two: one for AI Overview display, one for regular search results)
- Clicks on queries where AI Overviews appear
- Position data (says “1” for AI Overview queries, then your actual positions for regular results below)
What you CANNOT see:
- Which clicks specifically came from AI Overview users
- How frequently you’re mentioned without getting linked
- Your Share of Voice compared to competitors
- Whether you’re improving or declining in AI mention frequency
This is maddening. Google counts the traffic but gives you no visibility into it. So we had to build our own tracking system.
The System We Actually Use
- Use an SEO tool to identify AI Overview queries. Ahrefs, Semrush, and similar tools are starting to flag queries where AI Overviews appear. This is your search universe.
- Test manually, monthly. Pick your top 30-50 keywords. Open ChatGPT, Perplexity, Google AI Overviews, Gemini. Search each one. Track whether you’re mentioned. Where? In what context? Getting linked or just cited?
- Cross-reference with Search Console. For those specific queries, are impressions increasing? Are clicks increasing? Are positions improving?
- Put it in a dashboard. Google Sheets or Looker Studio (formerly Data Studio)—something you check monthly. Trends matter more than individual data points.
This gives you actual visibility. Search Console gives you half the picture. This system gives you the whole one.
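The monthly manual tests reduce to a very simple log: one row per query per platform, with a yes/no for whether you were mentioned. Here’s a minimal sketch of aggregating that log into a monthly SOV trend; the CSV rows are invented sample data, not real results:

```python
import csv
import io
from collections import defaultdict

# Hypothetical monthly log: one row per (month, query, platform) manual test.
# In practice this would live in a Google Sheet exported as CSV.
log_csv = """month,query,platform,mentioned
2025-07,best analytics tools,chatgpt,1
2025-07,best analytics tools,perplexity,0
2025-08,best analytics tools,chatgpt,1
2025-08,best analytics tools,perplexity,1
"""

mentions = defaultdict(int)  # month -> number of answers that mentioned us
tests = defaultdict(int)     # month -> number of answers we checked

for row in csv.DictReader(io.StringIO(log_csv)):
    mentions[row["month"]] += int(row["mentioned"])
    tests[row["month"]] += 1

for month in sorted(tests):
    sov = mentions[month] / tests[month] * 100
    print(f"{month}: {sov:.0f}% of tested answers mention us")
```

Because the query set stays fixed month to month, the trend line is meaningful even though each individual test is a manual spot check.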
Real Questions from Real Clients
I get asked the same questions repeatedly. So here are the actual answers we’re giving in 2025.
“I rank in the top 5 for my main keywords, but AI systems never mention me. What’s going wrong?”
Usually, it’s a content structure problem. The AI system can see your page, but it can’t extract an answer from it easily. Check these:
- Is there a direct answer in the first 70 words? Not buried in the third paragraph?
- Is content organized with clear headers, subheaders, logic flow?
- Are you citing sources? Are you credible?
- Does your author bio actually exist and include real credentials?
We’ve fixed this for clients just by restructuring existing content. Takes 2-4 weeks to see mention improvements. Don’t rewrite. Restructure.
“Should I block AI crawlers in my robots.txt?”
No. Don’t. I’ll be direct: blocking Googlebot kills your organic rankings. Blocking ChatGPT and other AI crawlers makes you invisible to billions of users. Even if you don’t see direct traffic from AI mentions, you gain something else: brand recognition from people actively searching for solutions. They remember your name. They search for you in Google later.
We’ve never recommended blocking AI crawlers. Even once.
“How long until AI optimization actually shows results?”
For individual queries: 2-8 weeks. When you update a page with better structure and direct answers, AI systems often pick up changes within days (if using real-time indexing) or weeks (scheduled updates).
For consistent Share of Voice: 3-6 months. You’re building:
- A history of mentions across platforms
- Backlink profile authority
- Brand mentions on quality sites
- Demonstrated expertise
First meaningful improvements: 8-12 weeks. Real progress: one quarter minimum.
“What content formats get cited most?”
We’ve tracked this obsessively. Formats that actually work:
Strong performers:
- Comparison tables (especially for “versus” queries)
- Numbered step-by-step guides
- Glossaries with concise definitions
- FAQ sections with direct answers
- Research reports with original data
Weak performers:
- Rambling narrative without structure
- Opinions without data backing
- Sales-focused content
- Generic lists without unique perspective
The pattern is obvious once you see it. AI systems need structure. They need data. They need facts. They pull from writers who give them those things.
What’s Coming in 2026
Google’s been pretty clear about their roadmap. AI Mode is the future. Not one option among many. The future.
Expect:
- AI becomes default. Right now it’s an option users toggle on. Next year it’s the main interface.
- More volatility. Real-time data means SERP shifts become more frequent. The domains getting mentioned will change more often.
- More platforms. ChatGPT, Perplexity, Claude, and others will consolidate into everyday tools. Google’s AI Mode becomes one option among several.
- New ad formats inside answers. Google’s already testing sponsored content inside AI-generated responses. That’s coming.
This isn’t a temporary shift. This is the new landscape.
What This Means for Your SEO
Your ranking position alone no longer wins. Your mention frequency does.
Stop optimizing for clicks. Start optimizing for credibility inside AI answers.
That’s where the traffic is now. That’s where the authority lives.
Companies that make this switch in the next 6 months will own their market by 2026.
Everyone else will be explaining why their rankings dropped while their smarter competitors were raking in 35% more traffic.
It’s not complicated. It’s just different.