Posted on: 2 February 2026
For twenty years you chased the whims of an algorithm. You counted backlinks like beads on a digital rosary, stuffed keywords into meta tags with pharmaceutical precision, sacrificed readability on the altar of keyword density. And it worked, because you were optimising for stupid machines that read superficial signals.
The problem is those stupid machines are disappearing. And with them, everything you knew about SEO.
The numbers tell a story that many prefer not to see. HubSpot, one of the most cited brands in digital marketing, lost 80% of its organic traffic after Google made AI Overviews more prevalent in 2025. Not 80% on a marginal keyword: 80% overall. Its informational content, those top-of-funnel pieces built through years of painstaking traditional SEO work, is now read, synthesised and served directly by AI without anyone ever needing to click through to the site. According to a 10Fold Communications study of 400 senior marketing executives across North America and Europe, only 11% of B2B companies have their content ready to be discovered by AI. The remaining 89% are still playing a game that no longer exists.
This is not a prediction. It has already happened.
RAG systems, the ones powering ChatGPT, Perplexity, Claude and Google's own AI Overviews, do not work like the old spiders. They are not looking for the "best page" according to proxy metrics like backlinks: they are looking for the "best answer" to a specific question. They break the user's query into multiple searches, retrieve relevant chunks of content from different sources, then synthesise a response. Your site could have stellar domain authority, thousands of editorial backlinks, an impeccable technical profile: if the content is not structured to be extracted and cited by an AI, you are invisible.
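To make that retrieve-then-synthesise loop concrete, here is a deliberately toy sketch of the pipeline in Python. Everything in it is illustrative: real systems decompose queries with an LLM and rank chunks by embedding similarity, whereas this version uses naive clause splitting and raw term overlap, just to show the shape of the process.

```python
import re
from collections import Counter

def chunk(text, size=60):
    """Split a document into fixed-size word chunks. Real pipelines
    chunk on semantic boundaries, typically 150-200 words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def terms(text):
    return Counter(re.findall(r"[a-z]+", text.lower()))

def score(sub_query, passage):
    """Crude relevance score via term overlap. Production systems use
    cosine similarity between dense embeddings instead."""
    q, p = terms(sub_query), terms(passage)
    return sum(min(q[t], p[t]) for t in q)

def retrieve(query, corpus, k=3):
    # 1. Decompose the question into narrower sub-queries
    #    (naive clause split here; real systems ask an LLM to do it).
    sub_queries = [s.strip() for s in re.split(r"[,?]| and ", query) if s.strip()]
    # 2. Pool chunks from every source and rank them against the sub-queries.
    chunks = [c for doc in corpus for c in chunk(doc)]
    ranked = sorted(chunks,
                    key=lambda c: max(score(sq, c) for sq in sub_queries),
                    reverse=True)
    # 3. Only the top-k chunks are handed to the model that writes the
    #    answer. Content that never surfaces here is never cited.
    return ranked[:k]

docs = ["Generative engines synthesise answers from retrieved chunks and cite their sources.",
        "Backlinks once ruled rankings; citation-worthiness now matters more."]
print(retrieve("how do generative engines rank content, and why?", docs, k=1))
```

The point of the sketch is step 3: only the top-scoring chunks ever reach the model that writes the answer, which is why domain authority alone buys you nothing at that stage.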
BrightEdge data shows that AI agents, the ones with names like GPTBot, ClaudeBot and PerplexityBot, already account for 33% of organic search activity. These are not spiders indexing for later: they retrieve information in real time to assist users at the very moment of the request. They do not render JavaScript, they demand fast server responses, and they need clean, structured text. If your content is not visible to these crawlers, you are becoming invisible to the next generation of consumers.
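A first, trivial self-check is whether these agents can reach your pages at all. The sketch below uses Python's standard-library robots.txt parser; the user-agent tokens match the crawler names cited above as the vendors document them, but verify them against each vendor's current documentation, and example.com is of course a placeholder for your own domain.

```python
from urllib.robotparser import RobotFileParser

# Crawler tokens as the vendors document them at the time of writing;
# these can change, so check the vendors' own pages.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def ai_crawler_access(site, page="/"):
    """Report whether each AI crawler may fetch a page, according to
    the site's robots.txt (fetched live over the network)."""
    rp = RobotFileParser()
    rp.set_url(f"{site.rstrip('/')}/robots.txt")
    rp.read()
    return {bot: rp.can_fetch(bot, f"{site.rstrip('/')}{page}")
            for bot in AI_CRAWLERS}

# example.com stands in for your own domain.
print(ai_crawler_access("https://example.com"))
```

Passing this check is only the entry ticket: because these agents do not render JavaScript, content that exists only after client-side rendering remains invisible to them even when the crawl is allowed.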
There is a term circulating: "The Great Decoupling". It describes the growing gap between impressions and traffic. Your pages appear, they are read by AI, your brand is cited in generated responses, but nobody clicks. Zero-click searches have exceeded 70%, with mobile already over 75%. According to Ahrefs, the presence of an AI Overview reduces the click-through rate by up to 34.5%. For news sites, the figures are even more brutal: 56% to 69% of news searches generate no clicks whatsoever.
Yet look at what most marketers are doing: continuing to optimise for keyword density, continuing to build backlinks, continuing to measure success in SERP positions. It is as if, in 1998, someone had continued buying advertising space in the Yellow Pages whilst everyone was migrating to Yahoo and then Google.
The pattern is familiar to anyone who has lived through other technological transitions. I saw it when the creative sector had to reckon with the arrival of the Internet: those who had built competencies and business models on the old paradigm tended to deny or minimise the change, clinging to metrics that were losing meaning, until it was too late to adapt. The psychological mechanism is always the same: cognitive and economic investment in the existing system creates structural resistance to evidence that the system is becoming obsolete.
What strikes me is not that the paradigm is changing. What strikes me is the disconnect between declared awareness and actual behaviour. In the same 10Fold study, 35% of marketers cite GEO, Generative Engine Optimisation, as their primary success metric, surpassing brand awareness (34%) and traditional SEO (29%). They say they know the game has changed. But 89% have yet to do anything to adapt their content.
This is the characteristic signature of a transitional moment: everyone talks about the change, few incorporate it into daily operations. The gap between narrative and action is the real signal to observe.
There is an ironic aspect to all this. For years traditional SEO had created perverse incentives towards content optimised for stupid machines: keyword stuffing that made text unreadable, meta tags crammed with keywords, backlinks bought or exchanged with no editorial value. It had become a game where whoever deceived the algorithm better won, not whoever produced better content. The AI indexing era is reversing this dynamic. RAG systems reward exactly what we should have been doing all along: writing excellent content for intelligent readers, with expository clarity, analytical depth, honest attribution of sources.
Information density per token matters, because AI systems have limited context windows and prefer content with high value concentration. Inline citations to authoritative sources matter, because they increase the AI citation rate by 2.3 times. Freshness matters, because 65% of AI bot traffic targets content published within the last year. Backlinks, on the other hand, show weak or neutral correlation with visibility in LLM systems. Only 4.5% of URLs cited by AI correspond to the top organic result on Google.
For those who have always produced content of real substance, this is excellent news. For those who built their strategy on technical optimisation for stupid machines, it is a disaster.
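"Information density per token" sounds abstract, but crude proxies exist. The sketch below scores text by the share of distinct content-bearing words per token; this heuristic is my own illustrative stand-in, not a metric any retrieval system is documented to use.

```python
import re

# A deliberately tiny stopword list; real analyses use fuller ones.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "or", "is",
             "are", "was", "were", "that", "this", "it", "for", "on"}

def density(text):
    """Distinct content words per token: a rough proxy for how much
    new information each token carries."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    content = {t for t in tokens if t not in STOPWORDS}
    return len(content) / len(tokens)

padded = "In this day and age, it is important to note that content is key."
dense = "RAG pipelines retrieve ranked chunks, synthesise answers, cite sources."
print(f"{density(padded):.2f} vs {density(dense):.2f}")  # the padded text scores lower
```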
And here comes the paradox that should unsettle anyone who thinks they have found the shortcut: using AI to scale content production is precisely the wrong strategy in the AI indexing era. Not because RAG systems technically "recognise" generated text, but because they structurally reward what generic AI cannot produce: original research with proprietary data, direct empirical experience, insights emerging from unique perspectives not present in training data, pattern recognition requiring decades of accumulated expertise. If everyone uses AI to produce content, you get a levelling towards mediocrity: everyone recycles the same training data, everyone produces variations of the same "average" content. RAG systems seek sources that add value to the existing corpus, not those that regurgitate it in different forms. The race to scale with AI produces exactly the type of content that AI indexing ignores.
Another figure deserves attention. According to BrightEdge research, 34% of AI citations come from PR coverage, with another 10% from social channels. This means that external reputation work, the kind that builds brand mentions in authoritative publications, directly influences AI visibility. SEO is becoming inseparable from brand marketing and integrated communications. It is no longer a technical silo that can operate independently.
The question you should be asking is not "how do I optimise for AI?" but something more fundamental: "does my content deserve to be cited as an authoritative source?" If the answer is no, no technical optimisation will save you. If the answer is yes, adaptation becomes a matter of structure and format, not substance.
What I am observing in the market is a stark bifurcation. On one side, marketers who continue to invest in traditional tactics, hoping the change is temporary or exaggerated. On the other, a minority that is completely redesigning their approach: treating AI citation as an autonomous KPI, structuring content in extractable conceptual units of 150 to 200 words, front-loading insights in the first 50 words of each section, building authority through original research and cross-platform mentions.
The second category is building a competitive advantage that will be difficult to recover in two years' time. The first is optimising for a game that will no longer exist.
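The structural tactics that minority is adopting lend themselves to simple automated checks. Here is a minimal sketch of such an audit: the 150-to-200-word unit and the 50-word front-load come straight from the practices described above, while the key-claim check is a hypothetical stand-in for whatever signal you treat as the front-loaded insight.

```python
def audit_sections(sections, unit_range=(150, 200), frontload=50):
    """Flag sections that fall outside the extractable-unit word range
    or bury their key claim past the front-load window.

    `sections` is a list of (heading, body, key_claim) tuples, where
    key_claim is the sentence you consider the section's main insight.
    """
    report = {}
    for heading, body, key_claim in sections:
        words = body.split()
        opening = " ".join(words[:frontload])
        issues = []
        low, high = unit_range
        if not low <= len(words) <= high:
            issues.append(f"{len(words)} words (target {low}-{high})")
        if key_claim not in opening:
            issues.append(f"key claim missing from first {frontload} words")
        report[heading] = issues or ["ok"]
    return report
```

Run against a draft, something like this turns "extractable conceptual units" from a vague aspiration into an editorial checklist you can actually enforce.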
I am not saying Google will die or that traditional SEO will completely disappear tomorrow. Google still processes 5 trillion searches per year with 20% annual growth. But user behaviour is changing structurally. ChatGPT is already the fifth most visited site in the world with nearly 5 billion monthly visits. Perplexity grew fivefold in one year. We are in the "Early Majority" phase of the technology adoption cycle: AI has crossed the chasm and is becoming mainstream.
The most fitting historical parallel is not the arrival of Google, but the transition from print to digital. Newspapers continued for years to measure success in copies sold whilst readers migrated online. By the time they realised the model was broken, they had lost a decade of competitive advantage. Marketers who today measure success in SERP positions whilst readers migrate to AI responses are making the same mistake.
The good news is that the window for adaptation is still open. Eighty-nine percent of your competitors have not yet done anything. But this window is closing, because those building AI visibility today are compounding their advantage: more citations generate more authority, which generates more citations.
The real question is not whether you are doing SEO well. It is whether you are still playing the right game.