Why breaking news still wins in the age of AI
There’s a lot of fear these days in the media world over the “zero-click” future. AI chatbots and search engines ingest content, interpret it, and then summarize it for users, with the inevitable consequence being that people no longer visit your site. This is not theoretical. Data from Chartbeat, an analytics company that serves media sites, shows global publisher traffic from Google dropped by one-third last year, with smaller publications hit hardest.
So yes, AI substitutes for publisher content, but it doesn’t do so evenly. A recent analysis from Define Media Group looked at how the presence of Google AI Overviews affected traffic to different types of content over the past year. Organic search traffic overall is indeed down 42%, but it turns out that clicks to breaking news stories are actually up by quite a large margin—103%.
The main reason: Google doesn’t show AI Overviews for breaking news queries. That makes a lot of sense, since a breaking-news situation usually involves a lot of rapidly changing and inconsistent information as reporters across several publications sort out what really happened from all the rumors, exaggerations, and outright misinformation that surround a news event. When you ask about a breaking news event on Google, you usually get a Top Stories carousel instead of an Overview, a feature that’s existed for a long time.
Why breaking news still wins
When you peel back the numbers, you see that the reason for the big jump in news traffic is Google Discover—the built-in news feed that exists on most Android phones. Google also made some changes to Discover recently that apparently boosted news from publishers even more. But even putting Discover aside and just looking at the web, traffic to news is essentially flat. Against a backdrop of a 42% decline in organic search overall, that makes breaking news the content type most resilient to AI substitution.
None of this is to say that a publisher can or should focus on news alone. The category has other challenges, and competing on breaking news is expensive, requiring continual monitoring and staffing. Also, the news isn’t meaningful without context and analysis. News publishers need to explain to their readers why the news is important to them, even if that explanation is at risk of being summarized by AI.
Moreover, when AI summarizes content, that doesn’t necessarily mean you’ve “lost”—it shifts the competition to another arena: citation. As I’ve pointed out before, presence in AI summaries is a battle worth fighting, even though the rewards differ considerably from the click-based advertising business model that most media companies still depend on. Sites that are consistently cited in summaries will ultimately be the ones that define consensus, and data suggests that the share of the audience that does click through to sources, while smaller, is more intentional, meaning there’s more opportunity to turn them into loyal readers.
However, the competition for citations is fierce. Publishers aren’t just competing with each other. In fact, they’re not even the most favored sources. A report from Semrush, a search-analytics firm, ranked the top sources most often cited in AI answers: Reddit, LinkedIn, and Wikipedia. The top publisher on the list is Forbes at No. 11, and I suspect it has a lot to do with its extensive contributor program.
This ranking isn’t the whole story—other studies, like this one from Muck Rack, do show that AI search engines favor journalistic content over brand-created or social content. At the same time, AI is clearly building answers from a broad set of sites it deems authoritative, not just news publishers.
The new playbook takes shape
So from these ingredients—news is a moat, AI likes journalism but looks wider, and explanatory content is the battleground—we can start to cook up a strategy. The Semrush study found that AI absolutely loves citing LinkedIn content. The reason, according to the authors: LinkedIn content, especially the longer articles that are native to the platform, is almost always clearly framed, structurally obvious, and written to a fairly meaty length (between 500 and 2,000 words).
If AI likes structured explanatory writing on LinkedIn, it probably likes those same traits in explanatory journalism elsewhere. The Muck Rack data says journalism makes up about a quarter of AI citations, but it also says more than half of citations are from the last 12 months and that the highest citation rate is in the first seven days.
Define, meanwhile, says breaking news is up while evergreen is down, taking a 35% hit. Put together, that points toward a specific kind of publisher explainer: not static archive content, but fresh, tightly structured explanatory journalism that accompanies the news.
Now we’re starting to see what an AI-resilient news operation looks like. Publishers should still invest in breaking news because it remains defensible and difficult for AI to compress into a zero-click summary. But they should pair that with explainers that are updated quickly, tied closely to live topics, and written in a format that is easy for both humans and machines to parse. The vulnerable category is the old generic evergreen article that is neither essential live reporting nor especially useful as source material for AI answers.
So AI isn’t erasing journalism into irrelevance. It’s making journalism’s value more specific. Breaking news still commands attention because platforms are cautious about compressing fast-moving events into a single summary. And when the news settles, the publishers that still matter are the ones that can turn their reporting into clear, timely explanations and analysis. That is where the next fight for authority will be won.