What the ‘AI inflection point’ means for journalism
It’s hard to tell AI news from AI hype at the best of times, but the most recent surge around agents, triggered by many developers embracing Claude Code a couple of months ago, feels like something different. With the viral freakout over Moltbook, the agent social network, and the Super Bowl ad slap fight between OpenAI and Anthropic, AI has escalated to a new level of mainstream attention.
Everyone’s forgotten about the AI bubble and is instead dancing around the AI “inflection point,” when AI in general and agents in particular begin to take over huge swaths of knowledge work, with massive consequences for the economy and the workforce. The recent sell-off of SaaS stocks is an indication of how seriously the industry takes this.
For journalists, all this mainstream AI noise, coupled with the steady drumbeat of layoffs in the media industry, quickly turns into a familiar feeling: pressure to do more. As newsrooms shrink and AI tools get framed as productivity machines, it’s easy to assume the right response is higher output. But AI isn’t just changing how stories get made. It’s changing how stories get found. So the temptation to use AI to do “more with less,” which in many cases means telling the same kinds of stories just more quickly and more often, is misguided.
This is because of a contradiction in how AI systems surface information: They look for sameness to reinforce the patterns they’re seeing, but they don’t reward it. That’s the difference between being cited in an AI summary and being relegated to the background. AI only needs one competent version of the commodity story, and it goes looking for the one that looks authoritative and adds something new.
More isn’t more
In practice, yes, you could use AI to accelerate news production, letting you cover more stories than you could before, and a few newsrooms are doing that. And on an individual level, that might even signal your value to your employer in the short term. But if it’s effectively the same story reported elsewhere, an AI engine has no reason to prioritize yours over another.
Instead, the more logical path is to invest in the parts of journalism that only humans can do: finding novel information through sourcing, research, interviews, and analysis. In other words, while the instinct to do more isn’t wrong, it should be aimed at going deeper, not wider.
AI can still be an accelerant here, speeding up ideation, research, and even things like reaching out to sources. A digital media researcher, Nick Hagar, recently showed what this looks like in practice, using coding agents to recreate a deep analysis from a human-authored journalistic investigation into Virginia police decertifications. The interesting thing about his case study is that with very specific tools (such as Claude Code “skills,” which essentially turn certain research tasks into templates), he could quickly replicate the work, but his human judgment was still required throughout. “Even with skills enforcing a structured workflow, I made dozens of judgment calls…. Skills make the workflow more systematic; they don’t eliminate the need for human attention,” he wrote.
That points to the better way journalists should think about AI: The goal isn’t to create more stories, but to create stories that are so valuable and definitive that AI search engines can’t ignore them.
Authority over output
To succeed in this new environment, the No. 1 habit that journalists will need to break is the natural instinct to cover more. Very few reporters think they’ve got a full grip on all the stories on their beat, and as newsrooms shrink, they have less help than ever. Breaking that habit doesn’t mean ignoring all breaking news, but it does mean a mental shift from reaction to discernment. In many cases, that might mean narrowing a beat to a micro-beat (say, from “energy” to “nuclear power”).
A lot of what I’m describing is already happening as many reporters, whether victims of layoffs or simply entrepreneurially minded, flock to platforms like Substack and Beehiiv to hang out a shingle. It’s not just the least-bad option: the media ecosystem’s incentives are pushing in this direction, rewarding people who build authority with content that goes deep in a specific subject area and brings original insights and information to the table.
Certainly, you don’t have to strike out on your own to take this approach, though it does require discipline to put aside story FOMO and focus on where you can bring something original to the table. And the rewards go beyond simply having a better chance at surfacing in AI answers: you’ll have a stronger connection to your audience because they’ll be coming to you for information they can’t get anywhere else. The value of shaping narratives instead of chasing them is much greater than any short-term traffic spike.
That’s a hopeful idea, and paired with the changing incentives of the media ecosystem, it points to a key insight. AI’s ability to summarize and transform content has caused many to wonder what the “atomic unit” of journalism is. Some think it’s the unique facts, quotes, or insights that are woven into stories, but I think all this implies it’s something more abstract: editorial judgment. As AI systems absorb more of the mechanical labor of journalism, they’re inadvertently clarifying the thing they can’t absorb: human judgment about what matters and why. If this is an inflection point, it isn’t in the tools. It’s in the work we choose to do.