In partnership with

Hi, I’m media innovation journalist Ulrike Langer and you’re reading News Machines. If this newsletter was forwarded to you, becoming a subscriber means you will never miss an issue. If you want to support my work, please spread the word, become a paid subscriber, or click on the ad.

This week’s newsletter is a special edition. It’s not a case study but an essay, connecting the dots between case studies, industry discussion and my own analysis. It’s longer than my usual posts, but I believe it’s worth your time.

As a bonus for paying subscribers, there is a link to a diagnostic at the end of this essay: twelve questions for newsrooms positioning for the AI transition, designed to be used in real conversations between editors and publishers, between leadership and boards. Subscribers are welcome to share it inside their own newsrooms.

Articles in the newspaper and on the website are no longer the default output of local news. A robust portfolio strategy should consist of elements like personalized newsletters, intelligence products for local government officials, and a civic accountability layer.

Act 1

The Federal Register publishes between fifty and two hundred new filings on a typical working day. Buried in those filings are the regulatory moves that determine which industries get reshaped, which companies face new exposure, and which agencies are quietly redirecting how American government works. For a wire reporter trying to cover federal policy, the Register is both the richest source available and an impossible one - too much volume, too much routine, too much that looks like a notice and is actually a story.

Andy Sullivan has been a Reuters wire editor for twenty-five years. About a year ago, he built a small system that reads every new Federal Register filing as it posts, applies the editorial logic he has spent a quarter-century developing, and surfaces the ones a human reporter should read. He didn't write it because he learned to code. He wrote it because the tools to write it now exist for someone like him who doesn’t code, and because only someone like him could write it well. The system is useful because the judgment encoded in it is useful. He is the judgment.

Sullivan now runs 14 of these systems. None of them are articles. None of them produce articles directly. What they produce is a continuous stream of editorial signals - this matters, that doesn't, here is something a reporter should look at within the hour - that make the rest of the newsroom's reporting faster, more focused, and more comprehensive than it could otherwise be.

This is not a story about a clever editor learning to use AI. It is a story about what journalism work actually looks like now, and about what part of the work has quietly become the bottleneck.

The bottleneck was never the writing. And it isn't the building either, though for the past 18 months a substantial part of the industry conversation has sounded as if it might be. Vibe coding - the practice of building working tools through conversational AI rather than through code - has dominated conference programs, newsletter sends, and LinkedIn feeds. The case for it isn't wrong. Working journalists really are building real systems on the side, in days or weeks, without engineering teams. That part is true and it matters.

What's missing from most of that conversation is the question of what makes the systems work once they're built. The answer, in every case I have looked at, is the same: editorial judgment. The judgment that decides what to surface, what to ignore, what to structure, what to refuse, what to kill. That judgment is what made Sullivan's first tool work and what makes his 14th tool work - and it is what nearly every newsroom in the United States and Europe has spent the last 15 years quietly cutting, at the layers that used to train it.

That is the problem this essay is about.

Sponsored

Write docs 4x faster. Without hating every second.

Nobody became a developer to write documentation. But the docs still need to get written — PRDs, README updates, architecture decisions, onboarding guides.

Wispr Flow lets you talk through it instead. Speak naturally about what the code does, how it works, and why you built it that way. Flow formats everything into clean, professional text you can paste into Notion, Confluence, or GitHub.

Used by engineering teams at OpenAI, Vercel, and Clay. 89% of messages sent with zero edits. Works system-wide on Mac, Windows, and iPhone.

Act 2

2.1 The article is no longer the unit of work

The inverted pyramid, the two-minute radio hit, the 800-word column - these were containers shaped by analog distribution constraints. The internet broke the volume constraints decades ago: A newsroom can now publish as much as it wants, in whatever form it wants, in real time. In fact, for a long time, the internet drove newsrooms to prioritize quantity over quality, eyeballs over meaningful audience connections. What it didn't break was the container itself. The article was still an article. The video was still a video. Generative AI dissolves the container. What a newsroom produces is no longer a stack of articles or a queue of videos. It's a set of systems - pipelines, workflows, agent fleets, structured archives - each running continuously, each producing output that gets reassembled per reader, per agent, per query.

This isn't a forecast anymore. The most widely used term for the dissolution of containers is liquid journalism. Burt Herman, co-founder of Hacks/Hackers, describes it as a shift from content management to context management - the format handled by the system, the judgment kept human. Francesco Marconi, in a recent analysis, lays out journalism as a portfolio of six content types, each with its own time horizon and economic moat. Each angle catches part of the same shift. But what does this shift actually mean for the survival of local and general-interest news?

2.2 What the role shift looks like in working journalism

The role that's emerging isn't reporter-as-narrator. It's reporter-as-system-designer. Both will continue to exist - the long-form profile, the war dispatch, the named-writer column - but the relative balance is shifting, and the shift is what determines who newsrooms will hire and pay over the next several years.

The case studies are accumulating fast enough that they no longer feel like outliers.

Andy Sullivan's tools work because of what he knows about how the Federal Register actually behaves. A routine-looking notice can be the first signal of a major regulatory shift; a high-volume filing day can mean nothing; the same agency posts important rules and trivial corrections in identical formats. Sullivan has spent 25 years learning when each pattern means what. His Federal Register Bot encodes some of that knowledge - the rules about which agencies, which filing types, which language patterns deserve attention - but only some of it. The tool is useful because Sullivan keeps tuning it. He revisits the rules every few weeks - sometimes adding new patterns the system should learn to spot, sometimes retiring patterns that have stopped mattering, sometimes catching that the system has missed something it should have flagged. The maintenance is also continuous redesign. A developer with the same engineering capacity could not have built the bot, and could not now keep it running, because the editorial logic is what makes it worth running. The bot is leverage on Sullivan's judgment, not a substitute for it.
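Sullivan has not published his code, but the shape of such a system is easy to sketch. In the hypothetical Python sketch below, the agency weights, filing types, and trigger phrases are invented for illustration - the real bot's rules are Sullivan's own. The point it demonstrates is structural: the editorial logic lives in a few plain tables that an editor, not a developer, writes and retunes.

```python
# Hypothetical sketch of an editorial-rules filter for Federal Register
# filings. The agencies, filing types, and trigger phrases below are
# invented for illustration; the value of such a system comes from an
# editor tuning these tables, not from the code around them.

WATCHED_AGENCIES = {"EPA": 2, "FTC": 2, "SEC": 3}      # weight per agency
SIGNIFICANT_TYPES = {"Proposed Rule": 2, "Rule": 3}    # a routine "Notice" scores 0
TRIGGER_PHRASES = ["civil penalty", "emergency", "withdrawal of approval"]

def score_filing(filing: dict) -> int:
    """Return an editorial-priority score; higher means 'a reporter should look'."""
    score = WATCHED_AGENCIES.get(filing.get("agency", ""), 0)
    score += SIGNIFICANT_TYPES.get(filing.get("type", ""), 0)
    abstract = filing.get("abstract", "").lower()
    score += sum(2 for phrase in TRIGGER_PHRASES if phrase in abstract)
    return score

def surface(filings: list[dict], threshold: int = 4) -> list[dict]:
    """Keep only filings worth a human read, highest priority first."""
    flagged = [f for f in filings if score_filing(f) >= threshold]
    return sorted(flagged, key=score_filing, reverse=True)
```

Retuning the bot, in this sketch, means editing the three tables at the top - adding a pattern, retiring one, adjusting a weight. That is exactly the kind of continuous maintenance an editor can own without an engineering team.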

Mattea Vecera, two years out of journalism school, has built a workflow at Staten Island Advance that compresses a full day of weekend events curation into about an hour. She has been explicit, in public, that the compression worked because she preserved editorial judgment - not despite it. Her senior collaborator, David Cohn, who runs AI strategy at Advance Local, posted in response: "Happy to be in the passenger seat for any of her AI ideas." That sentence is the role inversion in plain language. The senior figure isn't gatekeeping. He's following the editorial judgment of the person closest to the work.

Cohn himself runs roughly 500 small agents across Advance Local's newsrooms. The governance rule is short: AI does not touch the CMS. Agents handle research, monitoring, sorting, drafting at the edges - but the act of publishing remains human. The rule sounds simple. It is also a portfolio decision: which systems run, what each is for, what each is not allowed to do.

Joe Amditis built Reroute NJ, a working hyperlocal product, over a weekend and a half. Within months it had 2,000 daily users. He is not running a magazine. He is running a system that produces a magazine-shaped output every day, and the system is what he intends to sell, license, or replicate.

In every case, the system works because someone has the judgment to design it and steer it. The system is not a substitute for that person. It is the leverage that person now has.

Act 3

3.1 Where the framework runs out

Journalism is splitting in two directions. At the high end, structured intelligence sold to professional buyers - municipal-bond analysts, pharma companies, energy traders, security firms - is already a viable, profitable category at prices no consumer audience will match. Bloomberg and Dow Jones have built it. Trade publications most journalists have never heard of are building it in clinical trials, energy regulation, cybersecurity. At the other end, the mass-attention business is collapsing. None of that is in dispute.

What's in dispute is what happens to everything else. The current consensus answer is that civic journalism, local news - anything that doesn't slot into the high end or the dying middle - must be funded as infrastructure. Tax revenue. Philanthropy. Public broadcasting. That journalism is treated as a public good in the economic sense - something the market won't price, that society pays for collectively or doesn't get.

That answer assumes a political consensus and a funding base that don't reliably exist. Not in the United States. Not in Germany, Austria, or Switzerland. Not at the scale local journalism would actually need. And even where it sort of exists, it's brittle - a single political cycle can gut it.

The wish that public-interest media wouldn't have to be a charity case is commendable. But wishing for something doesn’t make it so. The mechanism for profitable public-interest local news isn't where most of the conversation is looking for it.

3.2 The portfolio of systems

Start with the terrain. Every piece of journalism a newsroom publishes is now consumed by two audiences simultaneously: the human reader who has always been there, and the machine consumer that wasn't. The summary appearing in someone's chatbot. The agent fetching primary sources for a research firm. The model ingesting a council meeting writeup to answer a constituent's question three states away. This is the operating environment. It is not in serious dispute anymore.

What's still in dispute is what a newsroom is on this terrain.

The temptation is to treat this as an organizational identity question - to specialize, to commit, to be one thing. “Do what you do best” is a well-known mantra. It carries the risk that civic-minded journalism gets left out. However, a single local newsroom can produce intelligence, attention, and civic work at once, in parallel, with the same staff and the same masthead, by running multiple distinct systems.

Take Tagesspiegel, the Berlin-based daily that has been running this kind of portfolio for years. The Tagesspiegel Background briefings are vertical-specific intelligence products - in policy, digitalization, healthcare, energy, sustainability - sold by subscription at premium prices to government officials, lobbyists, corporate strategists, and other professional decision-makers. That is the intelligence layer of the portfolio. The same masthead publishes a daily consumer paper, runs newsletters, builds reader habit through named-writer columns and city reporting. That is the attention layer. The same masthead also funds investigations into Berlin city government, federal politics, and accountability work that nobody pays for piece by piece. That is the civic layer. One newsroom. Three distinct product streams. Three monetization models running in parallel. The civic work doesn't have to pay for itself article by article because the Background briefings and the consumer subscriptions hold the portfolio together.

This is the model. The civic work survives not because someone funds it from outside, but because it sits inside a portfolio that's already viable through its other systems. Public-interest journalism doesn't need a tax base. It needs to be one product line in a portfolio that's solvent overall.

This is what needs to be worked out. The pieces are on the table - the dual world is documented, the shift from articles to systems is starting to be named, the case studies of working journalists building these systems are accumulating - but the connection back to the survival economics of local news is the part that's still missing.

3.3 The editorial judgment that holds it together

A portfolio of systems is meaningless without someone deciding which systems to build, what each is for, who each serves, what each refuses to do, and when each gets killed.

That capability has a name and it's older than any of this. It's senior editorial judgment. Not editorial oversight in the sense of a human checking AI output for errors. Not “a layer of human review” or “human in the loop.” Senior editorial judgment - the kind that holds a newsroom's identity, makes hard portfolio decisions, and has the authority to enforce them. The kind that decides the council-vote feed will not include unverified sources. That the explainer video will not summarize without flagging dissent. That the procurement investigation gets the resources even though it doesn't pay its own way, because that's what the masthead is for.

This is the capability that will get paid now, next year, and after. It's also the capability that answers a worry that's started showing up in the conversation: that the new economics rewards only an elite few - the world's best cybersecurity reporter or pharma analyst - and leaves everyone else stranded. A local newsroom doesn't need the world's best cybersecurity reporter. It needs someone with the judgment to design and run a portfolio. That's a more reachable bar - and it's the bar local journalism can actually clear.

Act 4

4.1 The pipeline is what's been cut

Senior editorial judgment is real, reachable, and learnable. It is also scarce, and getting scarcer.

Two decades of layoffs have not hit newsrooms evenly. The masthead is mostly intact. Editors-in-chief still exist. Top-of-house roles have, if anything, proliferated - chief content officers, heads of standards, heads of AI, executive editors for verticals that didn't exist a decade ago. What has been cut is everything underneath: the staff reporters whose beat work taught them what was worth covering, the copy editors who learned what stories actually meant by editing them line by line, the mid-level desk editors whose portfolio sense developed over years of deciding what ran where, the regional bureau chiefs who understood how a story played differently across the country.

Those layers were the pipeline. They were where the next generation of senior editorial judgment got trained, slowly, by doing the work under people who had done it longer. When the layers thinned, the training thinned with them. The capability that's now required at every desk - because every desk now runs systems that need someone deciding what they should and shouldn't do - is concentrated in fewer people, less reproducible, less institutionally embedded.

The bottleneck is not that editors-in-chief are missing. It's that the pipeline producing future people with that judgment has been cut. And the cut happened at the same moment AI dramatically expanded the demand for that judgment - because every system a newsroom builds now needs someone with the judgment to design it, steer it, and know when to redirect it.

Some publishers saw this earlier than others. Thomas Kaspar, who leads digital innovation at Ippen Digital in Germany, has been describing the company's content infrastructure as 'liquid content' since before the term entered the AI conversation. Ippen built its publishing platform from the start around strict separation of content, structure, and design - and from 2017, around what Kaspar calls 'Intelligent Content,' meaning every asset (every text block, image, map, table) is tagged with metadata for recombination across brands and formats. More than 80 regional portals now run on the same platform, with content reusable across all of them, Kaspar told the German publishing magazine Kress Pro. The system was built for human readers and for machines to read at the same time.
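What "structured to the asset level" means can be pictured concretely. The sketch below is hypothetical - the field names are invented for illustration, not Ippen's actual schema, which is not public - but it shows the principle Kaspar describes: content, structure, and design kept separate, so every asset can be recombined across brands and read by machines.

```python
# Hypothetical sketch of an "intelligent content" asset record.
# Field names are invented for illustration; Ippen's actual schema is
# not public. The principle: metadata on every asset makes it
# reusable across brands, formats, and machine consumers.

asset = {
    "id": "map-2024-0117",
    "kind": "map",                       # text block, image, map, table, ...
    "topic": ["local-politics", "traffic"],
    "region": "muenchen",
    "brands": ["merkur.de", "tz.de"],    # portals allowed to reuse it
    "language": "de",
    "source": "city-of-munich-open-data",
    "updated": "2024-01-17",
}

def reusable_for(asset: dict, brand: str, topic: str) -> bool:
    """Can this asset be recombined into a given brand's coverage of a topic?"""
    return brand in asset["brands"] and topic in asset["topic"]
```

A query like this is trivial for one asset and transformative at platform scale: with every text block, image, map, and table tagged this way, more than 80 portals can draw on the same pool without any portal republishing whole articles.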

Kaspar's own line is the one that lands: It began as Big Data, and it now runs as AI, and what helps is having had everything structured to the asset level for over a decade. The publishers positioned for the AI moment are the ones who did the structural work before the moment arrived. Most newsrooms didn't. They are now scrambling, simultaneously, to build the infrastructure and to find the editorial judgment to run it - while the pipeline that produced that judgment has been thinning for two decades.

In other words - the industry has been cutting the resource it now urgently needs.

4.2 What’s at stake for the industry in the DACH countries

The American conversation about local journalism's survival assumes a philanthropic infrastructure that the German-speaking countries (DACH) do not have at the same scale. With Knight, MacArthur, Ford, the Democracy Fund and many others, the U.S. has built an ecosystem of foundations writing meaningful checks for civic journalism. DACH has foundations - some quite large - but most of them aim their grants at the public good in a broader sense, or at science, or at education. Journalism-as-such is rarely the priority. That's the structural disadvantage. Local journalism in DACH can't reach this kind of philanthropic capital because, for journalism, it largely isn't there.

It also assumes a collaborative relationship between commercial newspapers and public broadcasters that does not match the DACH reality. ARD, ZDF, ORF, and SRF are funded by mandatory license fees totaling billions of euros and francs every year. Commercial newspapers are funded by shrinking advertising revenue and the limited share of consumer media budgets households are willing to spend on subscriptions. When a household downsizes its media spending, the newspaper subscription is what gets cut. The license fee cannot be cut. News publishers have argued for many years that the public broadcasters are competing on uneven ground, expanding into digital territory the publishers depend on, with state-guaranteed revenue the publishers will never have.

What the AI moment has produced, against that backdrop, is something genuinely new: islands of cooperation. Most players in the German market do not have the capacity to build AI infrastructure alone. AI initiatives open to multiple publishers - the AI for Media Network organized by Bayerischer Rundfunk, the work at Medialab Bayern, APA Digital Innovation in Vienna - either did not exist before or have shifted their focus toward generative AI. There are also project-specific bilateral experiments: Bayerischer Rundfunk built an Oktoberfest chatbot last year combining BR's content base, Ippen Media's content base, and city-of-Munich service data, and the cross-publisher combination demonstrably produced better answers than any of the three could have produced alone. Two million views, 90,000 questions. The cooperation is real and worth noting.

It is also nowhere near the scale of the American philanthropy-and-research apparatus. And there is a distinction worth making clearly: Small bilateral experiments around specific products are one thing, and they fit comfortably inside the portfolio-of-systems argument - any newsroom can decide to combine its data with a partner's for a specific use case. Industry-wide infrastructure is another thing entirely. Uli Köppen, BR's Chief AI Officer, has talked in the Newsroom Robots podcast about a possible “data marketplace” or shared data warehouse for the German market - a constant cooperation with sustained governance, going beyond the API-level connections of the Oktoberfest experiment. She is also explicit that this larger infrastructure does not yet exist, and that it is unclear who would build it.

That fragility shows up most clearly when cooperation requires industry-wide buy-in. The so-called “Leistungsschutzrecht” - the law that obligates search engines to pay German news publishers for snippets in search results - has produced more internal strife than significant revenue. Most of it has gone to the largest publishers, leaving smaller legacy outlets and digital-native startups with little or nothing. And there is still no Spotify for news in the German-speaking market (or anywhere else). The cooperative spirit required to build one, on terms the participants would all accept, has not materialized.

This matters for any AI licensing platform that would require buy-in from the same set of players. The prisoner's dilemma is well documented in this market. The track record on collective deals is thin. A solution that depends on virtually every publisher signing the same agreement on the same terms is, on the available evidence, unlikely to work. The portfolio-of-systems argument has the advantage of not requiring that. It is something each newsroom can decide and execute on its own, with whatever cooperation is genuinely available, without being held hostage by the prisoner's dilemma the rest of the industry is stuck in.

The DACH version of the survival path is the same shape as the American version. The terrain is rougher. The cooperation that exists is real but limited, and the solutions that depend on industry-wide cooperation are, judged on past performance, unlikely to be the ones that work.

4.3 Close

Local journalism doesn't need any exotic new capabilities. It needs the editorial judgment that has always made newsrooms work - upgraded for an environment in which articles are no longer the unit of work, audiences include machines as well as humans, and the leverage on a single skilled editor is now substantial. That capability gets built deliberately, with budget and time, the same way every other capability gets built. The industry has spent two decades acting as if it could be cut without consequences. It can't.

Editorial judgment is the bottleneck. It is also the path through. Pricing it correctly, staffing for it deliberately, and rebuilding the pipeline that trains it - that is the work the next several years will be about. None of those decisions get easier by waiting.

A diagnostic accompanies this essay, for paying subscribers: twelve questions for newsrooms positioning for the AI transition, designed to be used in real conversations between editors and publishers, between leadership and boards. Subscribers are welcome to share it inside their own newsrooms. It is meant to make the conversation more honest. That is where the work starts.

Subscribe to the premium tier to read the rest.

Become a paying subscriber to get access to the full post and other premium content.

Upgrade now

A subscription gets you:

  • Access to full posts with more metrics and actionable checklists
  • Quarterly AI in media trend reports (starting in April 2026)
  • Immediate download access to my new 28-page report "The Human-Augmented Newsroom"
  • No ads (except for occasional special promotions)
