March 26, 2026

Of the 440 applications Texas Tribune chief product officer Darla Cameron received for an AI engineering role, 90% were junk.

Many appeared to be written using the very tools candidates were supposed to understand. “Here’s a short response that’ll work for this,” one application read — an obvious artifact of copying and pasting from ChatGPT.

The anecdote, shared during the Hacks/Hackers and Poynter AI x Journalism Day at SXSW, captures the strange ways AI is transforming newsrooms. The technology can be genuinely useful when used responsibly, but it’s already producing unintended and often absurd side effects.

I spent the day at the event listening to panels on trust, product strategy, public media, newsroom AI tools and the economics of an industry in transition. Here are five lessons I took away.

1. Start with a problem, not a mandate

The AI tools actually being used in newsrooms all started with a specific pain point.

At the Pew Research Center, audience editor Claire Dannenbaum noticed her team was spending 95% of its time writing formulaic social media posts and only 5% actually engaging with audiences. So her team developed a WordPress plugin — currently in beta testing — that automatically drafts the formulaic posts. Researchers review a few options, pick one and move on. Now her team has time to read comments, answer questions and make content based on what readers are actually asking.

Alex Mahadevan of Poynter and Upasna Gautam of the News Product Alliance at the Hacks/Hackers and Poynter AI x Journalism Day during SXSW in Austin, Texas. (Courtesy: Spencer Najera)

“Look at stuff you do every day in your job. What is your team complaining about?” said Upasna Gautam, News Product Alliance chairwoman, who shared insights from a session she moderated with entrepreneur and businessman Mark Cuban. “Where is the mundane bulls— work that you’re sick of doing? That is a great pathway to have AI support an integration or an optimization or a workflow.”

2. Draw a clear line between AI for thinking and AI for writing

Nebraska Public Media chief innovation officer Chad Davis stopped using AI for writing entirely. It just wasn’t good enough. But he called it a “good curiosity partner” for research, brainstorming and exploring ideas.

There is a clear line: Corporate writing, like grant applications and marketing copy, is fine. Editorial work is not. Nebraska Public Media’s labs team uses AI for concept art, music prototyping and vibe-coded game demos. The point is to show ideas rather than describe them in a pitch meeting.

Most newsrooms haven’t drawn that line clearly enough. Where does AI drafting end and editorial judgment begin? If you don’t decide, your staff will make their own rules, and they won’t be consistent.

3. Be clear about where the human sits in the loop

Everyone says it’s important to keep a “human in the loop” when using AI. The newsrooms getting results are being specific about where that human sits.

Northern California’s KQED tested using AI to identify important snippets from Forum, its hourlong radio show.

“I’m not ready to say the AI can choose the four most notable moments,” said editor-in-chief Ethan Toven-Lindsey. “But if you put a producer in the loop to make sure those are the right moments, that felt doable.”

At SWR, a public broadcaster in southwest Germany, community managers review AI-flagged comments before anything gets deleted or sent to the legal department.

4. Use AI to get closer to your audience, not further away

Pew’s web traffic has been in steady decline since 2022, and it’s not alone. Newsrooms, nonprofits and research organizations are all watching search referrals and direct traffic shrink. One in five Americans now gets their news from influencers and creators. For younger audiences, it’s more than one-third.

Dannenbaum studied what those creators do differently and found a feedback loop. They read their comments, answer questions and make new content based on what their audience is asking. The topics come from the audience. The stories evolve through discussion. That process is what builds the personal connection that keeps people coming back.

“This is huge for what that means for our audience team’s time and what they’re going to be able to do with responding to our audience’s needs every day,” she said.

The Texas Tribune’s chatbot does something similar, but from the reporting side. It was trained on the newsroom’s school voucher coverage. When readers asked questions the Tribune’s reporting hadn’t yet addressed, reporters got a new story out of it. The AI didn’t replace the journalism. It connected the newsroom to what readers wanted to know.

Any AI strategy that doesn’t free up staff time to focus on the work audiences actually care about is worth rethinking.

Paul Cheung and Burt Herman of Hacks/Hackers welcome the audience to AI x Journalism Day at SXSW. (Spencer Najera)

5. Learn to build things yourself

I’m not a software engineer. But using AI coding assistants, I built the prototype of a fact-checking research assistant for PolitiFact while poolside on vacation. It was a slog at first: Copy code from Gemini and drop it into VS Code. Find a bug. Copy it into Gemini. Find the solution. Rinse and repeat.

Agentic coding — with tools like Claude Code — makes that process seamless: Just click a few buttons and an AI agent will start building. With OpenClaw, I can even make tweaks to the tool with a few WhatsApp messages while walking my dog.

Trei Brundrett, founder-in-residence of New_Public, has been building things on the internet for decades, and said this moment reminds him of the early web, when the gatekeeping vanished and people could just build. AI has changed software development so much that his small team can put prototypes in front of communities, get feedback and rebuild in days.

You don’t need an engineering team to test an idea anymore. If you can describe what you want clearly enough, you can build a working prototype with tools like Claude, Cursor or Replit.

The newsroom people who pick up that skill are going to have an outsized advantage over the next few years. The ones who wait for someone else to build it for them will be waiting a long time.

There’s still time to register for Hacks/Hackers’ AI x Journalism Summit in Baltimore, May 13-14. As the organization’s AI ethics, governance and literacy partner, Poynter is sponsoring the Govern track, which will include lessons from our AI ethics guidelines, previous summits and research.

Alex Mahadevan is director of MediaWise, Poynter’s digital media literacy project that teaches people of all ages how to spot misinformation online.