AI builds capacity, but what about internal trust?
Newsroom Innovation Initiative Blog | 11 February 2026
I’m just back from a week in Austin, Texas, where I had the privilege of helping run an AI event for local news organised by our brilliant friends at Blue Engine Collaborative.
We talked about audience growth and engagement, revenue growth and diversification, digital products, efficiency, data, and insights. We tossed around ideas for AI testing as well as for product development. We discussed community-listening projects and creating stories for different platforms.
One interesting finding: Why do AI projects commonly fail at news companies? The single biggest reason an AI project dies is internal resistance from staff.

In other words, it is about trust.
As anyone who has ever played with GenAI knows, this is understandable: our employees are worried about the implications for their credibility, their jobs, the environment, and their closely guarded sources. (And, as one participant pointed out, we should actually be glad there is a lack of trust and some skepticism, because questioning a technology is better than blindly adopting it.)
How, then, do we build trust?
Some interesting tips emerged from a smaller group discussion around this:
Building internal trust in AI is hard without a solid policy. Clear direction from leadership helps build trust.
Create, and communicate to your newsroom, a decision tree for AI use that aligns with your mission and values. This is easier than writing a prescriptive policy that needs regular updating.
Be clear with colleagues about how AI is being used. For example, make it clear it is a companion to written work, not a replacement. And be prepared to put in time and effort to publicise this: a lot of evangelising has to take place.
Be very specific about how you are using AI when it comes to disclosure for external audiences. For anything for which AI was substantially used that gets in front of your audience — not research or backdesk editing, but consumer-facing products — you need to tell the audience. That’s where the relationship and trust are built.
There is a difference between AI-generated and AI-assisted. The latter means lower legal barriers and less resistance from the newsroom as well.
We need clear processes for catching what can go wrong. For AI-created content or headline suggestions, build in layers of evaluation so dubious content is flagged.
Don’t necessarily call it AI all the time. It can be counterproductive. For example, an algorithm for content promotion or sentiment analysis is an old use case that now faces pushback when it is called AI. But be aware of the concern that underlies this: A lot of AI tools feel like a black box rather than an algorithm that has been designed with explicit rules that can be explained.
Find pain points and use AI to solve them. Find stuff journalists don’t like doing, like metadata tagging. Get newsroom members involved in designing the solution.
Value is what drives adoption. If you can prove the value of a tool in daily work, your colleagues will be more likely to use it.
Involve your staff in drafting AI policy. Hear their voices, address their concerns — and get their buy-in.
Concerned about the environment? Use AI judiciously. To ease the tension between newsroom coverage of environmental issues and our use of tools that contribute to those issues, we have to be mindful about how we use them. Maybe we don't need AI for everything: measure where it is actually proving its value. We can also look for cheaper models that use less energy and have better privacy controls and governance.
Do not use external applications for stories involving sensitive sources. Depending on where you live, records of conversations with sensitive sources on Otter, Slack, and other applications can be subpoenaed and made public.
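The decision-tree tip above can be sketched in a few lines of code. This is a minimal illustration only, with hypothetical questions and outcomes; a real newsroom would replace them with questions drawn from its own mission and values.

```python
# A minimal sketch of an AI-use decision tree for a newsroom.
# The questions and outcomes here are hypothetical placeholders,
# not a recommended policy.

def ai_use_decision(is_audience_facing: bool,
                    human_reviewed: bool,
                    involves_sensitive_sources: bool) -> str:
    """Walk a few yes/no questions and return guidance on AI use."""
    if involves_sensitive_sources:
        # External AI tools can be subpoenaed; keep sources off them.
        return "Do not use external AI tools"
    if not is_audience_facing:
        # Research, transcription, backdesk work: lower stakes.
        return "OK for internal use"
    if human_reviewed:
        # Consumer-facing AI-assisted work requires disclosure.
        return "OK with disclosure to the audience"
    return "Not allowed without human review"
```

The point of the decision-tree form is that each branch can be explained and defended, which is exactly what an opaque tool cannot offer.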
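The layered-evaluation tip can likewise be sketched. This is an illustration with two hypothetical checks; real layers would add fact-checking, style rules, and ultimately human review before anything is published.

```python
# A minimal sketch of layered evaluation for AI-suggested headlines.
# The two checks below are hypothetical examples of rules a newsroom
# might define; anything flagged goes to a human editor.

def check_length(headline: str) -> list[str]:
    """Flag headlines that exceed a house length limit."""
    return ["too long"] if len(headline) > 90 else []

def check_clickbait(headline: str) -> list[str]:
    """Flag phrasing the style guide forbids."""
    banned = ("you won't believe", "shocking")
    if any(phrase in headline.lower() for phrase in banned):
        return ["clickbait phrasing"]
    return []

def evaluate(headline: str) -> list[str]:
    """Run every layer; a non-empty result means send to human review."""
    flags: list[str] = []
    for layer in (check_length, check_clickbait):
        flags.extend(layer(headline))
    return flags
```

Each layer stays small and explainable, so editors can see why a suggestion was flagged rather than trusting a black box.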
You can read more about the event here.
If you’d like to subscribe to my bi-weekly newsletter, INMA members can do so here.
Banner photo: Adobe Stock By itchaznong.