What Otter’s AI vision means for news media companies
Conference Blog | 20 October 2025
When news executives visited Otter on Monday during INMA’s Media Tech & AI Week study tour in Silicon Valley, they found a company rethinking how spoken words can become lasting, searchable knowledge — and what that could mean for modern newsrooms.
Otter Founder and CEO Sam Liang and Enterprise Account Executive Elliot Rogers met with INMA study tour attendees to discuss how AI can transform voice data into institutional intelligence.
Their central message to the 40 study tour attendees: Every conversation is data, and every newsroom is sitting on an untapped archive.
Conversations as first-party data
“Our mission is to make conversations more valuable,” Liang said.
Founded in 2016, Otter was built on a simple idea: People have always learned and collaborated through speech, yet almost none of that knowledge is captured.

“People have been talking for 100,000 years … and most of the voice data was lost,” he said.
For journalists and news publishers, that observation lands close to home. Countless interviews, editorial meetings, and brainstorming sessions disappear once the call ends. Otter’s goal is to capture that communication layer and make it useful — a new form of first-party data that can strengthen institutional memory and accelerate collaboration.
From transcription to corporate knowledge
Otter built its own AI engine because “none of the other technologies on the market was working, including Google, Microsoft, [and] Nuance,” Liang said. Its first app in 2018 introduced live transcription; by 2020, it could join Zoom, Google Meet, or Teams calls automatically.
Yet Otter now sees transcription only as a foundation. “A lot of people thought Otter is just a transcription tool, but it’s actually a knowledge … base,” Liang said. The company’s platform lets teams store and search meeting transcripts through shared channels, breaking down silos between departments.
For newsrooms, this offers a model for unifying the fragmented conversations across editorial, commercial, and product teams — turning voice data into something that can be queried and analysed like web analytics or CMS data.
Unlocking unstructured data
Knowledge workers spend at least 30% of their time in meetings, and managers as much as 80%, Liang said.
“All that meeting data is unstructured,” he said, but AI can convert it into summaries, action items, and insights. He described voice data as a huge, hidden resource: “The voice is a channel that generates the most data these days in enterprises … all that voice data is actually all lost, which is a tremendous amount of waste.”
For news publishers facing shrinking traffic and tightening margins, the idea of reclaiming wasted knowledge has direct business appeal. Structured conversation data could power better collaboration, onboarding, and even training for generative-AI systems that learn from a newsroom’s own context.
AI meeting agents and avatars
Otter is now developing AI meeting agents that can participate in real time. “Our vision is that pretty soon AI can participate just like another human colleague,” Liang said. “It can join the conversation with you, do brainstorming … like actively talk in meetings.”
For media organisations, such tools could eventually pull up background data during editorial planning sessions or produce summaries of advertising calls automatically.
Liang also described personalised avatars trained on an individual’s voice and style. “It’s trained by using hundreds of meetings I’ve spoken in the past,” he said. “Sam’s avatar can almost talk just like me.”

He acknowledged the idea is early but suggested avatars might one day attend routine meetings or interviews on a person’s behalf.
Culture, privacy, and security
Despite the promise, Liang said adoption requires “culture change, behaviour change.” Large institutions, he noted, still see recording as a liability. “It usually happens bottom up,” he said, comparing the process to how employees once brought their own smartphones into work.
“Some of the laws are obsolete … created … pre-AI, pre-Internet,” and need to evolve to handle modern data realities, he said.
On security, Liang was blunt: “We have to take … [it] super, super seriously. We have SOC 2 Type 2 compliance … [and] HIPAA compliance.” Cybersecurity, he said, “is a huge industry by itself … There’s never best security; there’s always better security we have to do.”
Accuracy and human oversight
When INMA Product & Tech Initiative Lead Jodie Hopperton, who organised the study tour, asked about hallucinations and errors, Liang stressed that reliable transcription remains the base layer.
“It’s actually first and foremost most important to get it right first,” he said. “Otherwise, it’s hard to summarise things accurately if the initial words were wrong.”
Otter’s technology powered Zoom’s transcripts from 2017 to 2023 because of its accuracy, he said. The company trains models on diverse accents — “including myself,” Liang said — and uses contextual clues such as slide text to improve precision.
Users can also give direct feedback, edit transcripts, and label speakers. That human-in-the-loop approach, Liang suggested, mirrors how newsrooms must validate AI output before publication.
Practical newsroom use
Rogers demonstrated Otter’s live environment, showing how journalists can record interviews on mobile devices, extract quotes, and generate summaries and social-media snippets. Otter also offers transcription and translation across English, Spanish, French, Japanese, Portuguese, and German.
Many reporters use Otter personally before their companies formally adopt it, Rogers said. The ability to search across conversations helps teams quickly surface insights or verify facts across multiple interviews.
Legal and ethical frontiers
Liang was asked about ownership of personal avatars. “Maybe there’s some legal issues,” he answered. “Can we still use your personal agent or not? Or if we use your personal agent, should we pay you?”
For news publishers, that question cuts to the heart of AI ethics: Who owns the work of a digital version of a journalist, and how should it be credited? Liang acknowledged such issues are “interesting” and unresolved — but inevitable.
A call to see meetings as assets
As the discussion ended, Liang reflected that Otter’s technology is still in its early stages but rooted in a clear belief: Conversations themselves are data.
For news organisations experimenting with AI, that idea reframes the newsroom as a living information system — one whose meetings, interviews, and planning sessions contain as much value as the stories they produce.
Liang put it plainly: “Our mission is to make conversations more valuable.” For media leaders, that may now mean learning to treat every conversation as a potential data point in the future of journalism.