Generative AI is going to play a role in the news media industry, but what role that is — and whether it’s beneficial or not — remains to be seen.
During the recent INMA South Asia News Media Summit, panelists discussed how AI is affecting the industry and what they predict is coming.
Pawan Agarwal, deputy managing director for Dainik Bhaskar, noted every new technology that has been introduced has brought greater efficiency, and he expects generative AI to do that as well. It can help on the content side by performing tedious tasks such as tracking down links, but he still has many questions about it: “How should one integrate it, how one should leverage it in our newsrooms, and what’s that thin line between quality journalism done using generative AI versus just handing it over to generative AI?”
Satyan Gajwani, vice chairman of Times Internet, agreed, saying it can improve workflows and bring greater efficiency, but he wonders how companies will ensure the integrity of their brands.
“Ultimately our brand stands for trust and a lot of that trust comes from the humans and the people that we invest a lot of energy to build as our teams,” he said. “I think it would be a little irresponsible and dumb for us to chase a few extra pageviews by putting out stuff cheaper or faster [using generative AI] and compromise the integrity of what our brands are meant to stand for.”
News publishers are just starting to explore how it will be used, and Praveen Someshwar, managing director and CEO of HT Media Ltd, said the key is going to be how to maintain a balance by using technology to assist humans, not replace them.
“We will play around with this technology as we go on. It’s an exciting time; we don’t know where it’ll end,” he said. “You have to just keep playing around doing the right things, in an assisted manner, and never lose out on the human touch as you go through this journey.”
Preparing for change
Jaspreet Bindra, founder and CEO of The Tech Whisperer, a U.K.-based company that provides thought leadership around technology topics including generative AI, moderated the panel. He pointed out the news media industry was broadly disrupted by digital technology, mainly social media platforms and search. He asked the panel what they have learned from that experience and what the industry should do to prepare for the new set of changes that are imminent.
“I think this is the time publishers and platforms need to sit together, probably have long-term discussions,” Agarwal responded, saying partnerships with platforms will allow publishers to invest in their newsrooms and quality journalism whilst providing the content that powers the platforms.
While publishers won’t be building their own platforms for generative AI, they will be a source of content — and now is the time to discuss what that will look like.
Unlike previous platforms, like search and social, AI is looking for truth and answers, Gajwani said, which makes publishers’ content more valuable.
“Publishers probably are more important in this situation than they were in others in terms of being disseminators of truth,” he pointed out. “In a world where there will be so much fake content, it’s probably the right time for us to start … having those conversations to figure out what is the right way for us to play our role in the value chain and the overall ecosystem of what comes out.”
Bindra asked if the publishing industry should band together and create its own large language model/generative AI model. Someshwar said it would be a waste of time: “I think there are many fabulous use cases coming out of it. Let’s first focus on the use cases versus building our own products.”
The ethics of AI
Bindra noted generative AI has many potential ethical ramifications, such as plagiarism and fake news. He asked the panel what role the news media industry will play in addressing some of those issues.
While it does bring several risks, such as plagiarism, Someshwar said those same tools can be used for good, assisting editors in creating content that is fact-checked. But it will remain the responsibility of the news media to stay alert and aware.
“It is up to us to champion it, to work ahead of the curve to make sure that we are ensuring that these AI tools and the big tech who are responsible for them are partnering with the publishers across the world so we provide that element of trust, which is what we are known for as an institution or as organisations,” he said.
While Gajwani predicts systematic solutions will be developed to thwart fake news, that will take time, and he’s more concerned about the immediate fallout:
“I do think that you’re going to see more stresses on authenticity and trusted content for the next few years. And my hope is that globally this gets solved through some sort of technology solution soon after that,” he said. “I think in 10 years it won’t be an issue because there will be solutions to establish what’s authentic and what’s not.”
Agarwal expressed the most concern over the entry of generative AI into the newsroom, saying news institutions have built trust over several hundred years and he’s concerned AI could muddy the waters and undermine the public’s faith in the institution of journalism.
For that reason, he is cautioning newsrooms against using any tools that aren’t yet mature and proven. That means not letting machines do the fact-checking, but verifying information the way it has been done for decades, he said.
“I’m sure the technology will develop and will give us better solutions,” Agarwal said, admitting he is still sceptical about it. “So we are keeping it away from our newsrooms for now. It’s like playing with fire.”