The news media industry has missed out on the opportunities created by several major disruptions, but learning from those mistakes could help it capitalise on what generative AI has to offer. During INMA’s recent Asia/Pacific News Media Summit, Stuff owner Sinead Boucher outlined some of the industry’s missteps while offering advice on how to use what it has learned to make better decisions moving forward.
“I remember the first computer being rolled into our newsroom for the Internet and only the librarian was allowed to use it,” she said. “Back then, we didn’t see how that was going to be relevant to our jobs. But then we started to be able to search for things and understand the part that news had in the search experience.”
After that came mobile, which forced the industry to change and adapt, followed by what has “had the biggest impact on our industry and broader society,” according to Boucher: the rise of social media and social platforms.
Now, as generative AI transforms digital, Boucher said the industry is at the cusp of another massive disruption. And there’s no more time to sit around and talk about what needs to be done.
“The thing that is different about generative AI compared to some of these other big waves of disruption in the past is the speed of the technological advances,” she said.
Since the introduction of ChatGPT in November 2022, new applications and developments have been rolling out daily. Already, she said, the Internet is filled with AI-generated content that is “bland, repetitive, and not well-written.” That makes the spread of disinformation easier and also places a bigger burden on the news media industry to ensure it is reinforcing truth and restoring trust amongst readers.
To do that, the industry must learn from the past.
“I think the best chance that we have of making the right decisions when we look ahead is to look back and see what we did in the last wave of disruptions,” Boucher said.
One of the industry’s first mistakes was that it was too slow to understand the significance of social media. At the same time, the industry adapted its content to suit the platforms of tech companies rather than looking out for its own interests.
“We did things like throw all of our content into social platforms, shape the type of content that we produced, change the way that headlines were done and the type of images to suit how those platforms worked — not necessarily to suit what our businesses needed or what our audiences needed,” she said.
“We helped build those businesses through those platforms through the value of our content.”
Instead of boosting the news media companies providing the content, this approach handed power to the social platforms. At the same time, public trust in the media was disintegrating, but the industry didn’t take notice — or action, Boucher said: “I think we were complacent about the level of public trust in us. We felt we had had it for so long that we would always have it and that we deserved it. And we were slow to react to the implications for us and for society on the attacks from bad actors on that trust.”
Managing new threats
Looking at generative AI through the lens of what the industry should have done with social media and Big Tech provides a better roadmap moving forward.
“When we look now at generative AI, I think there is a real risk for us as an industry and as our society that this becomes more about degenerative AI,” Boucher said, adding that social media has harmed elections, damaged the mental health of young people, and contributed to polarisation.
While generative AI can help businesses do their jobs faster and cheaper, it will also allow misinformation to spread faster — and appear much more believable.
“Already we are seeing a flood of pink slime Web news sites all over the Internet,” she said, citing a NewsGuard report that counted 347 sites that look and feel like legitimate news sites but are actually filled with AI-generated content pushing false narratives.
“I think there is a real sense that all of the risks and harms we’ve seen in the last 10 years are speeding up and the potential is for them to get much worse.”
That puts the news industry in a position to determine how AI is going to be used to generate journalism. News media must find ways to defend and protect their content and intellectual property while also embracing the technology.
“We need to be able to harness the power of this technology for ourselves in ways that we never did in the era of social and search,” Boucher said. “The success of generative AI products produced by tech companies is going to depend enormously on the high-quality journalism and content produced by us and other companies. And we are going to have to find ways to protect that.”
Already, some companies are taking steps to protect their content — or at least secure compensation when it’s used. She pointed to the deal AP struck with OpenAI to receive payment for content used to train OpenAI’s models. Other news media companies have written to the Big Tech platforms, stating they do not give permission for large language models to crawl and learn from their content. Still others are taking technical steps to block the bots from scraping their sites.
In some countries, news media companies are lobbying governments to secure regulations and legislation to govern the use of such technology.
“I think wherever possible as an industry, we have to stand together to collaborate where we can for the benefit of the industry,” she said. That includes sharing information and tactics on what’s working and ensuring that the value from that content goes back to the creators and publishers.
“We have to make sure really that it generates value for journalism because I do worry that this might be our last chance,” Boucher said. “If we don’t get it right in this current wave of disruption, I think that wave is going to wash right over us.”
What can go right?
While the news media industry must be aware of the dangers and the need to protect itself and its IP, Boucher said it’s also essential to look at how to harness and leverage generative AI. While it’s easy to think about what could go wrong, that shouldn’t come at the cost of recognising what could go right and how to get there.
For news organisations, that means protecting and defending their IP, harnessing the tools and technology available, and then creating business models, products, and content that add value to people’s lives.
“We’ve heard a lot about how this technology could potentially hold the secret to really tackling climate change, to finding revolutionary new medical treatments, to giving access to education and healthcare to those who otherwise would not have it,” Boucher said. “If those are the things that could go right, we need to think about what it would take to make them go and start to bring that into the work and the journalism we do on behalf of our communities.”