AI brings many challenges to the newsroom

By Sarah Schmidt


Brooklyn, New York, United States


Journalists and media leaders can’t stop thinking about the repercussions of the recent advances in generative AI. Most have mixed feelings, and the challenges are real and immense. 

Karen Silverman, CEO and founder of the Cantellus Group, spoke last week at the INMA World Congress of News Media. Silverman is a technology governance specialist with expertise in AI and a member of the World Economic Forum’s Global Future Council on the Future of Artificial Intelligence.

The concerns she hears from journalists echo some of those she hears from leaders in banking and finance about security and automation, but journalists are also in a special position with particular responsibilities. 

“The advanced technologies that we are producing today put extreme pressure on distinguishing between data, information, and knowledge,” Silverman said.  

Most media people are excited about the potential AI has to improve efficiency — and newsrooms already commonly use AI to reduce repetitive tasks, like transcribing interviews, aggregating news, and producing summaries. There are also plenty of opportunities to entertain and expand immersive storytelling techniques.   

But there is tremendous worry about the dangers generative AI poses to the work of journalists. Not only is AI capable of generating content that can compete with the work of actual journalists, but it can also be used to disseminate misinformation and create deep fakes that are difficult to recognize. 

For all the worry and excitement surrounding AI, one element of the issue has gotten too little attention, Silverman said: Navigating the new world will require more genuinely human effort and traits like curiosity and critical thinking.  

“We have our work cut out for us,” she said.  

The latest iterations of AI are at heart very powerful prediction engines. They are software models that produce outputs based on what is put in. The outputs depend on the training data: content created by humans, complete with human biases and errors.  

“It’s not done in isolation. It’s part of a system that very much involves people.” 

Humans do and always will play a critical role in curating information and making sense of it.

“The reality is that we’re in the loop the whole way. We select the data that goes in … and we really do decide how to use the outputs,” Silverman said.

AI has no knowledge 

When you think of generative AI, you might think it has a lot of knowledge. But what AI has is information, not knowledge, Silverman said.  

The distinction is subtle but crucial: AI has information about, say, an apple, but it doesn’t “know” what an apple is. It can produce an accurate image of an apple if asked, but it’s never tasted an apple, it’s never smelled or held an apple, and it doesn’t care what an apple is.  

AI also lacks the human experiences that influence what we ask of it, Silverman pointed out. It has never been a child or missed a train or had its heart broken — and all of these basic things about the human condition influence the information that goes into it and the tasks that it’s asked to do.  

Journalism is created by human beings who do have these experiences, and that’s part of why journalists have a critical role to play in this distinction between information and knowledge. Journalists will continue to shape the outcomes of the information AI produces. But AI presents tremendous challenges to journalists across many dimensions.

Truth: Is seeing believing? 

Generative AI has created an authenticity crisis, and what people see and hear and read is becoming a less reliable way to learn the truth. This is going to put a huge emotional strain on all of us. 

“We’re not equipped for a world in which we can’t rely on what we see and hear,” Silverman said. Deep fakes and misinformation pose particular challenges, and it’s incumbent on journalists to work hard to discern what the truth really is. Strong newsroom leadership will be crucial as this issue continues to shift. 

Trust is a closely related issue, and journalists will need to use skepticism and vigilance to verify the truth and earn people’s trust. “News media is going to have to constantly orient and reorient on this,” Silverman said.

Errors and biases 

Similarly, the potential for AI to make errors will require attention. Small errors in the data set can have big repercussions — depending on how the outputs are used — and it will largely be up to journalists to root out the source.  

Bias is also inherent in the information that trains generative AI. Most models were trained overwhelmingly on Western data, which is a large part of the problem, and that data spans years of content that is itself filled with bias.  

Our language is also lagging: labels like “user” and “consumer” fail to capture current reality, and journalists will need to play a role in moving language forward. 

“Critical thinking about word choice has never been more important,” Silverman said.  

Moving forward, the challenge of generative AI isn’t so much that it will replace journalists as that it will require them to be agile and work harder to tell the truth and earn people’s trust. Journalists and media leaders will need human agency, curiosity, and critical thinking to combat the potential dark side.

