AI is changing the editorial landscape in Africa

By Paula Felps


Nashville, Tennessee, United States


As AI continues evolving at a breakneck pace, newsrooms around the world are looking at how to leverage it — while simultaneously grappling with some of the new ethical questions it raises. During this week’s Webinar, INMA members got a closer look at the possibilities and challenges AI holds for the news media industry.

The Webinar, “AI and the future of editorial practices in Africa,” offered far-reaching insights into how AI can be used, what’s in the works, and what news media companies need to consider for the future. Harvey Binamu, tech officer of Magamba Network in Zimbabwe, and Dean Arnett, a documentary producer formerly with BBC Africa, provided a fascinating overview of the current and future AI landscape.

“AI is a tool,” Binamu reminded members. “The choice about how it gets deployed is ours.”

He noted that the four key areas where AI is being used in his newsroom are:

  • Content and news gathering.
  • Content processing.
  • Content/news distribution and audience engagement.
  • Changing editorial practices and newsroom structures.

By implementing AI, Magamba Network has streamlined its editorial workflow, creating a new process Binamu outlined.

Magamba Network used AI to streamline its editorial workflow.

While AI helps in the process, there’s still a strong human element involved. For example, stories are commissioned and written by a person, then ranked for SEO and given a title using technology. A human editor re-enters the workflow to provide feedback on the results and gives the green light to publish the story. From there, technology measures the engagement and provides feedback on its performance.
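The human-in-the-loop workflow described above can be sketched in code. This is purely an illustrative outline — the `Story` structure, function names, and scoring logic are hypothetical stand-ins, not Magamba Network’s actual system:

```python
# Hypothetical sketch of a human-in-the-loop editorial pipeline.
# All names and logic here are illustrative assumptions, not a real system.

from dataclasses import dataclass

@dataclass
class Story:
    draft: str              # written by a human reporter
    seo_score: float = 0.0  # assigned by an automated ranking step
    title: str = ""         # suggested by an automated titling step
    approved: bool = False  # set by a human editor before publication
    engagement: int = 0     # measured by analytics after publication

def rank_for_seo(story: Story) -> Story:
    # Placeholder: a real system would call an AI/SEO service here.
    story.seo_score = min(1.0, len(story.draft) / 1000)
    return story

def suggest_title(story: Story) -> Story:
    # Placeholder: use the first sentence, trimmed, as a stand-in title.
    story.title = story.draft.split(".")[0][:60]
    return story

def editor_review(story: Story, approve: bool) -> Story:
    # The human editor re-enters the workflow and gives the green light.
    story.approved = approve
    return story

def publish_and_measure(story: Story) -> Story:
    # Only approved stories are published and measured.
    if story.approved:
        story.engagement = 100  # stand-in for real engagement analytics
    return story

story = Story(draft="Voters ask where to cast ballots. Turnout rises.")
story = rank_for_seo(story)
story = suggest_title(story)
story = editor_review(story, approve=True)
story = publish_and_measure(story)
```

The key design point, as Binamu described it, is that the automated steps never bypass the editor: nothing is published until the human approval step runs.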

Binamu shared an example of how this changed how an election was covered: originally, Magamba offered a way for users to search for their candidate by area. What the data told them, however, was that people were more interested in finding out where to go to vote. Based on that feedback, the company adjusted its information to match the searches — and saw engagement double.  

Addressing the challenges

Many of the challenges posed by AI could be applied to any change; gaps in knowledge mean that publishers need to provide training, and access is limited by the financial resources each company has available. Because it is so new, business strategies are being created on the fly: “We are figuring out how to integrate it into our businesses and focusing on whether we’re trying to grow or get better revenue,” Binamu said. “So it’s shaping up where we are learning as we go.”

One of the biggest universal fears is what AI means to human workers; Binamu said that although it has created feelings of job insecurity, that fear is greatly exaggerated.

“AI is a multiplier,” he explained. “It’s a tool you use to become more efficient and perhaps look for stories beyond your traditional [coverage].”

Cultural resistance is another hurdle for publishers to clear, and Binamu said this resistance typically falls along generational lines. Younger generations often prefer tablets and phones, while older workers opt for laptops and desktops. This creates a cultural clash.

To get around such a clash, he advised getting everyone in the same room and explaining the goal and what tools will be used. Then, explain how the tools might work better on their laptop or phone — even if it’s not something they’re immediately comfortable with.

“Get everyone on the same page to bridge the gap between the paradigms and just focus on the goal,” he suggested.

AI offers incredible possibilities that are transforming news media organisations.

A promising future

While there are plenty of challenges, the opportunities are abundant, too. Binamu identified what he sees as the three most significant opportunities that AI is ushering in: innovation, personalisation, and productivity. 

“We have new ways to tell stories,” he said. For example, AI can break down a two-hour meeting and quickly locate soundbites and main themes. It can tell the story in new ways, such as creating something suited for a younger audience on TikTok.

“It’s about who are we targeting and how we are targeting them. We have big chunks of targeted individuals, but we hope to [use AI to] break it down to something more specific,” Binamu explained.

And finally, AI’s contribution to productivity can’t be overlooked: “AI makes things faster,” Binamu said. “That becomes one of the key pillars to sell to the individuals using them. You’re not going to lose your job. You’ll do things faster and more efficient.”

Using AI as an ally

Arnett agreed that the big concern in newsrooms is whether AI will replace human workers. Taking action now and preparing for the present and future of AI is the best way to prevent that from happening.

“If we bury our heads in the sand, then that is more likely to happen because AI is developing and evolving incredibly fast,” Arnett said. “And if we ignore it, there’s going to be a tipping point where AI will take over — not in a kind of sci-fi sense — but there will be tools that will become so powerful, it will make sense to replace people with AI systems.”

The sooner news media companies embrace AI, the more control they will have over it, said Dean Arnett.

A smarter approach is to adopt AI systems now and figure out how to fit them into existing workflows to create better content, he said.

While much of the buzz in newsrooms has revolved around written content, Arnett focused on the image and video capabilities of AI. While image generation has evolved incredibly fast, he said video generation is only beginning.

“AI is being used in all kinds of ways in video generation right now, but it looks rough and you can tell it’s AI,” he said. However, “AI video generation is moving faster than any other aspect of AI-generated content” right now. Publishers need to start looking at the available tools and determine where they fit into existing workflows.

The array of tools currently available can help in many areas of content production.

Current AI tools can work as assistants, either for individuals or teams. For small digital publishers who worry about being able to use resources efficiently, AI tools can help. They can improve efficiency and even advise on strategic decisions.

There are also AI tools that can help with research and find supplemental information for stories and videos. And, when that massive amount of information is compiled, AI can help find what’s relevant to a specific project.

And, reiterating Binamu’s assurance that AI isn’t here to take jobs, Arnett clarified that working with AI is about redistributing people to places that can have a bigger impact on an organisation’s output: “The problem with AI is it creates content which lacks humanity. So the more people we get at the stage where we’re outputting this content, putting that human touch in our material, the better.”

As more AI is being used, publishers will also need to have technology on board to detect when content that’s being used as source information has been generated by AI. The challenge is that as AI gets smarter, it produces content that can evade AI-powered detection tools: “If we understand those tools exist, we can have the biggest impact with our output.”

In addition to tools for creating content, publishers need to look at related tools for such things as detection and analytics.

Publishers who are entering the AI environment have three main issues to be aware of, Arnett said:

  1. Copyright.  “If you are using AI tools to generate media, then who owns the copyright? This is a very controversial topic,” he said. As of now, there are no clear-cut answers, but news media companies need to follow this closely.
  2. Deep fakes. In just a few months, AI has mastered the art of deep fakes, and that’s something publishers must be aware of: “We need to be able to check if the material we are being sent [is] real or if it has been generated by AI.”
  3. Wider editorial issues. Each company must develop a policy on whether it will publish synthetic or AI-generated content. Publishers run the risk of audiences thinking the material they’re running isn’t “real” if it is generated by AI. “And if they think that the material you are creating isn’t real, that undermines trust with them in you and as a news organisation, as a current affairs organisation. Can you afford that?”
