Last month, the Digital News Initiative, a European Google fund, announced it had invested more than €600,000 in a Press Association (PA) news project involving robot journalists that could write 30,000 stories per month. Yes, 30,000!
The Reporters and Data and Robots project, nicknamed “RADAR,” promises to benefit traditional media outlets and independent blogs focused on local journalism. Robots will first collect data from official sources. They will then evaluate everything and, magically, the information collected will be turned into reports on health, safety, or employment, for example.
The whole thing gets even more controversial when a PA publisher says the project also involves talented human beings (five of them, in this case) who are responsible for editing and curating. To put this in perspective: Zero Hora’s website publishes at least 12,500 stories per month, produced and edited by more than 200 of its journalists.
Pete Clifton, the editor of the PA news agency, said in an interview that these five talented journalists will be vital to the process. He also said RADAR will create local stories in a volume that would be “manually impossible.” That simple.
I would like Clifton to explain to me how these five journalists will be able to edit, correct, or approve the texts produced by the beloved robots. Even working 24 hours a day, every day of the month, each of them would need to edit at least eight texts every 60 minutes.
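The arithmetic behind that claim can be checked in a few lines. This is a back-of-the-envelope sketch, assuming the article's figures of 30,000 stories per month, five editors, and a 30-day month:

```python
# Back-of-the-envelope check of the per-editor workload.
# Assumptions (from the article): 30,000 stories/month, 5 editors,
# working around the clock for a 30-day month.
stories_per_month = 30_000
editors = 5
hours_per_month = 30 * 24  # 720 hours, with no breaks at all

stories_per_editor = stories_per_month / editors      # 6,000 stories each
stories_per_hour = stories_per_editor / hours_per_month

print(f"{stories_per_hour:.1f} stories per editor per hour")
# That is roughly 8.3 texts every 60 minutes, matching the "at least eight" figure.
```

Any realistic schedule (eight-hour shifts, weekends off) only pushes the per-hour figure higher.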
You do not have to be a math genius to see that human involvement in the robots’ work will be dangerously shallow, if not impossible. Beyond that, it is hard to believe this data could give local outlets information beyond what they already produce.
First, those who consume local content are usually more interested in curation and analysis. Human curation and analysis, that is.
Also, anyone who knows a little about journalistic practice knows that covering the agenda of a local government, for example, involves research, relationships with sources, and in-depth interpretation, none of which is likely to emerge from an automated feed.
Am I crusading against automated journalism? No. It can be useful, especially in evaluating and extracting complex data (as PA itself already does) and in helping journalists digest information.
But robots (like some humans, I know) are incapable of detecting fictitious information, of doubting, of distrusting, of provoking, of looking the other way, of squeezing the data and combining insights, of finding the hole and digging further, and deeper.
The heart of journalistic production is fundamental to our democratic society. I’m sorry, Google, but that heart will never pulse in your robots.