A lesson in deception in the era of fake news

By David Murphy

Mobile Marketing Magazine

London, United Kingdom


Last week, I was duped along with about 1,000 other people — big time.

Like them, I was attending the IAB United Kingdom’s annual Engage conference. It’s a day-long, usually entertaining, and thought-provoking event for digital marketers from brands and agencies. It is designed to send attendees away re-enthused and re-invigorated about digital marketing with lots of new ideas to put into practice. 

The last session before lunch featured Ken Fawes, editor-in-chief of The Cincinnati Sentinel, who was slated to give a personal take on the future of news.

According to the session blurb, “The Sentinel played an influential role during the 2016 U.S. presidential election in the critical swing state of Ohio,” adding that Fawes would showcase “how using modern storytelling techniques drew in a new audience” and “share those findings, offering valuable lessons for brands and publishers.”

Presentation proves even professionals can be duped.

Fawes kicked off with three statements.

  1. “The future of news is not about human journalism.”
  2. “News brands will only survive if they pin their entire business around speed, efficiency, and scale.”
  3. “AI (Artificial Intelligence) can be programmed to do anything a human can do, but better.”

He then showed delegates his entry on Wikipedia, quoting a section of it crediting his publication for playing a key role in influencing the result of the 2016 U.S. presidential election — “a fact I don’t dispute,” he noted.

Addressing an audience of “media types,” it was a ballsy introduction. 

Next, he gave an example of one article that both justified the claim his publication had influenced the election result and gave credence to his opening three statements. He showed a headline from the Sentinel: “Tens of thousands of fraudulent Clinton votes found in Ohio warehouse.”

He then went on to explain how the headline and accompanying story had been created. The votes, he explained, were for Cincinnati, not Ohio, but the Sentinel’s AI platform had calculated that a headline with the word “Ohio” in it would draw much more traffic than one with the word “Cincinnati.” The platform had also surfaced several pictures of warehouses in Ohio to use to accompany the story. 

The story had run, and, as Fawes explained, generated hundreds of thousands of views on the Sentinel’s Web site, without a single word being written by a human journalist.

“It’s all about the ability of algorithms to analyse multiple editorial feeds, understand the tone of voice to write articles in, and what images to use,” he said.  

I must admit, the claim that the story had appeared without a single word being written by a human journalist did leave me somewhat confused. But everything else Fawes had said sounded perfectly plausible. 

My confusion was short-lived, however, as a member of the audience raised his hand to ask a question and, having been given a microphone, proceeded to out Fawes as a fake.

He wasn’t real and the Cincinnati Sentinel wasn’t real, the questioner claimed. And, at that point, as he approached the stage to take Fawes to task, I — and no doubt many others familiar with the IAB’s sometimes irreverent and innovative approach to event management — realised the whole thing was a ruse, a stunt.

The man with the microphone then took to the stage to reveal to everyone else that “Ken Fawes” was an actor playing a pretty convincing part.

And as Fawes — whose name is an anagram of “fake news” — exited the stage, the questioner revealed himself to be none other than David Walsh, the Sunday Times Sports correspondent whose investigative journalism led to the downfall of U.S. cyclist Lance Armstrong. In 2012, Armstrong was eventually stripped of his seven Tour de France titles as a result of Walsh’s revelations about the part banned substances had played in his triumphs. 

For the next 20 minutes or so, Walsh gave a brilliant, impassioned speech. It covered the Armstrong investigation but also focussed on the importance of investigative journalism as a whole.

He was followed on stage by Clare Rush, chief revenue officer at MailOnline, who made her own impassioned plea for the advertisers and agencies in the room to support those newspapers investing in quality journalism by investing in them (with their ad dollars, though that part was implied rather than voiced). 

So, this was a very clever stunt with a very serious and well-conveyed message about the importance of investigative journalism in holding people accountable and exposing the truth.

What struck me more than anything, however, was the plausibility of “Ken Fawes’” words, in particular those in the opening three statements: The future of news is not about human journalism; news brands will only survive if they pin their entire business around speed, efficiency, and scale; and AI can be programmed to do anything a human can do, but better.

In a world where the leader of the free world tries to convince anyone who will listen that respected news outlets are mere peddlers of fake news, Walsh’s speech was a welcome reminder about the power and value of real journalism, carried out by real people asking awkward questions.

But the stunt itself was an equally valid reminder that, in a world where we hear with every passing day about AI, Augmented Reality, Virtual Reality, and Mixed Reality, separating fact from fiction is only going to get harder for all of us. 

