It was 25 April 2011. Antonio García Martínez, a newly hired employee at Facebook, attended his onboarding session on Page Mill Road in Palo Alto, California. The speaker was Chris Cox, the head of product.

The meeting must have been intense, as five years later Martínez was able to describe it in detail in his memoir, Chaos Monkeys, published by HarperCollins.

A number of books offer clues about what the future of media might hold.

“What is Facebook? Define it for me,” Chris Cox said to the people in the room.

“It’s a social network.”

“Wrong! It’s not that at all.”

Cox scanned the audience for another answer, until he got the one he was looking for: “It’s your personal newspaper.”

“Exactly! It’s what I should be reading and thinking about, delivered personally to me every day,” Cox said.

“Facebook was The New York Times of You, Channel You, available for your reading and writing, and to everyone else in the world as well, from the Valley VC to the Wall Street banker to the Indian farmer plowing a field. Everyone would tune in to the channels of their friends, as people once clicked the knob on old cathode-ray television sets, and live in a mediated world of personalised social communication.

“That the news story in question was written by The Wall Street Journal was incidental: your friend Fred had posted it, your other friend Andy had commented on it, and your wife had shared it with her friends,” Martínez wrote in his book.

“Here was the first taste for the new Facebook employee of a world interpreted not through traditional institutions like newspapers, books, or even governments or religions, but through the graph of personal relations. You and your friends would redefine celebrity, social worth, and what should be churning through that restless primate brain all day,” he wrote.

Martínez was fired from Facebook in 2013. When the book was published last summer, the company declined to comment on any of its claims.

True or not, I recalled this anecdote when Cox, still at Facebook and now carrying the title of chief product officer, recently and publicly denied that Facebook sees itself as a media company.

Speaking at The Wall Street Journal’s conference, Cox said, “Facebook is a technology company focused on building tools, not a media company focused on making stories.” 

Cox’s remarks were repeatedly echoed by his CEO, Mark Zuckerberg, and Facebook’s denial of a media identity became a story in its own right in 2016.

Facebook claims it is not a media company, but some would argue that is not the case.

Facebook and the content trap

“The language for success in media, as in technology, is less and less about content and more and more about connections,” claimed Professor Bharat Anand of Harvard Business School in his book The Content Trap, published in late autumn by Random House.

Professor Anand took readers on a trip to Scandinavia to analyse the digital transformation of Schibsted, one of the most digitally advanced news publishers in the world.

He found that the Internet didn’t really kill news. It destroyed the classifieds business that had subsidised newsrooms and helped newspapers establish local monopolies.

“Where news organisations went wrong [in digital transformation] was not in failing to deliver faster, cheaper, better news online,” argued Professor Anand, “but in failing to protect the classifieds subsidy or to profitably manage its migration online.”

Schibsted fared far better than most: It launched its first classifieds site in Norway in 1999, then expanded internationally. By the end of 2015, it operated marketplaces in 28 countries. The classifieds business brought in US$704 million, or 37% of the group’s operating revenues and 81% of its profits.

The case of Schibsted and other networked businesses inspired Professor Anand’s general thesis of The Content Trap: “In content worlds, we focus on the actions, tastes, or behaviours of consumers in isolation rather than on what connects them; we focus on making the ‘best’ content rather than on what makes users share.”

Facebook itself might be the ultimate proof of the thesis: It became the biggest newspaper in the world, with 1.79 billion users, by focusing on user connections and certainly not on the “best” content.

“Content has been a curse,” said Scott Cook, a co-founder of the tax software giant Intuit, as quoted in the book. “It causes you to think you can make what’s going to delight customers. It causes you to ignore user contributions. It causes you to focus on your own content rather than on how to get the best content in the world — content anyone can make.”

By coincidence, a month after Professor Anand’s book hit stores, Donald Trump was elected president of the United States. Liberal America howled and blamed Facebook for the rise of fake news. Suddenly, “content anyone can make” itself sounded like the trap.

Fake news migrations

Facebook’s engineers who were losing sleep over fake news could find inspiration in Wikipedia. As a primer on platform design, I would suggest Platform Revolution, a book by Geoffrey Parker, Marshall Van Alstyne, and Sangeet Paul Choudary, published last spring by W.W. Norton & Company.

“When it was launched, Wikipedia aspired to a condition of complete openness. Maintenance of quality would be entrusted solely to the users of the platform, who would take it upon themselves to monitor the content of the site, fix errors, and challenge biases,” wrote the research fellows at the MIT Initiative on the Digital Economy.

They concluded: “This was a utopian vision that assumed good intentions on the part of all Wikipedia users; or, a trifle less idealistically, it assumed that the varying, sometimes conflicting, motivations and attitudes of users would eventually balance one another, producing content that represented the combined wisdom of the entire community, much as, in capitalist theory, the ‘invisible hand’ of the market is supposed to maximise the benefits for all through the interaction of countless self-interested participants.

“However, reality teaches us that democracy — like free markets — can be messy, especially when intense passions and partisanship are involved.”

Curation was a solution. It usually took the form of screening and feedback at critical points of access to the platform: Screening decided who would be let in, while feedback encouraged desirable behaviour from those who had been granted entry.

The key choice was whether to train and employ human moderators or to rely on the users themselves to curate the platform. Wikipedia and the authors of Platform Revolution favoured user-driven curation, facilitated by software tools that gathered, aggregated, and applied curation decisions.
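To make the mechanics concrete, here is a minimal sketch of such user-driven curation in Python. The reputation tiers, weights, and threshold are my own hypothetical choices, not anything the book or Wikipedia prescribes:

```python
# A minimal sketch of user-driven curation: gather flags from users, aggregate
# them by flagger reputation, and apply the decision. All names and numbers
# here are hypothetical illustrations.
from collections import defaultdict

TRUST_WEIGHTS = {"new": 0.2, "established": 1.0, "trusted": 2.0}  # hypothetical reputation tiers
FLAG_THRESHOLD = 3.0  # hypothetical weighted-flag total needed before a post is demoted

flag_weights = defaultdict(float)  # post_id -> accumulated flag weight

def record_flag(post_id: str, flagger_tier: str) -> None:
    """Gather and aggregate one user's curation decision."""
    flag_weights[post_id] += TRUST_WEIGHTS.get(flagger_tier, 0.2)

def is_demoted(post_id: str) -> bool:
    """Apply the aggregated decision: demote once weighted flags cross the threshold."""
    return flag_weights[post_id] >= FLAG_THRESHOLD
```

Weighting flags by reputation is one simple way to stop a handful of fresh accounts from silencing legitimate content while still letting the community do the moderating.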

User-driven curation was also the route Facebook chose to combat fake news, but it was not the only option.

Although Facebook laid off the editors who curated Trending Topics in August, some big social networks still counted on the “human element.”

LinkedIn not only had a team of 25 human editors but even named Fortune’s Daniel Roth as the platform’s executive editor. The team’s task was defined as “creating, cultivating, and curating.” Luckily for Roth, his platform’s users tended to be much more careful about what they said, as they thought of LinkedIn as the office and of Facebook as home.

In Roth’s metaphor, Snapchat would be the disco.

The trendiest platform worldwide secured a largely fake-free environment by adopting a design very different from Facebook’s. Firstly, the app had two distinct sections: one contained user-generated content, while the other (“Discover”) carried news and entertainment from vetted publishers. Secondly, user-generated posts expired and disappeared once they had been viewed, so they could not spread endlessly.

Thirdly, posts were displayed in chronological order, not ranked by a personalised algorithm optimised for popularity and virality. That was exactly what Facebook’s algorithm did: It optimised the News Feed mainly for engagement, and the company did not vet the sources.
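The contrast between the two designs can be sketched in a few lines. The posts and the engagement formula below are invented for illustration and are not Facebook’s actual model:

```python
# A minimal sketch of chronological vs. engagement-ranked feeds;
# data and scoring formula are hypothetical.
from datetime import datetime

posts = [
    {"id": "scoop", "posted": datetime(2016, 12, 1, 9, 0), "likes": 12, "shares": 1},
    {"id": "viral", "posted": datetime(2016, 11, 28, 8, 0), "likes": 950, "shares": 340},
]

# Snapchat-style: newest first, no popularity signal.
chronological = sorted(posts, key=lambda p: p["posted"], reverse=True)

# Facebook-style: rank by an engagement score, so an older viral post
# can outrank fresh news, whatever its source.
engagement_ranked = sorted(posts, key=lambda p: p["likes"] + 3 * p["shares"], reverse=True)

print([p["id"] for p in chronological])      # ['scoop', 'viral']
print([p["id"] for p in engagement_ranked])  # ['viral', 'scoop']
```

Under the chronological rule the fresh post leads; under the engagement rule the viral one does, regardless of where it came from.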

A few weeks ago, at an INMA webinar, I interviewed Espen Egil Hansen of Schibsted’s Aftenposten, who famously challenged Facebook in September over its censorship of a news photograph. “An algorithm is not neutral,” he said. “[Facebook’s] algorithms are tuned for engagement, and people are building biases into them.”

Hansen said he believed the news media should embrace algorithms but optimise them to reflect values of their own choosing; for example, “truth.” If needed, publishers should even aspire to launch a “third platform” to compete with Google and Facebook. All this should be done regardless of whether regulators step in.
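One way to read Hansen’s suggestion is as a re-weighting of the same ranking score. The credibility table and the truth_weight parameter below are hypothetical assumptions of mine, not his or Facebook’s method:

```python
# A minimal sketch of tuning a ranking toward a chosen value such as "truth";
# sources, scores, and weights are hypothetical.
SOURCE_CREDIBILITY = {"vetted-newsroom.example": 1.0, "anonymous-blog.example": 0.2}

def score(post: dict, truth_weight: float = 2.0) -> float:
    """Blend engagement with source credibility; raising truth_weight shifts
    the feed away from virality and toward verified sources."""
    engagement = post["likes"] + 3 * post["shares"]
    credibility = SOURCE_CREDIBILITY.get(post["source"], 0.5)
    return engagement * credibility ** truth_weight
```

With truth_weight set to zero, the feed collapses back to pure engagement; raising it lets a publisher’s values, not virality alone, decide what surfaces.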

Interestingly, other platforms spotted Facebook’s vulnerability and started consciously differentiating themselves by the way they curate content. When I was in Boston, I found both high-crime and low-crime neighbourhoods, and they attracted different sorts of inhabitants. Likewise, there could soon be high-fake-news and low-fake-news social networks.

If the top media story of 2016 was the rise of misinformation online, the story of 2017 might be whether it triggers any mass migrations.