Last month I was lucky enough to attend (and present at) the INMA European News Media Conference in Amsterdam, where cracking the subscription model nut was very much a trending topic. So was the fact that the personalisation era is well and truly here.
One of my fellow presenters was Isabelle Kovacsovics, project manager for the chief technology and data office at Ringier, a Swiss media company. The technological innovation taking place there is impressive — not least because it demonstrates senior leadership buy-in for investment.
The company has developed its own platform connecting all its content and marketplace businesses, which importantly gives it control over its data. The ambition is to create an ecosystem to rival Alphabet, Facebook, and Comcast. Ringier uses artificial intelligence to track, segment, and cluster readers based on interests, making it easier to market to them and easier to serve up content that is likely to land.
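Ringier has not published the details of its segmentation pipeline, but the general idea of clustering readers by interest can be illustrated with a toy example: represent each reader as a vector of topic affinities and group similar vectors together. The sketch below uses a minimal k-means implementation and entirely made-up data; it stands in for whatever Ringier actually runs.

```python
import numpy as np

# Hypothetical illustration only: each reader is a vector of topic
# affinities (shares of clicks on politics, sport, finance). A simple
# k-means groups readers with similar reading habits into segments.

def kmeans(X, k, iters=100, seed=0):
    """Cluster the rows of X into k groups; return labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each reader to the nearest centroid.
        dists = np.linalg.norm(X[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned readers.
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Toy data: columns are [politics, sport, finance] click shares.
readers = np.array([
    [0.9, 0.1, 0.0],  # politics-heavy reader
    [0.8, 0.2, 0.0],
    [0.1, 0.9, 0.0],  # sport-heavy reader
    [0.0, 0.8, 0.2],
])
labels, _ = kmeans(readers, k=2)
```

Once readers carry a segment label like this, marketing and content recommendations can be targeted per segment rather than per individual, which is one common way such systems keep personalisation tractable.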
However, I couldn’t help feeling a little uneasy at the thought that the next logical step in this approach is editorial being produced based solely on the analytics, rather than the news agenda. And, as Facebook knows only too well, personalising the news experience through algorithms provokes criticism as well as accusations of undermining democracy during elections.
In the United Kingdom, The Times is taking a hybrid approach. The newspaper announced in March that it is developing its own self-learning algorithm, currently referred to as “James”: a digital butler that will distribute content to readers based on their past preferences. The first step is a personalised e-mail sent to subscribers highlighting content specific to them. In time, the plan is to extend this to push notifications and texts sent at the times of day deemed most suitable.
According to a Digiday article from March 15, personalisation will only impact content distribution, not the experience on the main site, which will still be updated three times a day. Subscribers receiving personalised e-mails who then visit the main site will see what everyone else sees to avoid the “echo chamber” effect.
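The Times has not said how “James” actually scores content, but a preference-driven e-mail digest of the kind described could work roughly like this: build a profile from a reader’s click history, then rank candidate articles by how well their topics match it. All names and data below are hypothetical, purely to make the mechanism concrete.

```python
from collections import Counter

def build_profile(clicked_topics):
    """Reader profile: normalised topic frequencies from past clicks."""
    counts = Counter(clicked_topics)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

def rank_for_digest(articles, profile, n=3):
    """Pick the n candidate articles whose topics best match the profile."""
    scored = sorted(articles,
                    key=lambda a: profile.get(a["topic"], 0.0),
                    reverse=True)
    return scored[:n]

# A reader who mostly clicks politics, with some business and arts.
history = ["politics", "politics", "business", "politics", "arts"]
profile = build_profile(history)

articles = [
    {"title": "Budget analysis", "topic": "business"},
    {"title": "Election briefing", "topic": "politics"},
    {"title": "Match report", "topic": "sport"},
    {"title": "Gallery opening", "topic": "arts"},
]
digest = rank_for_digest(articles, profile, n=2)
```

Note how this matches the Digiday description of the approach: the scoring affects only what lands in the e-mail, while the main site remains identical for everyone.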
I’ll be interested to see how this plays out and whether traffic to the Times’ home page suffers as a result of a personalised distribution strategy. We know from our own work at The Economist Group that home pages are often redundant; traffic from social media most often arrives at a specific piece of content, not the front door.
Likewise, when traffic comes via search, the journey is a direct one. For the audience to continue that journey, there has to be a compelling call to action at the end of the specific piece of content they sought out.
Facebook’s algorithm is problematic, but as an interesting 2016 Wired article pointed out, audiences have to shoulder some of the echo chamber blame. We like and share things from like-minded friends, plus people have different motivations when they are on social media. Yes, many use it as a primary news source, but it’s also a place where self-image is nurtured. We share things not just to spread information but to portray ourselves in a certain way.
It’s not a new observation, but the key responsibility of the social media giants has to be the policing of fake news. Audiences surrounding themselves with only the things they want to hear is slightly more palatable if the information is correct.
For publishers, reconsidering revenue strategies through the lens of enhanced technological possibilities is a no-brainer. But if we wish to remain champions of free speech (as any worthwhile news organisation should be), we need to continue surfacing stories that, if measured by analytics alone, our audiences supposedly do not want to consume.