Critical thinking is essential amid the proliferation of data and algorithms
Content Strategies Blog | 15 January 2026
We talk endlessly about data: Audience data. Live data. Content data. Structural data. Data created by humans, collected by machines, analysed by algorithms, served back to us in endless loops.
But here’s what happened to me over the holiday season.
The algorithm’s repetition problem
I subscribed to a streaming service and noticed something familiar: a list of movies that all looked remarkably similar. I watched a couple. The algorithm dutifully noted my preference and kept feeding me variations on the same theme. More of the same. Different title, same formula.
After a few days of browsing, I felt something I didn’t expect: fatigue. Boredom. The algorithm had optimised itself into a corner — it kept showing me what I’d already proven I’d watch, but nothing surprised me anymore. No new data flowed back to it, so it did what algorithms do: It repeated.
From a technical perspective, this makes perfect sense. The system is working as designed. But it made me think about something else entirely.
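To see why the system behaves this way, here is a minimal sketch of an exploitation-only recommender. All titles, genres, and the `recommend` function are hypothetical, invented for illustration; the point is simply that a system which only serves the most-watched genre locks into a loop after a couple of views.

```python
from collections import Counter

# Hypothetical catalogue: genre -> titles (invented for illustration).
CATALOGUE = {
    "heist thriller": ["Heist A", "Heist B"],
    "period drama": ["Drama A"],
    "documentary": ["Doc A"],
}

def recommend(watch_history):
    """Serve a title from the most-watched genre; no exploration at all."""
    if not watch_history:
        return CATALOGUE["documentary"][0]  # arbitrary cold-start pick
    top_genre = Counter(watch_history).most_common(1)[0][0]
    return CATALOGUE[top_genre][0]

# Two early views are enough to lock the loop.
history = ["heist thriller", "heist thriller"]
print(recommend(history))         # Heist A
history.append("heist thriller")  # we watch what we're served...
print(recommend(history))         # Heist A again: no new signal, no change
```

Because the viewer's choices are drawn from what the system serves, the history never contains a surprise, and the recommendations converge on a single theme.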
What gets lost in the loop?
When you sit down with friends to decide what movie to watch, something different happens. Someone suggests something wildly outside the consensus. Someone else pushes back. You debate, laugh, reconsider. Someone says, “Wait, what about that weird film from last year?”
And suddenly you’re exploring territory the algorithm never would have suggested. Humans bring something algorithms struggle with: serendipity, context, values, and the willingness to be wrong.
The question isn’t whether data is useful. It is. The question is: To what extent should data replace human judgment?
The new exhaustion
At year’s end, we get new words and expressions reflecting the zeitgeist. Words born from trends, from the events and dynamics of our time.
And lately, I keep seeing one: scroll fatigue.
It describes exactly what I felt: the exhaustion that comes from scrolling through endless, mostly meaningless AI-generated content. Content that’s designed to engage but often leaves you feeling emptier for having consumed it. It might make you laugh, but when you realise it’s not actually real, the humour evaporates.
The irony is sharp: We’ve built systems to optimise engagement, yet people are increasingly fatigued by what those optimisations produce.

The publishing parallel
In publishing, we’re having the same conversation. We collect audience engagement data. We analyse content performance. We combine these datasets to predict what readers will want to read next — and recommend it to them.
This is powerful. It can surface genuinely relevant articles. It can help newsrooms understand what stories matter to their audience. But here’s where critical thinking becomes essential: Are we using this data to inform human decisions, or to automate them away?
A different approach: humans in the loop
The key learning isn’t to reject data. It’s to be deliberate about how we use it.
What if the role of data isn’t to decide but to suggest? What if recommendation algorithms surface patterns, but humans — editors, journalists, curators — remain the decision-makers? Human-approved picks can absolutely sit alongside algorithmic recommendations.
In fact, that hybrid approach might be where the real value lies. The machine brings pattern recognition and scale. The human brings judgment, editorial integrity, and the willingness to pursue stories that matter even if the data says they won’t perform.
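The hybrid approach can be sketched in a few lines. Everything here is illustrative and the function names, stories, and scores are assumptions: the algorithm ranks candidates at scale, and an editor makes the final call — including adding the story the data says won’t perform.

```python
def algorithmic_candidates(scores, k=3):
    """Machine side: top-k stories by predicted engagement (pattern + scale)."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

def editorial_pick(candidates, approve, wildcard=None):
    """Human side: filter the machine's list, optionally add an editor's wildcard."""
    approved = [story for story in candidates if approve(story)]
    if wildcard:
        approved.append(wildcard)  # the story that matters despite the data
    return approved

# Hypothetical predicted-engagement scores.
scores = {"celebrity recap": 0.92, "viral listicle": 0.88,
          "local election explainer": 0.55, "court ruling analysis": 0.41}

candidates = algorithmic_candidates(scores)
final = editorial_pick(candidates,
                       approve=lambda s: "listicle" not in s,
                       wildcard="court ruling analysis")
print(final)  # the low-scoring court story makes the cut; the listicle doesn't
```

The machine never sees the front page; it only proposes. The editor’s `approve` and `wildcard` choices are where judgment and editorial integrity enter the loop.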
Continuing to question
My advice to journalists, editors, and product leaders is simple: Keep questioning technology and how it is used, just as you question your sources. Don’t let the existence of data become an excuse to stop thinking critically. Don’t let metrics become a substitute for editorial judgment.
The streaming algorithm that bored me was working perfectly. The problem wasn’t the technology; it was the assumption that optimisation alone is enough.
In publishing, in journalism — in any human endeavour that requires creativity and truth-seeking — we need something the algorithm can’t provide: the courage to pursue what matters, even when the data suggests otherwise.
Let’s use technology and humans in the loop to add real value to our subscribers.
Art: Adobe Stock Alf.
