Since watching the documentary The Social Dilemma and investigating #SurveillanceCapitalism, I have been thinking a lot about our industry and the threshold we find ourselves at when it comes to data as a business.
We are sitting in the middle of some of the most exciting times in the history of the business of media. During the hard lockdown, it was obvious to us just how many new tech tools were at our disposal. It was like being in a candy store.
In the property industry in South Africa, for instance, 90% of our advertisers had never heard of TikTok before COVID-19. Now we have clients wanting TikTok to be part of their media package.
We are also certainly seeing rapid changes in our followers’ behaviour. This is important for us to know so we can ensure we are delivering content and commerce when and how they need it. With data on our users easily available, we can see who is accessing our news, at what time of the day, where they reside, and how online behaviour has adapted.
As we drill down further — and as a cookie-less future becomes a reality — most business executives are working hard to ensure they are at the forefront of the business of gathering data. Everyone knows data is the new gold and that data analysts are the new must-have hires in a newsroom. Our quest for data on our users has become a necessary, exciting, and important 24/7 business tool.
But I believe that how we gather this data, and what we do with it, needs a new code of ethics and global industry standards set over and above country laws, so that we don’t fall into the trap of regret that some of our tech peers have.
I understand things are moving fast, and that even a moment of pause can put you behind your competitors. However, history has shown us what happens when an industry doesn’t sit down together to put checks and balances in place as it races toward the financial security needed to keep the business afloat.
I have a peer working in Artificial Intelligence (AI). They are trying to bring empathy and emotion into AI, and they were using social media to train it. According to my peer: “The AI kept coming back with racist and misogynist responses — it was what it was learning in there. We simply could not use the tool to teach empathy and love.”
No one starts out trying to manipulate and persuade society for the greater bad. But just in case that happens, shouldn’t we be taking the time to reflect and bring consciousness into the moment?
As we find ourselves as pioneers at the precipice of our own new dawn, perhaps creating some guiding principles may be just what we need to ensure we can look back on this time and be proud of what we have created.