Working in small teams has its benefits in delivering results, but opening up to collaboration can also open up unforeseen opportunities. The same applies to data: It must be treated as a whole across an organisation to achieve maximum efficiency.
In other words: data unicity.
On Thursday, at an INMA members-only Webinar, Guus Bartholomé and Declan Owens, two experienced analytics professionals in the media industry, shared the story of how they helped implement data unicity within a large broadcasting company in The Netherlands. They discussed why this concept is important for media companies, how it can be made possible, and the opportunities it uncovers for reliable decision-making.
Data unicity: the concept
A company typically starts off by building a Web site, then expands with apps and mobile offerings, and perhaps moves on to voice-assisted technology, chatbots, advertising platforms, etc., Owens explained.
For each of these, a team is dedicated to that product or technology. Different brands within one company also often operate with separate teams and technologies.
“You try and bring things together, but it’s rare from my experience for it to all be unified and balanced out,” said Owens, a digital analytics expert at AT Internet, which serves customers in 32 countries throughout Europe, the United States, and Singapore with centralised data analytics. “So what we have are siloed teams.”
There are also siloed tools that teams may use together or separately. “You sort of have silos by default if you don’t set a strategy. This poses problems to unify the data and leverage different opportunities for insights.”
Why would an organisation want to unify its data? There are multiple touchpoints for each individual user, and it can be challenging to map the customer’s journey through them, he said: “The idea is to have the capacity to unify all the data in a certain place in storage.”
Achieving data unicity
Owens looked at how an organisation can unify its various data. One option is the historical SaaS model, which collects the data and stores it, and then the company uses that data.
“But when it’s in separate tools like this in a SaaS model, these tools have lots of advantages but they are still siloed, they are still separated.” The advantages are they are ready to use, standardised, and provide data democratisation. However, they provide no holistic view, operate in a black box that doesn’t allow the user to see how it’s working, and have limited features.
The second option is ETL (extract, transform, and load) data pipelines. These extract the data from different tools such as digital analytics, CRMs, and content meta data, transform or engineer it into a usable format, then load it into storage for use.
The pros of this are that it’s extremely flexible with a centralisation of power. The cons are lower data quality, data democratisation issues, and cost of ownership.
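The extract-transform-load flow described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s pipeline: the silo names, field names, and the in-memory “warehouse” are all assumptions made for the example.

```python
# Minimal ETL sketch: pull records from two hypothetical silos,
# normalise them into one schema keyed by user, and load the result
# into a single store (here just a dict standing in for a warehouse).

def extract():
    # Assumed sample exports from two separate tools (analytics, CRM).
    analytics = [{"uid": "u1", "page_views": 12}]
    crm = [{"customer_id": "u1", "plan": "premium"}]
    return analytics, crm

def transform(analytics, crm):
    # Engineer both sources into one unified record per user.
    unified = {}
    for row in analytics:
        unified.setdefault(row["uid"], {})["page_views"] = row["page_views"]
    for row in crm:
        unified.setdefault(row["customer_id"], {})["plan"] = row["plan"]
    return unified

def load(unified, store):
    # A real pipeline would write to a database or warehouse here.
    store.update(unified)

store = {}
analytics, crm = extract()
load(transform(analytics, crm), store)
print(store)  # {'u1': {'page_views': 12, 'plan': 'premium'}}
```

The transform step is where most of the cost and fragility mentioned above lives: every source tool needs its own mapping into the shared schema, which is the flexibility and the maintenance burden in one.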
The third solution is basically a mix between these two, Owens explained: an open SaaS model with AT Internet. “So you have the SaaS model where you still use AT Internet’s tools around behavioural data to collect, store it, and use it — but the thing is, it’s open.”
There will be other tools such as CRM on the outside, and that data can be brought into the open SaaS model. “It’s very easy to import.” Users can also extract the processed data into other outside tools to use for various purposes.
“It’s basically the best of both worlds,” Owens said. This solution offers:
- Centralisation of power.
- Data democratisation.
This makes it easier to track the user journey, and a central team in the middle manages the data and drives the strategy through the organisation. “So no matter what technologies you have, no matter what you do, you can embark everyone on this unified model,” Owens said.
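Once events from every touchpoint land in one place with a shared user ID, stitching a journey together becomes a simple sort by time. The sketch below assumes hypothetical touchpoints and event names purely for illustration:

```python
# Sketch: reconstructing one user's journey from events collected
# across touchpoints, assuming each silo tags events with a shared
# user ID ("uid") and a timestamp once the data is unified.

web   = [{"uid": "u1", "ts": 1, "event": "article_view"}]
app   = [{"uid": "u1", "ts": 3, "event": "push_open"}]
voice = [{"uid": "u1", "ts": 2, "event": "news_briefing"}]

def journey(uid, *sources):
    # Gather this user's events from every source, then order by time.
    events = [e for src in sources for e in src if e["uid"] == uid]
    return [e["event"] for e in sorted(events, key=lambda e: e["ts"])]

print(journey("u1", web, app, voice))
# ['article_view', 'news_briefing', 'push_open']
```

The hard part in practice is not the sort but the shared `uid`: without a unified identifier across silos, the journey cannot be joined at all.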
NPO’s organisation and experience
Guus Bartholomé, online researcher at Nederlandse Publieke Omroep (NPO) broadcasting service, discussed his organisation’s experience with data unicity. NPO has several television and radio brands, as well as digital products. These receive 12 million TV viewers, 5.8 million radio listeners, 5.4 million Web site and app visitors, and 1.8 million on-demand platform video viewers per week.
NPO does not create the content; this is done by its 11 member broadcasters. These are all separate companies, so the entire organisation is very de-centralised. With each of these broadcasters having its own Web sites and apps, the entire portfolio consists of more than 100 Web sites and apps. These are also individual brands with their own characteristics, target audiences, and analytical needs.
The challenge NPO faced, therefore, was the implementation of a new digital analytics system for the entire organisation, Bartholomé said. De-centralisation meant NPO had different technical teams, analytical teams, and editorial staffs.
There were three main reasons NPO needed data unicity in analytics:
- Participation in panel research: one central place to deliver data about panel behaviour and the people they are trying to reach.
- Accountability: justifying efforts to the audience, and staying close to public values.
- Benchmarking and evaluation: to compare and evaluate brands internally, and have a consistent measurement system across titles.
“This creates a bit of a conflict,” Bartholomé said. “We have all these different titles and individual broadcasters who have their own data needs.”
To solve this, NPO created a semi-centralised data model with strict rules on particular values, plus custom properties that give each individual brand the freedom to set its own analytics.
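A semi-centralised model like this can be sketched as a validation rule: a small set of strictly required, centrally defined fields, with everything else treated as a brand-specific custom property. The field names below are assumptions for the example, not NPO’s actual schema:

```python
# Sketch of a semi-centralised data model: a few centrally enforced
# fields every brand must send, plus free-form custom properties each
# brand defines for itself. Field names are hypothetical.

REQUIRED = {"brand", "content_id", "timestamp"}  # strict central rules

def validate(event):
    # Reject events missing any centrally required field.
    missing = REQUIRED - event.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    # Everything beyond the required fields is a brand-level custom property.
    return {k: v for k, v in event.items() if k not in REQUIRED}

event = {"brand": "radio_brand", "content_id": "ep42",
         "timestamp": 1700000000,
         "presenter": "host_a"}  # this brand's own custom property
print(validate(event))  # {'presenter': 'host_a'}
```

The strict fields keep benchmarking and panel reporting consistent across all titles, while the pass-through custom properties preserve each brand’s analytical freedom.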
How to implement data unicity in one’s own organisation
Bartholomé shared some of the key learnings NPO acquired in its data unicity journey that could help other organisations wanting to do the same:
- Identify the internal and external stakeholders of digital analytics.
- Propose needs-intake sessions with the identified stakeholders.
- Anticipate reluctant stakeholders’ needs.
- Design a unified data model based on the collected needs.
- Identify what equipment is used for digital analytics.
- Maintain your unified model.
“Is it worth taking this challenge?” Bartholomé asked. “I think that it’s a really good investment to have data unicity early in your plans. You can really reap the rewards of it at a later stage.”
He shared the key long-term advantages of doing so:
- Data governance: a centralised data model with feature distribution control and data distribution control.
- Data democratisation: instant set-up, self-serve exploration, and out-of-the-box reports.
- Data quality: standardised measurements, transparent processing rules, and monitoring equipment.
“All in all, it leads to more efficient analytics which saves time and brings the company rewards, in my opinion,” Bartholomé said.
Advice and takeaways
Having a data unicity model requires some planning and implementation time, Bartholomé said: “Design is crucial. The way you design your model is really important.”
Maintenance is also key. But Bartholomé said setting up a proper data unicity model pays for itself and makes the teams’ lives easier in the long run.
Image Courtesy of Gerd Altmann from Pixabay.