Why audience engagement must be appropriately defined

By Greg Bright

Albuquerque Journal

Albuquerque, New Mexico, USA


Last month I spent time talking about the difficulty in finding an acceptable way to measure online audience and engagement levels. The lack of an agreed-to standard is creating an ongoing game of cat-and-mouse between those attempting to define a measurement “standard” and those trying to show they are reaching an audience and have good (or great) engagement levels.

Will they share the evidence that a standard set by the measurement folks is “the right number”? Are these declared measurements accounting for the difference between how a news site is used and how a site driving a shopping purchase is used?

What we need is a common denominator in the measurements for a local news Web site. It is not always about quantity! If you are in the fictitious Anytown, USA, and have 50,000 clicks/pageviews/sessions (or whatever the measurement is) a month, you may be as good as you can possibly get.

Let us figure out a way to show that. Is it something in a reach or penetration number within the core business area — think Nielsen designated market areas (DMA) or core-based statistical areas (CBSA) as a denominator in the math — or is it something else?

This back-and-forth about what to track is a waste of time. As long as the measurement-defining folk are changing or adjusting the standard, the site owners will figure out a way to leverage the standard to their advantage.

First it was pageviews. Then page designers began loading in bits and pieces to generate page counts. So the game continues. Let’s work together.

It is possible to know just how many people actually read a story.

It is possible to know how long it took to read a story, and how far into the story the reader made it before exiting.

We can tell what time they read it and even where on the planet (or space station) they were when they read it.

We can even tell the referral site or method that led them to the story, and what they read (or where they went) afterward.

With a bit of work, we can even learn readers’ ages, incomes, number of children, voting habits, and so on.
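To make that concrete, here is a minimal sketch of what a single “story read” record could look like if all of those signals were captured. The StoryReadEvent class and its field names are invented for illustration; they are not the schema of Google Analytics, Adobe, or any other vendor.

```python
# Hypothetical record of a single story read, pulling together the signals
# described above. Field names are invented for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class StoryReadEvent:
    story_id: str             # which story was opened
    read_seconds: float       # how long the reader spent with it
    scroll_depth_pct: float   # how far into the story they got before exiting
    read_at_utc: str          # when they read it (ISO 8601 timestamp)
    geo: str                  # where on the planet (or space station) they were
    referrer: Optional[str]   # the site or method that led them to the story
    next_url: Optional[str]   # what they read, or where they went, afterward
    # With extra work, records like this can be joined to third-party
    # demographic data: age, income, number of children, voting habits, etc.

example = StoryReadEvent(
    story_id="reston-parkway-accident",
    read_seconds=94.0,
    scroll_depth_pct=80.0,
    read_at_utc="2017-06-01T14:22:07Z",
    geo="Albuquerque, NM",
    referrer="facebook.com",
    next_url="/news/local/",
)
```

The raw material clearly exists; the fight is over which of those fields becomes “the number.”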

It is a ton of information. But story engagement is different from a search for the latest from Santana or for who has the best 20x25x4 air filter.

This ability to “know” has led everyone down the thought path of pay-for-performance and performance-based metrics. Again, the problem isn’t finding a measurement; it is finding one that can’t be manipulated.

Google, Adobe, and similar companies are pushing their own versions of measurement. Which one is right? What are the criteria behind what is counted? (A blatant call for transparency by yours truly!) We need to solve this quickly, or no one will trust any of the numbers produced.

Analytics dashboards have a lot of information, but it is important to consider what information is actually relevant.

Above is an image of an engagement dashboard from Google Analytics. It has quite a bit of detail, but is it the right detail? It shows a few sets of summary stats, and some very granular detail on the actual stories actively being read.

So what?

This is where the real fun begins. Are you after generating pageviews, ad impressions, sessions, clicks, or engagement? Again, as an IT person, I can manipulate pages to generate pageviews. I can generate clicks as well, by shortening the story preview or the page length, or by rewriting headlines to bait clicks. I can generate ad impressions by inserting ad positions within the story, to the right of it, or at the top or bottom of it.
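As a toy illustration of why that matters, here is a quick sketch with made-up numbers and a hypothetical helper function showing how paginating the same story multiplies pageviews and ad impressions without adding a single reader.

```python
# Toy illustration with made-up numbers: splitting the same story across more
# pages multiplies pageviews (and ad impressions, with fixed ad slots per page)
# without adding a single reader or a second of real engagement.
def inflated_counts(readers: int, pages_per_story: int, ads_per_page: int) -> tuple[int, int]:
    pageviews = readers * pages_per_story
    ad_impressions = pageviews * ads_per_page
    return pageviews, ad_impressions

print(inflated_counts(10_000, pages_per_story=1, ads_per_page=3))  # (10000, 30000)
print(inflated_counts(10_000, pages_per_story=5, ads_per_page=3))  # (50000, 150000)
```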

Shouldn’t I be concentrating on engagement?

Sure. But is it engagement on my site or is it engagement with a particular article? Publishers have to accept that Facebook is a second homepage for our readers and that our content for many is part of a complex, diverse set of news feeds delivered to the Facebook universe via some sort of algorithm that only Facebook understands.

With a bypass of the home page and Facebook click-ins to our content or pure Instant Articles, what is the engagement measurement? Can I count some of my users’ time on Facebook as part of my engagement time?

Wait. Are we measuring time or quantity? Are the arbitrators of measurement versed enough in the complexities involved to understand (or care) how the number is defined?

Sorry. Lots of questions.

We need to get out in front with the answer. Time to quit letting the Internet companies, who have a vested interest in spinning numbers, control our metrics.

I will loop back to this thought: A general search must be measured differently than a story search. I search for that DeWalt drill and get results; DeWalt products are sold everywhere. It is a near commodity-level item and thus returns a gazillion results. Click and buy based on price.

A story is different. A search for a traffic accident on Reston Parkway will get a handful of results.

So, can we base the measure not on sheer numbers but on quantity within a geographic area for our local/metro publications? Again, can we go back to DMA or CBSA? Maybe a DMA broken out by postal code? Something that can be used to show reach in a way that is relevant and effective?

For example, 1.9 million unique visitors in a DMA of 3.2 million is a pretty good reach. The size of the Internet without these bumpers makes 1.9 million sound small, but 59.4% of the market is good.
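Spelled out, the math is just unique visitors over the market population used as the denominator. A rough sketch (the reach_pct helper is mine, not any agency’s formula):

```python
# Rough sketch of the reach math: unique visitors divided by the DMA (or CBSA)
# population used as the denominator.
def reach_pct(unique_visitors: int, market_population: int) -> float:
    return 100.0 * unique_visitors / market_population

print(round(reach_pct(1_900_000, 3_200_000), 1))  # 59.4% of the market
```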

Can we get the agencies on board as well? Enough ranting for the day …

