Facebook’s advertising strategy raises questions over ethical practices
Digital Strategies Blog | 15 May 2017
It’s always interesting — and usually alarming — when someone manages to peek through the heavily guarded curtains of Facebook’s advertising strategy. What it reveals is yet another question about where the world’s biggest social network draws the line between social responsibility and financial ambition.
While we obviously think of Facebook as an American company, user growth in the United States has slowed to a virtual crawl — up from 210 million to 234 million over the past two years. Compare that with the surging growth in its biggest market, Asia-Pacific, where user numbers have climbed from 471 million to 716 million over the same period.
And it was in Asia-Pacific that a real nugget of Facebook’s advertising strategy was recently uncovered by The Australian’s media editor, Darren Davidson.
He began his exclusive on the first page of News Corp’s national daily newspaper like this:
Facebook is using sophisticated algorithms to identify and exploit Australians as young as 14 by allowing advertisers to target them at their most vulnerable, including when they feel “worthless” and “insecure,” secret internal documents reveal.
A 23-page Facebook document seen by The Australian, marked “Confidential: Internal Only” and dated 2017, outlines how the social network can target “moments when young people need a confidence boost” in pinpoint detail.
By monitoring posts, pictures, interactions, and Internet activity in real-time, Facebook can work out when young people feel “stressed,” “defeated,” “overwhelmed,” “anxious,” “nervous,” “stupid,” “silly,” “useless,” and a “failure,” the document states.
Quite extraordinary.
The document was part of a Facebook presentation by two of the company’s Australian executives to one of Australia’s major banks.
At the core of the document were insights into how Facebook could slice and dice the user behaviour from its massive amount of data to help advertisers target potential customers. Australia might be a small fish in the scheme of things, but it is a microcosm of how Facebook and Google are hoovering up ad dollars from media companies.
In Australia and New Zealand, the document said Facebook had a database of 1.9 million high school students with an average age of 16; 1.5 million college students with an average age of 21; and 3 million young workers with an average age of 26.
The next step: the document claimed Facebook had gathered enough data through its algorithms to detect mood shifts among those groups of people.
“Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend,” the document discloses. “Monday-Thursday is about building confidence; the weekend is for broadcasting achievements.”
Davidson’s story prompted the Australian government’s children’s e-safety commissioner to seek assurances from Facebook that it acknowledged the practice was wrong. “Facebook advised the research did not follow their established process; they have accepted responsibility for this oversight and are conducting a review,” the commissioner said.
Facebook blamed its Australian operation for not following company processes in the highly detailed presentation to the major bank.
“Facebook only permits research following a rigorous procedure of review whenever sensitive data, particularly data involving young people or their emotional behaviour, is involved,” the company said. “This research does not appear to have followed this process.”
But Davidson isn’t convinced by Facebook’s defense of blaming the mistake on a rogue operator. “It’s not like they have people in other functions. All they really do is sell ads. They don’t produce any other content. They take content from newspapers, magazines, TV networks. They get it for free, they aggregate it, and then they aggressively sell ads against those eyeballs. They know very well what they’re doing.”
This month Facebook announced a record Q1 revenue of US$8 billion — up 50% over last year.