Ringier measures female representation in media with EqualVoice

By Sarah Schmidt


Brooklyn, New York, United States


At a very basic level, women are still dramatically underrepresented in media: 82% of news media reports are about men, according to one estimate, and only 4% of articles about sports include female athletes even though they make up 40% of all athletes worldwide.

As part of an effort to encourage more equitable representation, Swiss publisher Ringier has developed a tool to measure gender balance in its coverage. 

“You measure what you treasure,” Ringier CEO Annabella Bassler told attendees of INMA’s World Congress of News Media last week.

EqualVoice is Ringier’s in-house algorithm that measures the proportion of women represented in its articles. The company began using it on several online publications in 2019, including Blick, Beobachter, Bilanz, Cash, Handelszeitung, GaultMillau and Schweizer Illustrierte, and last year added several print publications. Ringier is now also offering EqualVoice as a service to other media companies. 

EqualVoice operates on the very simple principle of measuring the representation of women and men. It generates a “Body Score” based on the number of women’s first names in articles and a “Teaser Score” based on the visibility of women in images, headlines, and titles. These scores then appear in newsroom dashboards. The tool found a wide spread of proportions across its titles, according to a report on Ringier’s website.
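Ringier has not published EqualVoice’s implementation, but the Body Score described above — the share of recognized first names in an article that are female — can be sketched in a few lines. Everything here is illustrative: the tiny name lists stand in for whatever gendered-name database the real tool uses, and `body_score` is a hypothetical function name.

```python
# Illustrative sketch only: the real EqualVoice algorithm is not public.
# Tiny hypothetical name lists stand in for a real gendered-name database.
FEMALE_NAMES = {"anna", "maria", "laura"}   # assumed sample data
MALE_NAMES = {"peter", "hans", "marco"}     # assumed sample data

def body_score(text: str) -> float:
    """Share of recognized first names in the text that are female."""
    # Strip punctuation and lowercase each word before lookup.
    words = [w.strip(".,;:!?\"'").lower() for w in text.split()]
    female = sum(w in FEMALE_NAMES for w in words)
    male = sum(w in MALE_NAMES for w in words)
    total = female + male
    return female / total if total else 0.0

print(body_score("Anna met Peter and Hans to discuss the report."))
```

In this example one of the three recognized names is female, so the score is one third. A production version would need a far larger, locale-aware name list and handling for ambiguous names, which is part of why the tool is framed as a conversation starter rather than a precise measure.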

EqualVoice is meant to be a very simple metric that keeps the discussion of gender equity open. It’s not meant to look at subtleties like how people of different genders are represented, and it is only able to look at traditional male and female representations. It also doesn’t address representation of different races or abilities.

The idea is to get the discussion started so newsroom staff can keep equity top of mind at all times, and the initiative has successfully generated deeper, active discussions in the newsrooms where it’s used, Bassler said.

“We wanted to keep it as simple as possible without putting too much data on the table and just create a discussion,” she said.

Bias is deeply embedded in the existing body of journalism on the internet, Bassler pointed out, and that bias is amplified when generative AI uses it as training material.

She was struck by this during a recent family event when several relatives tried using ChatGPT to suggest gifts for children. Suggestions for 9-year-old boys included gaming consoles, smartphones, and cameras, while suggestions for 9-year-old girls included pencil cases, dollhouses, and cookbooks. 

The good news, though, is that whatever journalists create moving forward will also have a voice in AI, and more balanced coverage will have an immediate impact.

“Whatever we are creating, it will have a voice in AI right away,” Bassler said. “By the time you look, it’s already changed.”