By John Allen
An American whistleblower who previously worked for the social media giant Facebook has alleged that the company's lack of controls over harmful content is fuelling ethnic violence in Ethiopia.
Frances Haugen, a former Facebook product manager who was part of a “civic integrity” team studying and proposing solutions to combat harmful practices on the site, gave evidence this week to the Subcommittee on Consumer Protection, Product Safety, and Data Security of the United States Senate.
In her evidence, which was broadcast live nationally and internationally, she said Facebook had failed to act on the information its own staff had provided because it would have threatened the company’s revenues.
“My fear,” she said in her opening statement, “is that without action, divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar [formerly Burma] and are now seeing in Ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it.”
Haugen’s appearance at the hearing followed coverage by the Wall Street Journal of internal Facebook documents showing that employees have been alarmed at how the site is being used in developing countries.
TechCrunch reports that “employees raised concerns, for example, about armed groups in Ethiopia using the platform to coordinate violent attacks against ethnic minorities.
“Since Facebook’s moderation practices are so dependent on artificial intelligence (AI), that means that its AI needs to be able to function in every language and dialect that its 2.9 billion monthly active users speak.”
But Frances Haugen told the Senate subcommittee that 87 percent of "integrity funding" – the money Facebook spends on combatting misinformation – goes to content in English, despite only nine percent of Facebook users being English speakers. In Ethiopia, "integrity systems" covered only two of the six languages in use there.
“It seems that Facebook invests more in users who make more money, even though the danger may not be evenly distributed,” she said.
Facebook founder, CEO and chairman, Mark Zuckerberg, hit back strongly at Haugen on his Facebook page. “We care deeply about issues like safety, well-being and mental health,” he said. “It’s difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted.
“Many of the claims don’t make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us?”