By Associated Press
LONDON — Facebook whistleblower Frances Haugen told British lawmakers Monday that the social media giant stokes online hate and extremism, fails to protect children from harmful content and lacks any incentive to fix the problems, providing momentum for efforts by European governments working on stricter regulation of tech companies.
While her testimony echoed much of what she told the U.S. Senate this month, her in-person appearance drew intense interest from a British parliamentary committee that is much further along in drawing up legislation to rein in the power of social media companies.
It comes the same day that Facebook released its latest earnings and that The Associated Press and other news organizations started publishing stories based on thousands of pages of internal company documents she obtained.
Haugen told the committee of United Kingdom lawmakers that Facebook Groups amplifies online hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. The former Facebook data scientist said the company could add moderators to prevent groups over a certain size from being used to spread extremist views.
“Unquestionably, it’s making hate worse,” she said.
Haugen said she was “shocked to hear recently that Facebook wants to double down on the metaverse and that they’re gonna hire 10,000 engineers in Europe to work on the metaverse,” referring to the company’s plans for an immersive online world it believes will be the next big internet trend.
“I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers?’” she said.
Facebook says it wants regulation for tech companies and was glad the U.K. was leading the way.
“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own,” Facebook said Monday.
It pointed to investing $13 billion (9.4 billion pounds) on safety and security since 2016 and asserted that it’s “almost halved” the amount of hate speech over the last three quarters.
Haugen accused Facebook-owned Instagram of failing to keep children under 13 — the minimum user age — from opening accounts, saying it wasn’t doing enough to protect kids from content that, for example, makes them feel bad about their bodies.
“Facebook’s own research describes it as an addict’s narrative. Kids say, ‘This makes me unhappy, I feel like I don’t have the ability to control my usage of it, and I feel like if I left, I’d be ostracized,’” she said.
The company last month delayed plans for a kids’ version of Instagram, geared toward those under 13, to address concerns about the vulnerability of younger users. Haugen said she worried it may not be possible to make Instagram safe for a 14-year-old and that “I sincerely doubt it’s possible to make it safe for a 10-year-old.”
She also said Facebook’s moderation systems are worse at catching content in languages other than English, a problem even in the U.K., given the country’s linguistic diversity.
“Those people are also living in the U.K. and being fed misinformation that is dangerous, that radicalizes people,” Haugen said. “And so language-based coverage is not just a good-for-individuals thing, it’s a national security issue.”
Pressed on whether she believes Facebook is fundamentally evil, Haugen demurred and said, “I can’t see into the hearts of men.” Facebook is not evil, but negligent, she suggested.