LONDON — Former Facebook data scientist-turned-whistleblower Frances Haugen on Monday told lawmakers in the United Kingdom working on legislation to rein in social media companies that the company is making online hate and extremism worse and outlined how it could improve online safety.
Haugen appeared before a parliamentary committee scrutinizing the British government’s draft legislation to crack down on harmful online content, and her comments could help lawmakers beef up the rules. She testified the same day that Facebook was set to release its latest earnings and that The Associated Press and other news organizations began publishing stories based on thousands of pages of internal company documents she obtained.
Haugen told U.K. lawmakers how Facebook’s Groups feature amplifies online hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. She said the company could add moderators to prevent groups from being used to spread extremist views.
“Unquestionably, it’s making hate worse,” she said.
Haugen added that she was “shocked to hear recently that Facebook wants to double down on the metaverse and that they’re gonna hire 10,000 engineers in Europe to work on the metaverse,” referring to the company’s plans for an immersive online world it believes will be the next big internet trend.
“I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers?’ It would be amazing,” she said.
It was her second appearance before lawmakers after she testified in the U.S. Senate earlier this month about the danger she says the company poses, from harming children to inciting political violence and fueling misinformation. Haugen cited internal research documents she secretly copied before leaving her job in Facebook’s civic integrity unit.
The documents, which Haugen provided to the U.S. Securities and Exchange Commission, allege Facebook prioritized profits over safety and hid its own research from investors and the public. Some stories based on the files have already been published, exposing internal turmoil after Facebook was blindsided by the Jan. 6 U.S. Capitol riot and how it dithered over curbing divisive content in India, and more is to come.
Facebook CEO Mark Zuckerberg has disputed Haugen’s portrayal of the company as one that puts profit over the well-being of its users or that pushes divisive content, saying a false picture is being painted. But he agrees on the need for updated internet regulations, saying lawmakers are best able to assess the tradeoffs.
Haugen has told U.S. lawmakers that she thinks a federal regulator is needed to oversee digital giants including Facebook, something that officials in Britain and the European Union are already working on.
The U.K. government’s online safety bill calls for setting up a regulator that would hold companies to account when it comes to removing harmful or illegal content from their platforms, such as terrorist material or child sex abuse images.
“This is quite a big moment,” Damian Collins, the lawmaker who chairs the committee, said ahead of the hearing. “This is a moment, sort of like Cambridge Analytica, but possibly bigger in that I think it provides a real window into the soul of these companies.”
Collins was referring to the 2018 debacle involving data-mining firm Cambridge Analytica, which gathered details on as many as 87 million Facebook users without their permission.