Facebook whistleblower Frances Haugen testifies before the Senate – TechCrunch

After revealing her identity Sunday night, Frances Haugen, the whistleblower who leaked controversial Facebook documents to The Wall Street Journal, testified before the Senate Committee on Commerce, Science, and Transportation on Tuesday.

Haugen’s testimony came after a hearing last week in which Facebook’s global head of safety, Antigone Davis, was asked about the company’s negative impact on children and teens. Davis stuck to the Facebook script, frustrating senators by not answering questions directly. But Haugen, a former product manager on Facebook’s civic misinformation team, was, unsurprisingly, more forthcoming.

Haugen is an algorithm specialist, having served as a product manager at companies including Google, Pinterest and Yelp. While at Facebook, she worked on issues related to democracy, misinformation and counterintelligence.

“Having worked on four different types of social media, I understand how complex and nuanced these issues are,” Haugen said in her opening statement. “However, the choices that are being made inside Facebook are dire – for our children, for our public safety, for our privacy and for our democracy – and that is why we must demand that Facebook make changes.”


Throughout the hearing, Haugen made it clear that she believes Facebook’s current algorithm, which rewards posts that generate “meaningful social interactions” (MSI), is dangerous. Deployed in 2018, this news feed algorithm prioritizes interactions (such as comments and likes) from the people Facebook thinks you are closest to, such as friends and family.

But as documents leaked by Haugen show, Facebook’s own data scientists raised concerns that the system produced “unhealthy side effects on large slices of public content, such as politics and news.”

Facebook also uses engagement-based ranking, in which an AI surfaces the content it predicts will be most interesting to each individual user. Because content that elicits stronger reactions from users is prioritized, this amplifies misinformation, toxicity and violent content. Haugen said she believes a chronological feed would help mitigate these negative impacts.
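The contrast Haugen draws, between engagement-based ranking and a chronological feed, can be sketched in a few lines. This is a hypothetical illustration only, not Facebook’s actual system; the post fields and scores below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int              # seconds since epoch
    predicted_reactions: float  # a model's guess at likes/comments/shares

def engagement_ranked(posts):
    # Engagement-based ranking: posts expected to provoke the most
    # reactions rise to the top, regardless of when they were published.
    return sorted(posts, key=lambda p: p.predicted_reactions, reverse=True)

def chronological(posts):
    # Chronological feed: newest first, ignoring predicted engagement.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("a", timestamp=100, predicted_reactions=2.0),
    Post("b", timestamp=200, predicted_reactions=9.5),  # provocative post
    Post("c", timestamp=300, predicted_reactions=1.0),  # most recent post
]

# Engagement ranking surfaces the provocative post first;
# a chronological feed simply shows the most recent one.
```

The point of the toy example is that the two feeds differ only in the sort key, yet the engagement-ranked feed systematically promotes whatever the model predicts will provoke the strongest reaction.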

“I’ve spent most of my career working on systems like engagement-based ranking. When I come to you and tell you these things, I am essentially condemning 10 years of my own work,” Haugen said at the hearing.

Committee senators listen to former Facebook employee and whistleblower Frances Haugen (C) testify before a Senate Commerce, Science and Transportation Committee hearing on Capitol Hill, October 5, 2021, in Washington, DC. (Photo by DREW ANGERER / POOL / AFP via Getty Images)

As Haugen said on “60 Minutes” Sunday night, she was part of a civic integrity team that Facebook dissolved after the 2020 election. Facebook had put safeguards in place to reduce misinformation ahead of the 2020 US presidential election; after the election, it deactivated them. But after the January 6 attack on the U.S. Capitol, Facebook turned them back on.

“Facebook changed those security defaults in the run-up to the election because they knew they were dangerous. Because they wanted that growth to come back after the election, they went back to their original defaults,” Haugen said. “I think this is deeply problematic.”

Haugen said Facebook is presenting a false choice – that it can either keep using its volatile algorithms and continue its rapid growth, or prioritize user safety and decline. But she believes that adopting more safeguards, such as oversight by academics, researchers and government agencies, could actually help Facebook’s bottom line.

“The thing I’m asking for is a move [away from] short-termism, which is how Facebook is run today. It is led by metrics and not by people,” Haugen said. “With proper oversight and some of these constraints, it’s possible that Facebook could actually be a much more profitable company five or ten years from now, because it wasn’t as toxic and fewer people left it.”

Establish government oversight

When asked, as a thought experiment, what she would do if she were in CEO Mark Zuckerberg’s shoes, Haugen said she would establish policies for sharing information with oversight bodies, including Congress; she would work with academics to make sure they have the information they need to conduct research on the platform; and she would immediately implement the “soft interventions” that were identified to protect the integrity of the 2020 election. She suggested requiring users to click through to a link before sharing it, since companies like Twitter have found that this intervention reduces misinformation.

Haugen also said she believes Facebook, as currently structured, cannot stop the spread of vaccine misinformation, because the company relies too heavily on AI systems that Facebook itself says will likely never catch more than 10 to 20% of the offending content.

Haugen later told the committee that she “strongly encourages” reforming Section 230, the part of the United States Communications Decency Act that shields social media platforms from liability for what their users post. Haugen believes Section 230 should be amended to exclude decisions about algorithms from that shield, which would allow companies to face legal consequences if their algorithms cause harm.

“User-generated content is something that companies have less control over. But they have 100% control over their algorithms,” Haugen said. “Facebook shouldn’t get a free pass on the choices it makes to prioritize growth, virality and responsiveness over public safety.”

Senator John Hickenlooper (D-CO) asked how Facebook’s bottom line would be affected if the algorithm promoted safety. Haugen said it would take a hit, because when users see more engaging content (even if it’s enraging rather than engaging), they spend more time on the platform, which brings Facebook more ad dollars. But she believes the platform would still be profitable if it took the steps she outlined to improve user safety.

International security

As reported in one of the Wall Street Journal’s Facebook Files articles, Facebook employees flagged instances of the platform being used for violent crime overseas, but the company’s response was inadequate, according to documents disclosed by Haugen.

Employees raised concerns, for example, about armed groups in Ethiopia using the platform to coordinate violent attacks against ethnic minorities. Because Facebook’s moderation practices depend so heavily on artificial intelligence, its AI must be able to function in every language and dialect spoken by its 2.9 billion monthly active users. According to the WSJ, Facebook’s AI systems do not cover the majority of languages spoken on the site. Haugen said that although only 9% of Facebook’s users are English speakers, 87% of the platform’s misinformation spending goes toward English speakers.

“It looks like Facebook is investing more in the users who make the most money, even though the danger may not be evenly distributed based on profitability,” Haugen said. She added that she believes Facebook’s chronic understaffing of its counterintelligence, information operations and counterterrorism teams is a threat to national security, one she is communicating to other parts of Congress.

The future of Facebook

Members of the Senate committee have indicated they are motivated to take action against Facebook, which is also in the midst of an antitrust lawsuit.

“I’m actually against breaking up Facebook,” Haugen said. “If you split Facebook and Instagram apart, it’s likely that most of the advertising dollars will go to Instagram, and Facebook will continue to be this Frankenstein that is endangering lives around the world, only now there won’t be money to fund it.”

But critics argue that yesterday’s six-hour Facebook outage – unrelated to today’s hearing – showed the downside of a single company having so much control, especially when platforms like WhatsApp are such an integral part of communication abroad.

In the meantime, lawmakers are drafting legislation to make social media platforms safer for minors. Last week, Senator Ed Markey (D-MA) announced that he would reintroduce legislation with Senator Richard Blumenthal (D-CT) called the Kids Internet Design and Safety (KIDS) Act, which seeks to create new protections for online users under the age of 16. Today, Senator John Thune (R-SD) reintroduced a bipartisan bill he first put forward with three other committee members in 2019, called the Filter Bubble Transparency Act. The legislation would increase transparency by giving users the option to view content that is not curated by an opaque algorithm.

Senator Blumenthal even suggested that Haugen return for another hearing about her concerns that Facebook is a threat to national security. Although Facebook’s leadership pushed back against Haugen during the hearing, lawmakers seemed moved by her testimony.