Whistleblower Frances Haugen tells U.K. committee Facebook makes online hate worse

In this Oct. 5, 2021 file photo, former Facebook employee Frances Haugen speaks during a hearing on Capitol Hill, in Washington. (AP Photo/Alex Brandon, file)

Amid fallout from the Facebook Papers, documents supporting claims that the social network has valued financial success over user safety, Facebook on Monday reported higher profit for the latest quarter.

The company’s latest show of financial strength followed an avalanche of reports on the Facebook Papers — a vast trove of redacted internal documents obtained by a consortium of news organizations, including The Associated Press — as well as Facebook whistleblower Frances Haugen’s Monday testimony to British lawmakers.

Facebook said its net income grew 17% in the July-September period to $9.19 billion, buoyed by strong advertising revenue. That’s up from $7.85 billion a year earlier. Revenue grew 35% to $29.01 billion. The results exceeded analysts’ expectations.

The company’s shares rose 2.5% in after-hours trading after closing up 1% for the day.

“For now, the revenue picture for Facebook looks as good as can be expected,” said eMarketer analyst Debra Aho Williamson. But she predicted more revelations and described the findings so far as “unsettling and stomach-churning.”

CEO Mark Zuckerberg made only a brief mention of what he called the “recent debate around our company.” Largely repeating statements he made after Haugen’s Oct. 5 testimony before a U.S. Senate subcommittee, he insisted that he welcomes “good faith criticism” but considers the current storm a “coordinated effort” to paint a “false picture” of the company based on leaked documents.

“It makes a good soundbite to say that we don’t solve these impossible tradeoffs because we’re just focused on making money, but the reality is these questions are not primarily about our business, but about balancing difficult social values,” Zuckerberg said.

Haugen, meanwhile, told a British parliamentary committee Monday that the social media giant stokes online hate and extremism, fails to protect children from harmful content and lacks any incentive to fix the problems, lending momentum to European governments working on stricter regulation of tech companies.

While her testimony echoed much of what she told the U.S. Senate this month, her in-person appearance drew intense interest from a British parliamentary committee that is much further along in drawing up legislation to rein in the power of social media companies.

Haugen told the committee of United Kingdom lawmakers that Facebook Groups amplify online hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. The former Facebook data scientist said the company could add moderators to prevent groups over a certain size from being used to spread extremist views.

“Unquestionably, it’s making hate worse,” she said.

Haugen said she was “shocked” to hear that Facebook wants to double down on what Zuckerberg calls “the metaverse,” the company’s plan for an immersive online world it believes will be the next big internet trend.

“They’re gonna hire 10,000 engineers in Europe to work on the metaverse,” Haugen said. “I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers?’”

Facebook says it wants regulation for tech companies and is glad the U.K. is leading the way.

“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own,” Facebook said Monday.

It pointed to having invested $13 billion (9.4 billion pounds) in safety and security since 2016 and asserted that it has “almost halved” the amount of hate speech over the last three quarters.

Haugen accused Facebook-owned Instagram of failing to keep children under 13 — the minimum user age — from opening accounts, saying it wasn’t doing enough to protect kids from content that, for example, makes them feel bad about their bodies.

“Facebook’s own research describes it as an addict’s narrative. Kids say, ‘This makes me unhappy, I feel like I don’t have the ability to control my usage of it, and I feel like if I left, I’d be ostracized,’” she said.

The company last month delayed plans for a kids’ version of Instagram, geared toward those under 13, in order to address concerns about the vulnerability of younger users.

Pressed on whether she believes Facebook is fundamentally evil, Haugen demurred and said, “I can’t see into the hearts of men.” Facebook is not evil, but negligent, she suggested.

It was Haugen’s second appearance before lawmakers after she testified in the U.S. about the danger she says the company poses, from harming children to inciting political violence and fueling misinformation. Haugen cited internal research documents she secretly copied before leaving her job in Facebook’s civic integrity unit.

The documents, which Haugen provided to the U.S. Securities and Exchange Commission, allege Facebook prioritized profits over safety and hid its own research from investors and the public. Some stories based on the files have already been published, exposing internal turmoil after Facebook was blindsided by the Jan. 6 U.S. Capitol riot and how it dithered over curbing divisive content in India. More is to come.

Representatives from Facebook and other social media companies plan to speak to the British committee Thursday.

Haugen is scheduled to meet next month with European Union officials in Brussels, where the bloc’s executive commission is updating its digital rulebook to better protect internet users by holding online companies more responsible for illegal or dangerous content.

Under the U.K. rules, expected to take effect next year, Silicon Valley giants face an ultimate penalty of up to 10% of their global revenue for any violations. The EU is proposing a similar penalty.