As lawmakers mull curbs on social media, Haugen weighing in
FILE – Facebook whistleblower Frances Haugen leaves after giving evidence to the joint committee for the Draft Online Safety Bill, as part of British government plans for social media regulation, at the Houses of Parliament, in London, Monday, Oct. 25, 2021. After Haugen revealed Facebook’s failings to curb online hate and extremism and protect young users from harmful content, U.S. lawmakers are putting forward proposals to curb social media giants by limiting their free-speech protections against legal liability. (AP Photo/Matt Dunham, File)
WASHINGTON (AP) — U.S. lawmakers are putting forward proposals to curb social media giants by limiting their free-speech protections against legal liability.
Their efforts come after a former Facebook product manager presented a case that the company’s systems amplify online hate and extremism and fail to protect young users from harmful content.
That whistleblower, Frances Haugen, is expected to weigh in on the lawmakers’ proposals at a House hearing on Wednesday. Her previous disclosures have spurred legislative and regulatory efforts around the world aimed at cracking down on Big Tech, and she made a series of appearances recently before European lawmakers and officials who are drawing up rules for social media companies.
Haugen, a data scientist who worked in Facebook’s civic integrity unit, buttressed her assertions with a massive trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
When she made her first public appearance this fall, laying out a far-reaching condemnation of the social network giant before a Senate Commerce subcommittee, she offered ideas for making Facebook’s platforms safer and prescriptions for action by Congress. She rejected the idea of breaking up the tech giant, as many lawmakers have called for, favoring instead targeted legislative remedies.
Most notably, they include new curbs on the long-standing legal protections for speech posted on social media platforms. Both Republican and Democratic lawmakers have called for stripping away some of the protections granted by a provision in a 25-year-old law — generally known as Section 230 — that shields internet companies from liability for what users post.
Facebook and other social media companies use computer algorithms to rank and recommend content, governing what shows up on users’ news feeds. Haugen’s idea is to remove the legal protections in cases where algorithm-driven recommendations favor massive user engagement over public safety.
That’s the thought behind the Justice Against Malicious Algorithms Act, which was introduced by senior House Democrats about a week after Haugen testified to the Senate panel in October. The bill would hold social media companies responsible by removing their protection under Section 230 for tailored recommendations to users that are deemed to cause harm. A platform would lose the immunity in cases where it “knowingly or recklessly” promoted harmful content.
A subcommittee of the House Energy and Commerce Committee is holding Wednesday’s hearing on the bill and other proposed legislation to curb abuses on social media platforms. The committee’s senior Democrats, including Chairman Frank Pallone of New Jersey, brought forward the bill targeting algorithms.
“The committee has seen mounting evidence that when social media companies are faced with the choice between making more money or protecting public health and safety, they will continue to choose money,” Pallone said recently. “The lack of transparency within these companies has serious repercussions for all Americans. The time for self-regulation is over. Congress must now come together in a bipartisan way to thoughtfully consider proposals that bring about real accountability.”
Some experts who support stricter regulation of social media say the legislation could have unintended consequences. They suggest it doesn’t spell out clearly enough which algorithmic behaviors would cost a platform its liability protection, making it hard to see how the bill would work in practice and fueling wide disagreement over what it would actually do.
Meta Platforms, the new name of Facebook’s parent company, has declined to comment on specific legislative proposals. The company says it has long advocated for updated regulations.
Meta CEO Mark Zuckerberg has suggested changes that would only give internet platforms legal protection if they can prove that their systems for identifying illegal content are up to snuff. That requirement, however, might be more difficult for smaller tech companies and startups to meet, leading critics to charge that it would ultimately favor Facebook.
Other social media companies have urged caution in any legislative changes to Section 230.