Social media verdicts could buoy online regulatory bills

Bills seek to sidestep third-party online speech protections

Sen. Marsha Blackburn, R-Tenn., attends a Senate Commerce hearing on April 15. (Tom Williams/CQ Roll Call)

Recent verdicts against social media companies in New Mexico and California were lauded as “monumental,” “powerful,” and “historic” by members of Congress after juries found platforms liable for harms caused by their design features, rather than the third-party content they hosted.

The verdicts show that courts may be open to holding social media platforms accountable based on design rather than speech, which could help legislators write challenge-proof bills amid a continued push to keep kids safe online.

Last month, a jury in New Mexico found that Meta, which owns Facebook and Instagram, violated the state’s Unfair Practices Act by deceiving users about the safety of its platforms. The jury ordered that Meta be held liable for $375 million in civil penalties.

One day later, a California jury found Meta and Google, which owns YouTube, liable for negligence in the design of their platforms that led to harm for a young user. Meta was ordered to pay $4.2 million and Google was ordered to pay $1.8 million.

While the New Mexico case rested on consumer protection law and the California verdict on personal injury claims, both succeeded because judges declined to dismiss them under Section 230 of the 1996 communications law, which says that online platforms are not the publishers of third-party speech posted on their sites and cannot be held liable for that speech or for their content moderation.

Speech vs. design

J.B. Branch, AI governance and technology policy counsel for consumer advocacy group Public Citizen, called the verdicts a “blueprint” for regulating companies that have “abused” Section 230 in the past.

“It provides a new legal theory, a new angle for legislators to consider, which is focusing on the product liability, focusing on these algorithms from a product lens and not so much a speech lens,” Branch said.

The New Mexico and California cases were able to succeed under their states’ existing consumer protection and personal injury laws, but Branch said that additional legislation is still needed.

“If the intention of existing law is to serve as a [deterrent], I don’t think people with a straight face could say that these large tech companies are deterred from putting things onto the market that are … disproportionately dangerous,” Branch said.

He argued that the dollar figures attached to the recent verdicts, while large, aren’t big enough to matter to companies making billions of dollars. He would like to see legislation with “more teeth” to figure into platforms’ cost analysis.

Congress already has options for bills that would take that path.

One such bill, known as the Kids Online Safety Act, sponsored by Sen. Marsha Blackburn, R-Tenn., would place a “duty of care” on social media companies to design their platforms to prevent certain harms to kids, including eating disorders, depression and anxiety, and patterns of compulsive use.

The U.S. Court of Appeals for the 9th Circuit recently issued a mixed ruling on a California law that required platforms to design age-appropriate systems. The ruling upheld requirements that platforms estimate users’ ages and set child users’ privacy settings to the highest level by default, but kept other portions on hold because they were deemed too vague.

Clay Calvert, a fellow with the conservative-leaning American Enterprise Institute, suggested that lawmakers trying to thread the constitutional needle and avoid triggering First Amendment protections might include a legislative history section to make a bill’s intention clear.

“They’ll try to make explicit ‘we are not focusing on the content that’s being delivered to minors. We are solely concerned with the way the platforms deliver it and the features that they offer, such as metrics like number of likes, number of reposts, algorithms, those types of things,’” Calvert said, though he noted that any such legislation would have to be narrowly tailored.

Legislators will also have to determine which technical features of a platform can be regulated.

A Senate aide, speaking on condition of anonymity, said that the cases show a lane focused on design features including infinite scroll or platforms’ choice to drive users toward “certain content that’s trying to get you to stay on the platform for longer, and longer and longer.”

Branch suggested that platforms could be required to set time limits for teens’ use.

The plaintiff in the California case cited infinite scroll, push notifications and algorithmic recommendations as causing addictive behavior.

In November, Sen. John Curtis, R-Utah, introduced a bill that would place a duty of care on platforms using recommendation algorithms and remove Section 230 protections if the platform did not exercise care in an algorithm’s design.

The House Energy and Commerce Committee recently forwarded a bill, dubbed the Kids Internet and Digital Safety Act, that would set new requirements for parental controls and require certain online platforms to put policies in place to address certain harms to kids online. It would also require platforms to provide parents an option to opt their child out of personalized recommendation systems.

Any regulation of social media algorithms will have to consider Moody v. NetChoice, a 2024 Supreme Court case in which the court remanded to the lower courts challenges to Florida and Texas laws regarding content moderation. The decision said that platforms are making “expressive choices” that are protected by the First Amendment.

Lessons for the AI age

Another bill takes the product liability approach to artificial intelligence, rather than social media.

The legislation, titled the “Aligning Incentives for Leadership, Excellence, and Advancement in Development” or AI LEAD Act, would classify AI systems as products and allow users or the government to bring lawsuits for property damage, mental anguish, death, illness and financial injury.

The Senate aide highlighted the bill’s enforcement mechanism, which allows for suits by the federal Justice Department, state attorneys general, or individuals who have been harmed. While the New Mexico case was brought by the state’s attorney general, the California case was on behalf of an individual.

“The best way to enforce this is empowering those who have been directly harmed to bring their own claims,” the aide said, in part because individuals wouldn’t have to convince the government their case was worth it.

AI is also prompting questions over whether the systems could be protected by Section 230. Courts have not yet weighed in on the issue.

Section 230’s protections for social media are also being called into question.

In December, Sen. Lindsey Graham, R-S.C., introduced a bill that would sunset the law after two years. The bill is co-sponsored by Senate Judiciary Chairman Charles E. Grassley, R-Iowa, and ranking member Richard J. Durbin, D-Ill., as well as a bipartisan group of seven other Senate Judiciary members.

What comes next

Meta said the company plans to appeal both verdicts. Google also said it plans to appeal the California verdict.

Additional cases are expected to go to trial soon, including in Kentucky and California.

Despite two important verdicts coming close together last month, Calvert expects that the length and complexity of social media liability cases mean it will take time for the tide to actually change.

“It would take … in my view, probably three or four plaintiffs’ verdicts in a row that are upheld on appeal — and again, that’s going to be years down the line — before we really start seeing serious settlement negotiations,” Calvert said.

Despite the time frame, another Senate aide repeated a refrain from Durbin and other critics of big tech that the recent verdicts could be a “Big Tobacco moment” for the industry.

They also said that “the hope is these verdicts will … bring tech companies to the table” and help create “a push toward greater bipartisan conversations” about what federal regulation of social media could look like.
