
Dozens of states sue Meta over addictive features harming kids

A bipartisan slate of states is alleging that Meta’s addictive features on Facebook and Instagram violated a federal privacy law and state consumer protection laws.


More than 30 states filed a federal lawsuit against Meta, the owner of Facebook and Instagram, alleging the platforms’ apps are designed to be addictive and harm children’s mental health.

The lawsuit signed by 33 state attorneys general was filed Tuesday in a San Francisco federal court. The suit claims Meta violated both federal children’s online privacy law and state consumer protection laws by making its products addictive and then lying about how they harm children’s mental health.

Additionally, eight state attorneys general and the District of Columbia are filing separate lawsuits in their own state courts alleging Meta’s practices violate state consumer protection laws. In total, 41 states and the District of Columbia filed lawsuits in federal and state courts Tuesday.

If successful, the states’ lawsuits could force Meta to change the way it designs and markets its platforms to the public, and lead to hefty fines. The legal strategy has drawn comparisons to the various lawsuits filed against the tobacco industry in the 1990s, which led to hundreds of billions of dollars in damages, and changed how the industry markets its products.

“We refuse to allow Meta to trample on our children’s mental and physical health, all to promote its products and increase its profits,” said California Attorney General Rob Bonta, who is leading the federal lawsuit, during a virtual press conference with the other states. “We refuse to allow the company to feign ignorance of the harm it’s causing, we refuse to let it continue business as usual.”

At a separate press conference Tuesday in San Francisco, Bonta spoke of the importance of bipartisanship in bringing the federal case, which has 15 Republican and 18 Democratic AGs signed on. “Folks that don’t team up too often are teaming up today,” Bonta said. “I think that speaks volumes, less so about the success on the merits, but I think more about the scope of the problem and how it touches every corner of this country.”

The federal lawsuit alleges that Meta deceived users by making the “false and misleading” claims that its features were not manipulative, that its products weren’t designed to promote unhealthy engagement among children and that its products were safe for younger users.

The lawsuits are designed to circumvent Section 230 of the Communications Decency Act, a decades-old law that protects platforms from being held liable for most content users post. The consumer protection lawsuits don’t target specific content, instead claiming that Meta deceived the public about the safety of children on its apps.

Meta pushed back on the lawsuits, saying it has made more than 30 design changes to improve children’s safety across its products. “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement.

The federal lawsuit is the largest state-led challenge alleging a social media company violated the Children’s Online Privacy Protection Act and consumer protection laws. Tuesday’s filing follows a similar strategy used by Indiana, Arkansas and Utah, which have each filed state consumer protection lawsuits against TikTok in the past year.

COPPA, a federal law passed in 1998, requires platforms to obtain parental consent before collecting data from children under the age of 13. The lawsuit claims Meta has “actual knowledge” that children under 13 are using its services, including Instagram and Facebook, but has failed to obtain parental consent before collecting those users’ data. A significant portion of the details on the COPPA violations are redacted.

And while Meta has a policy of banning users who are younger than 13, the lawsuit argues the company does not enforce that restriction.

The lawsuit alleges that until December 2019, Instagram didn’t ask new users to disclose their age to create new accounts. And when it began asking new users their ages, Meta’s sign-up page would automatically generate a date of birth that would put the user’s age at 13 years old, according to the lawsuit.

The lawsuit noted that Instagram only recently changed this to automatically generate the current date and year instead of auto-populating a birth date 13 years prior. Despite these changes, the attorneys general charge that it’s still simple enough for children to lie about their age when signing up.

The lawsuit includes a viral exchange during a Facebook-hosted event in 2021 between teen influencer JoJo Siwa and Adam Mosseri, the head of Instagram, to highlight this lack of enforcement.

Siwa told Mosseri that she had been on Instagram since she was 8 years old, and had followers who were under 13, as well. Mosseri responded, “I don’t want to hear it,” according to the lawsuit. The interaction was first documented by the Wall Street Journal.

The states are pushing for changes that would overhaul how Meta’s platforms work. Bonta suggested limiting the frequency and duration of time young people can spend on the apps, as well as changing how the algorithms display content. “It’s an area of discussion and dialogue,” Bonta said. If Meta doesn’t engage, the states will get a judge to weigh in on what’s necessary, he said.

Bonta declined to comment on the substance of any settlement negotiations, but said “the door is wide open.”

The lawsuits come as Congress has failed to act on legislation to update COPPA or pass bills to create new protections, such as the Kids Online Safety Act. The bipartisan bill, which would require platforms to audit their risks to minors, cleared committee this summer but hasn’t reached a Senate floor vote — in part because it has faced vocal pushback from civil rights and advocacy groups over the potential that it could violate teens’ privacy online and lead to detrimental impacts, particularly on LGBTQ youth.

President Joe Biden urged Congress to pass kids’ safety and privacy bills during his 2022 and 2023 State of the Union addresses. He reiterated his call for lawmakers to act days before a key Senate committee advanced KOSA. Additionally, the U.S. surgeon general issued a warning in May that extended use of social media apps like Instagram and TikTok harms children’s mental health.

The multi-state suit is the result of an investigation of Instagram by Bonta and a group of other state AGs that began in November 2021 after Facebook whistleblower Frances Haugen testified before Congress that Instagram knew its algorithms pushed unhealthy eating content to teen girls. Her testimony inspired KOSA and other legislation.

Haugen applauded the states for stepping into the void left by federal inaction. “And the fact that the litigators are acting before the legislators isn’t surprising, it’s happened many times before,” she told POLITICO, referring to past state lawsuits against tobacco companies in the 1990s.

The use of state consumer protection laws against social media companies is still a relatively novel legal approach and will be tested in the federal and state courts. During the virtual state AG press conference, the state AGs predicted that Meta will likely try to use Section 230 or raise First Amendment free speech claims in a bid to convince a judge to dismiss the lawsuits.

“We expect that will be their first line of defense: Section 230, the First Amendment,” Bonta said in San Francisco, adding that he remains confident the states’ cases will succeed.

Kids’ safety advocates said that Congress still needs to pass a law, especially since the litigation could take years to resolve.

“Litigation is no substitute for legislation,” said Alix Fraser, director of the Council for Responsible Social Media, a non-partisan group working to address mental health harms from social media, in a statement.

“It’s time to put our children, our communities, and our national security before Big Tech profits. Congress needs to step up with solutions that hold the platforms accountable.”