A federal judge in California signaled Friday that Google, Meta, Snap and TikTok will likely have to face allegations by consumers that the social media companies harmed young Americans' mental health with addictive features built into their platforms, and that Big Tech's signature liability shield, known as Section 230, may not be enough to deflect those claims.
The judge overseeing the litigation — which includes nearly 200 individual cases against the social media companies — said repeatedly that the tech companies may be unable to escape liability for what the consumer plaintiffs allege are vast harms to America’s children posed by algorithmic rabbit holes, image filters that encourage eating disorders or limitless content feeds.
Should the claims be allowed to proceed, it could mark a significant blow to the tech industry, which is fending off a nationwide legal assault tying its services to mental health harms. It could also mark a turning point in how courts interpret Section 230, the sweeping 1996 law that has exempted websites from a wide range of suits targeting their content moderation decisions.
This week, dozens of states jointly filed a federal lawsuit against Meta making virtually identical allegations: that the company knew the design of its social media platforms was harmful to kids. Eight additional states filed similar suits in their respective state courts. (In response, Meta has said it is committed to providing safe experiences online.)
Addressing lawyers for both the consumer plaintiffs and the tech companies on Friday, District Judge Yvonne Gonzalez Rogers of the US District Court for the Northern District of California said she was unpersuaded by arguments that either all of the claims should be thrown out, or none of them.
She also expressed skepticism toward industry lawyers' claims that tech companies have no legal obligation to ensure their platforms are safe for children.
Gonzalez Rogers criticized the consumer plaintiffs for presenting a disorganized grab bag of allegations, and faulted them for appearing to complain largely about the content that appears on social media platforms rather than focusing on the design decisions that serve that content to users.
Still, she said, the burden falls on the tech platforms to show why she should throw out the cases at this early stage of the litigation.
And she pointed to the potential limits of Section 230 in two critical exchanges. In one, she said there are “more objective functionality decisions” being litigated than simple content moderation decisions that would be protected by Section 230.
“It doesn’t seem to me that you can escape that,” Gonzalez Rogers said.
Later, she suggested that “it’s not clear to me that the entire thing is thrown out” due to Section 230, implying that some claims could be tossed while others survive.
The more than four-hour hearing saw attorneys sparring over numerous legal theories of liability, and Gonzalez Rogers may still throw out some claims on grounds other than Section 230.
But one thing is certain, Gonzalez Rogers said: “Your billing fees today exceed my annual salary.”