Social Media’s Legal Reckoning Has Begun: ‘We Are in a New World’ | Analysis
In less than 24 hours, Meta CEO Mark Zuckerberg was dealt a pair of legal blows that will open up his company and his fellow social media giants to years of liability claims over the harm to users’ mental health caused by their platforms.
On Tuesday, a jury in New Mexico ordered Meta — the parent company of Instagram, Facebook and WhatsApp — to pay $375 million in damages for misleading consumers about the safety of its platforms and failing to protect children.
The following day, a Los Angeles jury found Meta and Google, the parent of YouTube, negligent in a landmark case alleging that the companies deliberately designed products to drive addiction among minors, leading to mental distress. The jury awarded the plaintiff, a 20-year-old California woman referred to as Kaley G.M., $6 million in compensatory and punitive damages.
The financial damages are trivial for a company that drives billions of dollars of profit every quarter. But the decisions signal a reckoning for Big Tech as concerns about social media’s harms move from public debate to the jury box, with ordinary Americans now holding companies legally liable for product design decisions. The bellwether case in Los Angeles is the first of a consolidated group of cases with more than 1,600 plaintiffs including hundreds of families and school districts.
Legal and tech experts told TheWrap that the jury verdicts are likely to affect similar cases moving through the courts, and potentially force design and policy changes to social media. Outside of the U.S., countries have already taken steps to curtail the influence of social media on children.
“This is going to be a bonanza for plaintiff lawyers, as they can point to this verdict as precedent that the platform design can be blamed for negative outcomes,” Avi Greengart, a consumer tech analyst for Techsponential, told TheWrap. “Longer term, I expect Meta, Google, Snap, TikTok and others to work with legislators to define guardrails and software design guidelines that, once implemented, protect social media platforms from further liability.”
Both Snap and TikTok settled with the plaintiff known as K.G.M., whose first name is Kaley, prior to the closely watched trial, in which the plaintiff’s attorney, Mark Lanier, adapted the legal playbook used against Big Tobacco in the 1990s — arguing that companies hooked customers despite knowing the risks — and turned it against Big Tech.
While tech companies have largely been shielded by Section 230 of the 1996 Communications Decency Act when it comes to liability over third-party content on their platforms, attorneys argued that the companies are responsible for design choices — such as infinite scroll and algorithmic suggestions — that were aimed at addicting users to the platform.
Dona J. Fraser, senior vice president for Privacy Initiatives at BBB National Programs, said following the verdicts that “courts are beginning to really treat social media harm less like a speech issue” and “more like a product liability issue.”
“When you look back to what happened with tobacco, if that’s the path we’re going down, then this could potentially snowball,” she added.
‘A new world’
During the trial, Meta and YouTube disputed that their products were responsible for the plaintiff’s mental health struggles. Zuckerberg, taking the stand for the first time in a jury trial, defended Instagram’s age verification practices, while Instagram CEO Adam Mosseri argued that the platform is not “clinically addictive.”
“This jury saw exactly what we presented from the very first day of trial: that these companies built digital spaces designed to negatively influence the brains of children, and they did it on purpose,” Lanier said after the verdict. And the jury, he said, “told them that is not acceptable, and you are being held accountable.”
“We respectfully disagree with the verdict and will appeal. Teen mental health is profoundly complex and cannot be linked to a single app,” a Meta spokesman said. (The company is also appealing the New Mexico verdict.)
“We disagree with the verdict and plan to appeal,” a Google spokesman said. “This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.” The jury apportioned liability, ordering Meta to pay 70% of the damages and YouTube 30%.
The jury’s decision was heralded by social media critics and advocacy organizations.
“As of today, we are in a new world: a new era in the fight to protect children from online harms,” Jonathan Haidt, author of “The Anxious Generation,” wrote Wednesday on X. “A jury sided with Kaley and therefore with millions of children: Big Tech is harming kids on an industrial scale.”
“For the first time, the law aligns with common sense: social media companies no longer have a special exemption to harm children with impunity,” he added. “Their shield is gone. They will be treated like any industry that knowingly harms children and lies about it. History will judge them as harshly as the tobacco industry.”
Sacha Haworth, executive director of the Tech Oversight Project, said the “ruling is an earthquake that shakes Big Tech’s predatory business model to its core,” adding that “this trial was proof that if you put CEOs like Mark Zuckerberg on the stand before a judge and jury of their peers, the tech industry’s wanton disregard for people will be on full display.”
What’s next
James Rubinowitz, a New York-based personal injury attorney who lectures at Cardozo School of Law, said "if Meta and if Google do not come up with a reasonable number to settle these thousands of claims against them, the plaintiffs' attorneys for these cases will have no problem trying each individual case."
Rubinowitz, who acknowledged being a friend of Lanier, noted that the verdict also affects Snap and TikTok, which, despite settling one case, “have the same exposure that Meta and Google have.” He added, “These cases are not going anywhere.”
Kate Ruane, the director of the Center for Democracy and Technology’s Free Expression Project, said that “if these decisions hold, it is clear that the social media companies are going to have to change some things about how they currently operate.”
“In some ways, that might be a good thing, but in others, we could see a lot more censorship of content just because there is some degree of potential concern that someone somewhere might be negatively impacted by it,” Ruane said, adding that “while these lawsuits were motivated to protect people, there could be some significant downstream effects that wind up hurting the very people that they were seeking to protect.”
Fraser said that “if the harms are directly related to the product itself, then the smartest move to mitigate for future lawsuits is making a conscious decision to treat youth safety as a product safety issue.”
Adam Hoffman, a professor of psychology at Cornell University, said he expected a “major overhaul” in social media design. “This isn’t really about ultimately eliminating social media because that will never happen, but it’s really about making sure that we’re designing it better in ways that can support rather than undermine the well-being that we see happening with our teenagers today.”
“This is honestly a really great opportunity to kind of rebalance the responsibility so that we have the individuals who are supported by safer systems and not really left to kind of manage them alone on their own right,” Hoffman said. “The onus isn’t on that individual or the parents trying to monitor and figure out what their kids are doing all the time when they’re trying to deal with their own lives.”
Maribel Lopez, an analyst at Lopez Research, said that “social media and AI moved very quickly, oftentimes without enough thought to the potential consequence.”
“This may be overlooked in the short term but after years of social media use and discussion of issues, it should be expected that these companies would be tasked to support a higher standard,” Lopez continued. “The verdict isn’t surprising. It’s a wakeup call that all tech providers need to focus more on ethics, governance and guardrails.”
— Roger Cheng and Tess Patton contributed reporting