Big Tech’s Terrible, Horrible, No Good, Very Bad Week

On Tuesday, Meta lost in New Mexico. On Wednesday, Meta and YouTube lost in California. Meta says it plans to appeal. Some days Big Tech just wants to move to Australia. Too bad Australia banned kids from social media.

The losses were not small. A Santa Fe jury ordered Meta to pay $375 million for endangering children on its platforms. A Los Angeles jury found both Meta and YouTube liable for addicting a young woman and damaging her mental health, awarding $6 million in compensatory and punitive damages. Both cases centered on children. And both signal that the legal ground beneath Silicon Valley is shifting fast.

New Mexico

The Santa Fe jury found that Meta violated New Mexico’s Unfair Practices Act by misleading users about platform safety and failing to protect children from sexual predators. New Mexico Attorney General Raúl Torrez brought the case in 2023 after his office ran an undercover operation in which law enforcement agents posed as children on Instagram and Facebook. The investigation revealed that Meta’s platforms showed minors sexually explicit content without prompting, allowed adult predators to contact and exploit children, and facilitated the spread and exchange of child pornography.

The jury found Meta engaged in “unconscionable” trade practices and awarded the maximum penalty of $5,000 per violation, totaling $375 million. According to AG Torrez:

“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety. Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

A second phase will occur on May 4, when New Mexico’s First Judicial District Chief Judge Bryan Biedscheid will consider whether Meta’s platforms constitute a public nuisance. According to the New Mexico Department of Justice, the state will “seek injunctive relief that requires Meta to pay additional damages and make specific changes to its platforms and company operations, including enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors.”

California

The next day, a Los Angeles jury found both Meta and YouTube liable for the depression, anxiety, and social media addiction of a young woman, who began using YouTube at age six and Instagram at nine. The jury awarded $3 million in compensatory damages and an additional $3 million in punitive damages, with Meta responsible for 70 percent. The jury determined that the companies’ conduct met the legal standard for “malice, oppression, or fraud.”

This case is the first in a consolidated group of more than 1,600 social media addiction lawsuits filed in California state court. A separate group of federal cases is set to go to trial this summer. The California verdict shows that these claims can prevail in front of a jury, and the implications for every pending lawsuit are massive.

“Today’s landmark verdict is not just a financial win for the plaintiff,” said Jim Daly, President and CEO of Focus on the Family. “It is an acknowledgment that Big Tech cannot willfully, recklessly and irresponsibly poison young hearts and minds in order to generate a profit.”

The Shield Is Beginning to Crack

For nearly 30 years, Section 230 of the Communications Decency Act has shielded tech companies from liability for what happens on their platforms. The National Center on Sexual Exploitation (NCOSE) has called it the greatest enabler of online sexual exploitation, arguing that without the threat of lawsuits, companies have no incentive to protect children.

Both the New Mexico and California plaintiffs found a way around Section 230 by targeting design rather than user content. The argument is not about what people post. It is about how these platforms were engineered, and the deliberate decisions made to maximize engagement at the expense of children’s safety.

Vaishnavi Jayakumar, former head of safety and well-being for Instagram, testified that Meta’s enforcement policies gave predators extraordinary room to operate. “You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar testified, adding that “by any measure across the industry, [it was] a very, very high strike threshold.”

Meta is not the only platform losing its legal cover. Last August, in a case brought against X, formerly Twitter, the Ninth Circuit ruled that social media companies cannot hide behind Section 230 when, through defective design, they fail to act on reports of child sexual abuse material. Center for Arizona Policy (CAP) President Peter Gentala, who brought the case during his time as Senior Legal Counsel at NCOSE, remains counsel for the victims. That ruling, combined with this week’s verdicts, means the era of blanket immunity for platforms that endanger children is coming to an end.

Both Meta and Google have indicated they will appeal, and those appeals will almost certainly center on Section 230. But the legal ground has shifted. As Monte Mann, a business trial lawyer at Armstrong Teasdale, told Fox Business: “This verdict is going to attract additional claims and accelerate all the existing ones.” He called it “a direct hit on Big Tech’s core defense.”

What This Means for Families

Here is what changed this week: juries are now willing to treat social media platforms as defective products. Not because of what users post, but because of how the platforms themselves were built. That legal theory has now been tested twice, and it won both times.

The last time an industry faced this kind of cascading legal exposure over a product that addicted children, it was Big Tobacco. That fight ended with a $206 billion settlement and an industry forced to stop targeting minors.

Our policy team is working with Arizona lawmakers to empower parents to protect their children and bring more accountability to Big Tech. This is not easy. Even now, billion-dollar companies built on children’s use of their dangerous technologies resist reform. The tragic outcomes at the center of the Los Angeles and New Mexico cases are the latest evidence that technology companies must change their ways and prioritize safety: people over profit.

Big Tech had a terrible, horrible, no good, very bad week. For the children and families these companies have harmed, it is long overdue.
