Meta Ordered to Pay $375M After Trial Over Child Exploitation, User Safety
A New Mexico jury on Tuesday found Meta Platforms violated state law in a lawsuit brought by the state attorney general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and of enabling child sexual exploitation on those platforms.
“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said in a statement. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.”
Representatives for the New Mexico attorney general did not immediately respond to requests for comment.
Meta Faces Broad Challenge Related to Youth Mental Health
The jury’s decision capped a six-week trial and marked the first jury verdict on these claims against the social media company, as it faces a broader challenge over how its platforms affect young people’s mental health.
New Mexico Attorney General Raúl Torrez, a Democrat, accused the company of allowing predators unfettered access to underage users and connecting them with victims, often leading to real-world abuse and human trafficking.
Meta had denied the allegations, saying it has extensive safeguards in place to protect younger users.
Meta has come under increasing scrutiny in recent years over its handling of child and teen safety, spurred in part by whistleblower testimony before Congress in 2021 that alleged the company knew its products could be harmful but refused to act.
Separately, Meta is facing thousands of lawsuits accusing it and other social media companies of intentionally designing their products to be addictive to young people, leading to a nationwide mental health crisis. Some of the lawsuits, which have been filed in both state and federal courts, seek damages in the tens of billions of dollars, according to Meta’s filings with financial regulators.
Meta has argued the company is shielded from liability in both the addiction and the New Mexico lawsuits by the free-speech protections of the U.S. Constitution’s First Amendment and Section 230 of the Communications Decency Act, which generally bars lawsuits against websites over user-generated content. The company has said the state’s allegations of harm cannot be separated from the content on the platforms, because its algorithms and design features serve to publish content.
The New Mexico lawsuit grew out of an undercover operation that Torrez, a former prosecutor, and his office ran in 2023. As part of the case, investigators created accounts on Facebook and Instagram posing as users younger than 14. The accounts received sexually explicit material and were contacted by adults seeking similar content, leading to criminal charges against multiple individuals, according to Torrez’s office.
The state claims Meta told the public that Instagram, Facebook and WhatsApp are safe for New Mexico teens and children, while hiding the truth about how much dangerous and harmful content the company hosts. According to the state, internal company documents acknowledged problems with sexual exploitation and mental health harm. Yet the company, the state says, did not institute basic safety tools such as age verification and insisted its platforms were safe.
The state also accused Meta of designing its platforms to maximize engagement despite evidence they were harming children’s mental health. Features such as infinite scroll and auto-play videos keep kids on the site, fostering addictive behavior that can lead to depression, anxiety and self-harm, the lawsuit claims.
New Mexico’s lawsuit sought monetary damages, as well as an order directing Meta to make changes to improve children’s safety while using the platforms.
“Over the course of a decade, Meta has failed over and over again to act honestly and transparently,” Linda Singer, an attorney for the state, told the jury during closing arguments on Monday. “It’s failed to act to protect young people in this state. It is up to you to finish this job.”
Singer told the jury it could award more than $2 billion in damages.
Reuters viewed the trial on Courtroom View Network.
Meta has argued it has been transparent about the fact that it cannot catch all the harmful content on its platforms.
“What the evidence shows is Meta’s robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public,” Kevin Huff, an attorney for Meta, told the jury during closing arguments.
In May, Judge Bryan Biedscheid, who oversaw the trial, is slated to hold a bench trial on the state’s claims that Meta created a public nuisance that harmed state residents’ health and safety. The state will ask Biedscheid to direct Meta to make changes to its platforms to bring them in line with state law.