On Tuesday, a jury in New Mexico found that Meta violated state law by failing to protect children on its platforms and failing to warn users about the risks.
The jury held Meta liable on all counts, including engaging in “unfair and deceptive” and “unconscionable” trade practices, and ordered the company to pay $375 million in damages.
This is the first time Meta has faced a jury trial over child safety on platforms like Facebook and Instagram.
A Meta spokesperson said the company disagrees with the verdict and plans to appeal.
Background of the Case
The lawsuit was filed in 2023 by New Mexico Attorney General Raúl Torrez. He accused Meta of creating a “breeding ground” for child predators.
The case focused on how Facebook and Instagram allegedly allowed predators to contact minors. Meta denied the claims.
Although the jury awarded less than the billions the state requested, a later part of the case could force Meta to change platform policies and pay additional penalties.
Rising Legal Pressure on Social Media
This lawsuit is part of a broader wave of cases against Meta and other platforms.
In Los Angeles, a separate case accuses Meta and YouTube of creating addictive features that harmed a young woman’s mental health.
Hundreds of other lawsuits, brought by individuals, school districts, and state attorneys general, are also pending; some will go to trial later this year.
The Trial
The trial lasted six weeks. It included testimony from Meta executives, former employees turned whistleblowers, and undercover investigations by the attorney general.
The jury had to decide whether Meta knowingly misled users or designed its platforms in a way that harmed children.
Meta argued it works hard to keep people safe. A spokesperson said identifying bad actors and harmful content is challenging.
Attorney General Calls Verdict Historic
Attorney General Raúl Torrez called the decision a historic victory for children and families affected by Meta’s choices. He said:
“Meta executives knew their products harmed children. They ignored warnings from employees and misled the public. Today, the jury said enough is enough.”
Meta’s Defense
Meta argued that the New Mexico lawsuit cherry-picked documents to make the company look bad.
Kevin Huff, a Meta attorney, said the company has 40,000 employees working to keep Facebook and Instagram safe. He emphasized that the company invests heavily in child safety measures.
Whistleblower Testimony
Former Meta engineer Arturo Bejar testified. He said his 14-year-old daughter received sexual solicitations on Instagram.
Bejar argued that the platform’s algorithms, which are designed to serve personalized ads, can also benefit predators.
“Facebook is very good at connecting people with interests,” he said. “If your interest is little girls, it will connect you with little girls.”
Former VP Brian Boland testified that safety was not a priority for top executives.
Instagram head Adam Mosseri highlighted safety features such as Teen Accounts, which he said the company rolled out even though they affected user growth.
End-to-End Encryption Concerns
The case also raised concerns about end-to-end encryption in Instagram chats.
Meta plans to stop supporting end-to-end encryption for teens later this year. A spokesperson said few users opted in, and people who want encryption can use WhatsApp instead.
Fake Profiles Used in Investigation
The attorney general’s team created fake Facebook and Instagram profiles posing as children.
These accounts received sexually explicit messages from three New Mexico men. Two of the men were arrested at a motel, thinking they would meet a 12-year-old girl.
The state argued that Meta did not do enough to prevent these predators from contacting children.
Meta’s Position on Child Safety
Meta says it has spent years building technology to fight child exploitation.
Ravi Sinha, Meta’s Head of Child Safety Policy, described the company’s work with law enforcement to prevent and report exploitation.
Meta also questioned the ethics of the New Mexico investigation, accusing the AG’s office of using hacked or stolen accounts.
Torrez dismissed this as a distraction from the real issues.