A New Mexico jury found Tuesday that social media conglomerate Meta is harmful to children’s mental health and in violation of state consumer protection law.
The landmark decision comes after a nearly seven-week trial. Jurors sided with state prosecutors who argued that Meta — which owns Instagram, Facebook and WhatsApp — prioritized profits over safety. The jury determined Meta violated parts of the state’s Unfair Practices Act, agreeing with accusations that the company hid what it knew about the dangers of child sexual exploitation on its platforms and about the impact on children’s mental health.
The jury agreed with allegations that Meta made false or misleading statements and also found that Meta engaged in “unconscionable” trade practices that unfairly took advantage of the vulnerabilities and inexperience of children.
Jurors found there were thousands of violations, each counting separately toward a total penalty of $375 million.
Attorneys for Meta said the company discloses risks and makes efforts to weed out harmful content and experiences, while acknowledging that some bad material gets through its safety net.
“We respectfully disagree with the verdict and will appeal,” Meta spokesperson Andy Stone told CBS News in a statement Tuesday evening. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”
New Mexico’s case was among the first to reach trial in a wave of litigation involving social media platforms and their impacts on children.
The trial, which started Feb. 9, comes as school districts and legislators push for more restrictions on the use of smartphones in classrooms.
In a federal court in Southern California, a jury has been sequestered in deliberations for more than a week about whether Meta and YouTube should be liable for harms caused to children on their platforms, in one of three bellwether court cases that could set the course for thousands of similar lawsuits.
Meta CEO Mark Zuckerberg testified in the Los Angeles trial last month, telling jurors that while users under 13 are not allowed on Instagram, it is a difficult rule to enforce because there are “a meaningful number of people who lie about their age to use our services.”
In addition, more than 40 state attorneys general have filed lawsuits against Meta, claiming it’s contributing to a mental health crisis among young people by deliberately designing Instagram and Facebook features that are addictive.
New Mexico’s case relied on a state undercover investigation where agents created social media accounts posing as children to document sexual solicitations and Meta’s response.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, also says Meta hasn’t fully disclosed or addressed the dangers of social media addiction. Meta has not conceded that social media addiction exists, but executives at trial acknowledged “problematic use” and said they want people to feel good about the time they spend on Meta’s platforms.
“Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business,” Meta attorney Kevin Huff told jurors in closing arguments. “Meta designs its apps to help people connect with friends and family, not to try to connect predators.”
Tech companies have been protected from liability for material posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as a First Amendment shield.
New Mexico prosecutors say Meta should still be responsible for its role in pushing out that content through complex algorithms that amplify material that can be harmful to children.
“We know the output is meant to be engagement and time spent for kids,” prosecution attorney Linda Singer said. “That choice that Meta made has profound negative impacts on kids.”
A second phase of the trial, slated possibly for May and heard by a judge without a jury, would determine whether Meta created a public nuisance; the company could be ordered to change course and pay for remedies.
The New Mexico trial examined a raft of Meta’s internal correspondence and reports related to child safety. Jurors also heard testimony from Meta executives, platform engineers, whistleblowers who left the company, psychiatric experts and tech-safety consultants.
The jury also heard testimony from local public school educators who struggled with disruptions linked to social media, including sextortion schemes targeting children.
“What this case is about is one of the biggest tech companies in the world taking advantage of New Mexico teens,” state Chief Deputy Attorney General James Grayson told the jury in closing arguments.
The jury was assembled from residents of Santa Fe County, including the politically progressive state capital city.
In reaching a verdict, it considered whether social media users were misled by specific statements about platform safety by Zuckerberg, Instagram head Adam Mosseri and Meta global head of safety Antigone Davis.
In deliberations, the jury used a checklist of allegations from prosecutors that Meta failed to disclose what it knew about problems with enforcing its ban on users under 13, the prevalence of social media content about teen suicide, the role of Meta algorithms in prioritizing sensational or harmful content, and more.