A New Mexico jury has delivered a landmark verdict against Meta, finding the company liable for violating state law by failing to protect children from sexual predators on Facebook and Instagram. The jury ordered Meta to pay $375 million in damages, marking the first time the social media giant has been held accountable in a jury trial for child safety concerns that have plagued its platforms for years.
The case, brought by New Mexico Attorney General Raúl Torrez in 2023, accused Meta of creating a "breeding ground" for child predators and willfully engaging in "unfair and deceptive" and "unconscionable" trade practices. The verdict comes amid mounting legal pressure on social media platforms over the safety of young users, with a separate jury in Los Angeles currently considering a case against Meta and YouTube over alleged addictive features that harmed a young woman's mental health.
During the six-week trial, jurors heard testimony from Meta executives, former employees-turned-whistleblowers, and details from the attorney general's undercover investigation into child sexual exploitation on Meta's platforms. The investigation, which led to three arrests, involved creating fake Facebook and Instagram profiles posing as children that encountered sexually suggestive content and requests to share pornographic material.
Meta CEO Mark Zuckerberg's deposition was played for jurors, showing the company's leadership facing questions about platform safety. The jury's decision represents a significant blow to Meta, which has long maintained that it works hard to keep users safe and invests heavily in measures to protect young users.
In response to the verdict, a Meta spokesperson said the company "respectfully" disagrees and plans to appeal the decision. The spokesperson emphasized that Meta has a "longstanding commitment to supporting young people" and will "continue to defend itself vigorously."
Attorney General Torrez hailed the decision as "a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety." He accused Meta executives of knowing their products harmed children, disregarding warnings from their own employees, and lying to the public about what they knew.
The case also highlighted internal tensions at Meta over child safety priorities. Former Meta Vice President of Partnerships Brian Boland testified that he "absolutely did not believe that safety was a priority" to CEO Mark Zuckerberg and then-COO Sheryl Sandberg when he left the company in 2020. Ex-Meta engineering director Arturo Bejar, who became a whistleblower, testified about his efforts to warn executives after his own 14-year-old daughter received sexual solicitations on Instagram.
Meta's attorneys argued that the company has been honest with users about the challenges of identifying and removing bad actors, and that 40,000 people at Meta are responsible for making Facebook and Instagram safe. They also questioned the legitimacy of the New Mexico investigation, accusing the attorney general's office of using hacked or stolen accounts and photos of real, non-consenting children to lure predators.
The verdict could have broader implications for the tech industry, as social media giants face hundreds of other cases from individuals, school districts, and state attorneys general. Some of these cases are set to go to trial later this year, potentially creating a wave of legal challenges over platform safety.
Meta's decision to stop supporting end-to-end-encrypted messaging on Instagram, announced midway through the trial, underscores the complex balance between privacy measures and child safety concerns. The company cited low usage of the feature as the reason for its removal, while critics argue that encryption can make it harder for law enforcement to catch predators.
As Meta prepares to appeal the decision, the case serves as a stark reminder of the ongoing debate over corporate responsibility in the digital age. With juries now holding social media companies accountable for platform safety, the tech industry may face increasing pressure to prioritize user protection over growth and engagement metrics.
The $375 million verdict, while smaller than the billions New Mexico had sought, represents a significant symbolic victory for child safety advocates. A later portion of the case to be presented directly to the judge could also force Meta to make changes to its platforms and pay additional penalties, potentially reshaping how social media companies approach user safety, particularly for young users.
