Instagram head Adam Mosseri testified that social media platforms aren't clinically addictive while detailing Meta's youth safety testing protocols, as tech giants face mounting legal pressure over alleged harms to minors.

Meta Platforms Inc. is at the center of a landmark legal battle after Instagram head Adam Mosseri testified under oath that social media does not meet clinical definitions of addiction. The testimony, delivered in federal court on Wednesday, comes amid consolidated litigation brought by more than 1,600 families alleging that social media platforms deliberately engineered addictive features harmful to minors.
Mosseri detailed Meta's youth safety protocols, testifying that the company conducts quarterly assessments of youth-facing features through its Youth Advisory Consortium. That process includes testing interface modifications with more than 5,000 teenage participants annually, with findings reviewed by Meta's Responsible Innovation team before deployment. According to internal documents submitted as evidence, Meta spent $7.3 billion on trust and safety initiatives in 2025, representing 18% of its annual R&D expenditure.
The timing is critical as regulatory scrutiny intensifies globally. The European Union's Digital Services Act now imposes fines of up to 6% of global revenue for child safety violations, while U.S. lawmakers are considering the Kids Online Safety Act, which would require independent audits of algorithms targeting minors. These developments unfold against the backdrop of Meta's $10 billion investment in a new Indiana data center campus, signaling continued infrastructure expansion despite regulatory headwinds.
Market analysts are already gauging the financial implications. JPMorgan estimates that mandated redesigns of youth-focused features could cost platforms 3-5% of annual revenue, while Wells Fargo projects that settlements from ongoing litigation might reach $4.8 billion industry-wide. The litigation risk coincides with Meta's Q4 earnings report, which showed Ray-Ban smart glasses sales exceeding 7 million units in 2025, triple the prior year's total, even as concerns grow about augmented reality's impact on adolescent development.
Parallel developments underscore the sector-wide stakes. TikTok recently launched geofenced Local Feeds in its U.S. app under pressure from state attorneys general, while Discord faces FTC scrutiny over its safeguards for minors. In the broader regulatory environment, Anthropic committed $420 million to power grid upgrades to support AI infrastructure growth, acknowledging compute demands that, according to DOE projections, could reach 8% of U.S. electricity consumption by 2030.
The legal confrontation represents more than courtroom drama; it signals a potential inflection point for platform economics. With Meta deriving 34% of its $154 billion annual revenue from engagement-driven advertising, according to SEC filings, any court-ordered design limitations for young users could force fundamental business model adjustments. As proceedings continue through Q2 2026, the outcome may set a precedent for balancing technological innovation with duty-of-care obligations toward minors across the $1.7 trillion global social media ecosystem.
