
New Mexico, USA – Meta Platforms Inc., owner of Facebook, Instagram, WhatsApp, and Threads, has been hit with a $375 million fine following a jury verdict finding that the company violated consumer protection laws by misleading users about platform safety and enabling child sexual exploitation.
The ruling, delivered on Tuesday, March 25, 2026, marks the first time a jury has held Meta accountable for such violations. The case, brought by the New Mexico Attorney General, alleged that Meta misrepresented its platforms as safe for children while failing to prevent exposure to harmful content.
Brandspur Brand News Desk reports that the jury determined Meta engaged in unfair and deceptive trade practices under state law, finding 75,000 violations at $5,000 each, which together make up the $375 million fine.
New Mexico Attorney General Raúl Torrez hailed the verdict as a historic victory for children and families, warning that the penalty sends a strong message to big tech executives that no company is beyond legal accountability. “This decision underscores the responsibility technology companies have to protect their youngest users,” Torrez said.
Meta has publicly disagreed with the ruling and confirmed plans to appeal. “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the company said.
The case stems from a 2023 undercover investigation by the Attorney General’s office, which revealed serious gaps in Meta’s content moderation. Investigators created accounts posing as children under 14; these accounts were quickly exposed to sexually explicit material and contacted by adults seeking sexual content, leading to criminal charges against several perpetrators.
The lawsuit also highlighted design features such as infinite scroll and auto-play videos, which it alleged were deliberately engineered to maximise user engagement even as they fostered addictive behaviour among young users.
Globally, authorities are intensifying efforts to regulate social media and protect minors. In Nigeria, the Federal Government is exploring age restrictions for social media use to enhance child safety. Other countries like Australia, Indonesia, and Denmark have already implemented or proposed similar restrictions for children under 15–16 years old.
The Meta ruling reflects a wider shift towards accountability in Big Tech, underscoring the need for companies to balance engagement with user safety while adhering to national and international regulations.