

A trial in New Mexico has placed Meta Platforms under scrutiny over alleged harm to minors. In March 2026, a jury found that the company had violated consumer protection laws and imposed a $375 million penalty.
The case has now entered its second phase, in which the court will decide whether Meta’s platforms amount to a ‘public nuisance.’ A ruling in favor of the state could force structural changes to the company’s products.
Authorities argue that platforms such as Facebook and WhatsApp are built to maximize engagement. Features like infinite scrolling, autoplay, and algorithmic feeds, they say, keep users, especially children, connected for longer periods.
According to prosecutors, these mechanisms expose children to unhealthy content and raise the likelihood of addiction. The complaint further asserts that the platforms lack adequate safeguards to protect children from these dangers.
The state of New Mexico has been pushing for regulatory reforms governing how the platforms operate for underage users.
The state has also requested an independent monitor and additional financial penalties that could run into the billions of dollars.
Meta has denied the allegations, arguing that no clear scientific consensus links social media use directly to mental health harm. The company has pointed to its existing safety tools, including parental controls and moderation systems.
Meta has also argued that some of the proposed measures are impractical and could affect how its services operate in the state.
The trial is among the first to test whether platform design itself can be held responsible for harm to minors, and similar cases are underway across the United States.
The outcome could shape future regulation of social media and define how companies design products used by children.