Meta Faces Trial Over Child Safety, Facebook, WhatsApp Under Scrutiny

A New Mexico court is examining whether Meta Platforms harmed minors through Facebook and WhatsApp design, focusing on addictive features, safety failures, and potential regulatory changes.
Written by Somatirtha | Reviewed by Sankha Ghosh

A trial in New Mexico has placed Meta Platforms under scrutiny over alleged harm to minors. A March 2026 jury verdict found the company violated consumer protection laws and imposed a $375 million penalty.

The case has now entered its second phase, in which the court will decide whether Meta’s platforms amount to a ‘public nuisance.’ A ruling in favor of the state could force structural changes to the company’s products.

Why Are Regulators Raising Concerns?

Authorities argue that platforms such as Facebook and WhatsApp are built to maximize engagement. Features such as infinite scrolling, autoplay, and algorithmic feeds are designed to keep users, especially children, connected for longer periods.

According to prosecutors, these mechanisms expose children to unhealthy content and raise the likelihood of addiction. The complaint also asserts that the platforms lack adequate safeguards to protect children from these dangers.

What Kind of Remedies Is the State Demanding?

The state of New Mexico is seeking court-ordered changes to how the platforms operate for underage users.

The state has also requested an independent monitor and additional financial penalties that could run into the billions of dollars.

How Has Meta Responded?

Meta has denied the allegations. The company says no clear scientific consensus links social media use directly to mental health harm. It has pointed to existing safety tools, including parental controls and moderation systems.

Meta has also argued that some proposed measures are not practical and could affect how its services operate in the state.


Why Does This Case Matter Globally?

The trial is among the first to test whether platform design can be held responsible for harm to minors. Similar cases are underway across the United States.

The outcome could shape future regulation of social media and define how companies design products used by children.

Analytics Insight: Latest AI, Crypto, Tech News & Analysis
www.analyticsinsight.ae