Meta Finally Held Accountable for Harming Teens: What's Next?


Meta recently suffered a legal defeat in New Mexico, marking the first instance of a court holding the company accountable for compromising child safety. This significant verdict was quickly followed by another loss, as a Los Angeles jury found that Meta designed its apps to be addictive, harming the plaintiff, a 20-year-old identified as K.G.M. These decisions could open the door to numerous lawsuits targeting Meta's pursuit of teenage users despite potential negative mental health impacts. Thousands of similar cases are pending, and 40 state attorneys general have filed suits of their own.

While social media platforms generally enjoy legal protections that shield them from liability for user content, this case focused not on content but on addictive design features, such as infinite scrolling and continuous notifications. Allison Fitzpatrick, a digital media lawyer, said that building the case around design elements was a strategic legal approach.

In the New Mexico case, Meta was found liable for violating the state's Unfair Practices Act and fined a total of $375 million. In the Los Angeles case, where the jury held Meta 70% responsible and YouTube 30% responsible for the plaintiff's distress, combined damages of $6 million were awarded. Snap and TikTok settled before the trial.

Despite the decisions, Meta announced plans to appeal, arguing that reducing teen mental health issues to a singular cause ignores broader challenges and the fact that many teenagers depend on digital platforms for connection.

During the trials, new internal Meta documents emerged, revealing a history of inaction on the negative effects of its platforms on minors and an active effort to increase teen engagement, including during school hours. A 2019 internal Meta study acknowledged negative impacts on people's well-being resulting from Facebook use.

The documents also included statements from Meta's leadership, among them comments from CEO Mark Zuckerberg, outlining attempts to attract teen users. Internal emails pointed to efforts to increase teen retention, sometimes at the expense of parental oversight.

A Meta spokesperson responded that many of the documents were outdated and said the company prioritizes improving safety based on feedback from parents and experts, citing features such as Instagram Teen Accounts, launched in 2024, which aim to protect younger users.

Former Meta employee Kelly Stonelake shared that these revelations align with her experience, which involved advocating for better content moderation in the metaverse, concerns allegedly overlooked by Meta.

The U.S. government maintains a keen focus on children’s online safety, especially after whistleblower Frances Haugen exposed internal documents indicating Instagram’s harmful effects on teens. However, privacy advocates express concern that proposed legislation might lead to increased censorship.

Stonelake, once an advocate for the Kids Online Safety Act, now opposes the current version of the bill, citing its potential to preempt state laws, including cases like New Mexico's against Meta.

Stonelake emphasizes the need for solutions that go beyond stoking fear, insisting that protecting minors effectively will require complex, multi-faceted approaches.
