Jury Finds Meta and Google Harmed a Child: What’s Next?

Nuclear options like age limits and repealing Section 230 won’t make social media safer.

Today on Decoder, we discuss the landmark social media addiction trials that led to major verdicts against big tech: one case in New Mexico against Meta, and another in California against both Meta and Google. These cases have significant implications for platform design and the nature of speech in America. Joining us are Casey Newton, founder and editor of Platformer, and Lauren Feiner, Verge senior policy reporter. Lauren was in the LA courtroom where executives like Mark Zuckerberg testified.

These cases, the first in a wave of lawsuits targeting tech companies, focus on the design decisions behind platforms like Instagram and YouTube. The plaintiffs argue that these platforms have fundamental design flaws that harm users, particularly teenagers, and that the companies were negligent in deploying them. The verdicts add pressure for changes to the legal mechanisms regulating social media.

The trials’ focus isn’t just on addictive design but also on features like algorithmic recommendations and camera filters that can worsen issues like anxiety, depression, and body dysmorphia. This movement suggests social media might be inherently defective, much as cigarettes inherently cause cancer.

Casey and Lauren explore the distinction between product features – like recommendations, auto-play video, and infinite scroll – and harmful but legal content directed at young people. Regulating that content directly, however, is challenging and potentially unconstitutional because of the First Amendment and Section 230 protections.

These verdicts may open new fronts for litigation that challenge the Section 230 shield. Section 230 is widely understood to protect platforms from liability for user content, but trials like these are trying to set new precedents around design rather than content.

Casey Newton notes that Snapchat settled before the trial, a sign of how much companies fear these outcomes, and other companies’ struggles highlight the growing scrutiny of social media’s impact. These cases have broader implications, touching on free speech and platform liability. Repealing Section 230 is a popular political argument, but it could push platforms to restrict more content, undercutting the goals of free speech advocates.

As policymakers consider their next steps, the question of what safe social media usage and design look like looms large. The new focus is preventing harmful outcomes, like algorithmic rabbit holes that worsen mental health disorders, while still respecting free speech. We’re in the middle of complex debates about responsibility, regulation, and user protection on social media platforms.
