Court Filing Reveals Instagram Head Pressed on Delays in Launching Teen Safety Features such as Nudity Filter

Plaintiffs' lawyers in a lawsuit examining whether social media apps like Instagram are addictive and harmful questioned why it took Meta so long to implement basic safety features such as a nudity filter for messages sent to teens. In April 2024, Meta introduced a feature that automatically blurs explicit images in Instagram DMs, nearly six years after the company had recognized the issue internally.

A recently unsealed deposition in a federal lawsuit showed that Instagram head Adam Mosseri was questioned about an August 2018 email exchange with Guy Rosen, Meta's VP and Chief Information Security Officer. In the exchange, as Mosseri and a plaintiff's lawyer agreed during the deposition, he acknowledged the potential for "horrible" incidents in private messages, which could include unsolicited explicit images.

Meta has been approached for comment. Mosseri pushed back on claims that Meta should have informed parents that its messaging system was unmonitored, aside from efforts to detect and remove CSAM (Child Sexual Abuse Material).

Mosseri argued that problematic content could be messaged through any app. He noted that Meta tries to balance privacy interests with safety measures.

The testimony also revealed that 19.2% of survey respondents aged 13 to 15 reported unwanted exposure to nudity or sexual images on Instagram, and 8.4% of the same age group had seen someone harm themselves or threaten to do so within the past week of using the app.

While the nudity filter is one of several updates Meta has made to protect teens, the plaintiffs' lawyers focused on the delay rather than on current safety measures.

Mosseri was also questioned about a 2017 email from a Facebook intern interested in identifying “addicted” users to see if they could be helped.

The 2018 email chain illustrated that Meta was aware of risks to minors, yet the company delayed addressing the issue of sexual images until 2024. Those risks included images sent by adults potentially engaging in grooming, the practice of building trust with a minor in order to exploit them.

Mosseri's deposition occurred amid several lawsuits seeking to hold big tech accountable for harming teens. In this case, in the Northern District of California, plaintiffs argue that social media platforms are defective because they promote prolonged use and addictive behavior in teens; Meta, Snap, TikTok, and YouTube are named as defendants.

Similar lawsuits are in progress in the Los Angeles County Superior Court and in New Mexico.

Lawyers aim to prove that tech companies prioritized user growth and engagement over potential harms to young users.

These trials coincide with an increasing number of laws restricting social media use for teens in the U.S. and abroad.