Court Decides Section 230 Offers No Protection to TikTok in Lawsuit Regarding Death from Blackout Challenge

### Appeals Court Revives Lawsuit Against TikTok Over Child’s Death in “Blackout Challenge”

In a noteworthy legal turn, an appeals court has reinstated a lawsuit against TikTok, overturning a previous ruling from a lower court that had conferred immunity upon the social media giant under Section 230 of the Communications Decency Act. This case revolves around the heartbreaking death of a child who engaged in the perilous “Blackout Challenge,” a viral phenomenon that prompted users to choke themselves until they lost consciousness.

#### Background: The “Blackout Challenge” and Section 230

The “Blackout Challenge” is a troubling trend that has spread across social media platforms, TikTok included. The challenge encourages users to choke themselves with objects such as belts or cords until they lose consciousness. Tragically, multiple children have died while attempting it, triggering numerous lawsuits against TikTok.

In 2022, Tawainna Anderson, mother of Nylah Anderson—one of the victims—filed a lawsuit against TikTok. The complaint alleged that TikTok’s algorithm pushed the hazardous challenge to Nylah, contributing to her death. The lower court, however, dismissed the case, citing Section 230, which generally protects online platforms from liability for third-party content.

#### The Appeals Court’s Ruling

In a recent decision, a Third Circuit panel, in an opinion written by Judge Patty Shwartz, overturned the lower court’s ruling, holding that Section 230 does not grant TikTok immunity in this instance. Judge Shwartz reasoned that TikTok’s algorithm does more than merely host third-party content; it actively curates and recommends certain videos to users, making those recommendations an “expressive product” of the platform itself.

Shwartz drew on a recent Supreme Court decision, Moody v. NetChoice, which distinguished between third-party speech and a platform’s own “expressive activity.” Under that reasoning, when a platform’s algorithm reflects editorial choices about which content to promote, that promotion can be classified as the platform’s own speech—speech that Section 230 does not shield.

The appeals court remanded the case to the district court, which will now address Anderson’s remaining claims. The district court will also have to determine which claims are precluded by Section 230, consistent with the Third Circuit’s judgment.

#### Implications for TikTok and Other Social Media Platforms

This ruling may have significant repercussions for TikTok and various other social media platforms that depend on algorithms to curate and highlight content. Should the courts ultimately deem TikTok accountable for endorsing harmful content through its algorithm, it could pave the way for an increase in lawsuits against social media companies, especially concerning child safety.

Circuit Judge Paul Matey, who partially agreed with the ruling, stressed that Section 230 should not be interpreted so expansively as to enable companies like TikTok to disregard the hazards associated with the content they promote. Matey advocated for a “far narrower” interpretation of Section 230, one that would hold platforms responsible for knowingly disseminating harmful content.

Matey also remarked that by the time Nylah Anderson engaged in the “Blackout Challenge,” TikTok had already recognized the risks linked to the trend but failed to take sufficient measures to curb its proliferation. He contended that TikTok should be liable for its targeted promotion of harmful content, especially when it pertains to the safety of children.

#### The Ongoing Legal Battle

Anderson’s legal team has pledged to continue seeking justice, arguing that the Communications Decency Act was never meant to let social media companies profit from promoting dangerous content to children. They have expressed optimism that the revived lawsuit will lead to stronger protections for minors on social media platforms.

TikTok, for its part, has previously affirmed its commitment to user safety and has promised to “remain vigilant” in removing harmful content, including the “Blackout Challenge.” The company has not yet responded to the latest ruling.

As the case progresses, it will attract considerable attention from legal professionals, social media enterprises, and concerned parents. The outcome could establish a precedent for how courts understand Section 230 in relation to algorithm-driven content promotion, potentially altering the legal framework for online platforms.