EU Asserts Meta and TikTok Failed to Protect Children, Face Possible Fines of Up to 6% of Revenue

The European Union has identified significant deficiencies in the child protection measures implemented by Meta and TikTok, especially with regard to the reporting of child sexual abuse material (CSAM) on their services. According to the EU's preliminary findings, both companies have breached rules established under the Digital Services Act (DSA), which requires online platforms to guarantee user safety, particularly for minors.

The EU’s inquiry found that Meta and TikTok have erected obstacles preventing researchers from obtaining the data needed to evaluate children’s exposure to illegal or harmful content. This opacity not only hampers research but also calls into question the effectiveness of the companies’ content moderation strategies.

Meta, which owns Facebook and Instagram, has drawn particular criticism for making it difficult for users to report illegal content. The platforms reportedly lack a clear and accessible ‘Notice and Action’ mechanism, which users need in order to flag harmful material. Additionally, Meta has been accused of using “dark patterns,” misleading design tactics that make the reporting process needlessly confusing.

In light of these findings, both companies now have the opportunity to review the report and respond. If the EU deems their responses unsatisfactory, each could face penalties of up to 6% of its global annual revenue.

Beyond the EU’s findings, Meta is also contending with legal challenges in the United States. Several states have filed lawsuits alleging that the company intentionally designed its apps to be addictive despite knowing the potential harm to teenagers. The suits follow disclosures from internal research indicating that Instagram could damage the mental well-being of young girls. Meta has been accused of suppressing that data and of using legal strategies to shield itself from accountability.

A recent decision by Judge Yvonne Williams allows the DC attorney general to use Meta’s internal documents in the ongoing lawsuit, rejecting the company’s claims of attorney-client privilege. The judge concluded that the communications in question fell under the crime-fraud exception because they were intended to conceal potential legal liabilities.

As the legal proceedings progress, the repercussions for both Meta and TikTok could be substantial, particularly regarding their obligations to safeguard vulnerable users and foster a safer online environment for children. The first of the lawsuits against Meta is expected to be heard next year, marking a pivotal moment in the ongoing scrutiny of social media platforms and their influence on youth.