EU Accuses TikTok, Meta of Violating Content Moderation Rules

The European Commission has issued preliminary findings that TikTok and Meta, the parent company of Facebook and Instagram, are failing to comply with the Digital Services Act (DSA), signaling an escalation in regulatory scrutiny of online content moderation and data access. The findings, released Friday, allege significant failures by both platforms to meet their obligations on researcher access to public data and on providing robust mechanisms for users to report illegal content and appeal moderation decisions.

At the core of the case is the Commission’s finding that TikTok and Meta have put in place procedures that hinder researchers’ access to vital public data. Preliminary assessments suggest the platforms’ current data-sharing tools often return incomplete or unreliable information, sharply limiting researchers’ ability to examine critical questions, including the exposure of users, particularly minors, to harmful or illegal content. This undermines independent oversight and verification of the platforms’ safety measures.

Beyond data access, the Commission flagged serious shortcomings in Meta’s reporting mechanisms. The current “Notice and Action” systems for flagging illegal content, including child sexual abuse material and terrorist propaganda, were deemed excessively convoluted and burdensome for users. By imposing multiple unnecessary steps and additional requirements, they create a barrier to reporting and may allow harmful content to remain on the platforms.

Furthermore, the Commission’s investigation uncovered evidence that Facebook and Instagram use “dark patterns”: deceptive interface designs that mislead and confuse users. These practices, intended to nudge users toward particular actions, are suspected of blunting the reporting and content-removal processes and of deliberately undermining users’ ability to act on their own choices.

The findings are a clear signal that the DSA’s stricter regulatory framework is being actively enforced. Although preliminary, they carry significant weight: if non-compliance is confirmed, the Commission can adopt formal non-compliance decisions and impose sanctions. The case underscores the ongoing challenge of balancing platform innovation with user safety and transparency in a rapidly evolving digital landscape, and it marks a pivotal moment in the European Union’s effort to hold major social media companies accountable.