The European Commission has issued preliminary findings accusing TikTok and Meta (owner of Facebook and Instagram) of potential breaches of their obligations under the Digital Services Act (DSA), signaling a significant challenge to the platforms’ content moderation practices and data transparency. The Commission’s investigation reveals systemic issues hindering academic research and potentially compromising user safety.
The core of the allegations centers on the platforms’ failure to give researchers adequate access to public data, a key provision of the DSA intended to foster independent scrutiny of online content and its impact. The Commission’s findings suggest that TikTok, Facebook, and Instagram have implemented burdensome procedures and tools that leave researchers with partial or unreliable data, limiting their capacity to assess the prevalence of harmful or illegal content and, in particular, the extent to which vulnerable users such as minors are exposed to it.
Beyond data access, the Commission’s assessment raises concerns about the usability of mechanisms for reporting illegal content. Both Facebook and Instagram are accused of failing to provide user-friendly “Notice and Action” systems for reporting content such as child sexual abuse material and terrorist propaganda. Meta’s current processes are reported to involve unnecessary steps and excessive requirements, placing obstacles in the way of users trying to flag concerning content.
A particularly critical finding concerns what the Commission describes as the platforms’ use of “dark patterns”: deceptive interface designs crafted to confuse or dissuade users. These manipulative practices, which affect both data access requests and reporting mechanisms, raise serious questions about Meta’s compliance with the DSA. Experts suggest the deliberate friction may be a strategic attempt to minimize scrutiny and limit the removal of illegal content.
The Commission’s preliminary findings represent a potential watershed moment in the ongoing effort to regulate the digital landscape. They are not a final decision: the companies can now examine the case file and respond in writing before the Commission rules. Even so, the findings demonstrate the Commission’s willingness to enforce the DSA’s stringent requirements, and they are likely to fuel a broader debate about the power and responsibility of large social media platforms and the need for greater transparency and accountability in their content moderation strategies. If non-compliance is confirmed, the DSA allows fines of up to 6 percent of a company’s total worldwide annual turnover, underscoring the increasingly assertive regulatory environment facing the tech giants.