EU accuses TikTok and Meta of breaking transparency rules under Digital Services Act
Technology


The European Commission preliminarily found TikTok and Meta breached transparency rules, failing to provide researchers adequate data access under the DSA.

October 24, 2025 - 11:13 AM ET • 3 min read

The European Commission, the executive arm of the European Union, announced Friday that it had preliminarily found U.S. tech giants Meta and TikTok in breach of transparency obligations mandated by the landmark Digital Services Act (DSA). The findings mark a significant step in the EU's sustained efforts to regulate the world's largest online platforms.

In the Commission's preliminary view, both social media companies violated their legal duty to grant researchers "adequate access" to publicly available data, a core requirement under the DSA. This legislation is central to the EU's strategy for holding Very Large Online Platforms (VLOPs) accountable and ensuring platform safety and transparency across the bloc.

The DSA mandates that VLOPs, which include Facebook, Instagram, and TikTok, must provide data access to vetted researchers studying systemic risks, such as the spread of disinformation, election interference, or the impact of platform design on user well-being. Officials noted that the alleged failure to provide sufficient access hinders the independent scrutiny necessary for understanding the societal impact of these platforms.

Beyond the data-access issue it shares with TikTok, Meta, the parent company of Facebook and Instagram, was preliminarily found in breach of two further obligations related to user interaction and content moderation procedures.

The Commission stated that Meta allegedly failed to provide users with simple and effective mechanisms to notify its platforms of illegal content. It also found preliminary evidence that Meta was not allowing users to effectively challenge the company's content moderation decisions. These requirements are designed to empower users and ensure due process when content is removed or restricted.

The Digital Services Act is among the EU's most ambitious pieces of legislation aimed at curbing the power and influence of major technology companies. Enacted to create a safer and more accountable online environment, the DSA imposes strict rules on content moderation, transparency, and risk mitigation for platforms that reach a significant portion of the EU population.

The preliminary findings announced on Friday do not constitute a final decision, but they initiate a formal process that could lead to substantial financial penalties if the breaches are confirmed. Under the DSA framework, companies found to be non-compliant can face fines reaching up to 6% of their global annual turnover. The Commission has previously opened numerous investigations under the DSA and related legislation, signaling a sustained commitment to enforcing its new regulatory framework against major global technology firms.