The European Commission recently announced that both TikTok and Meta Platforms, the parent company of Facebook and Instagram, have violated transparency obligations set out in the Digital Services Act (DSA). This legal framework was established to enhance accountability among digital platforms, aiming to ensure that they operate transparently and responsibly.
According to the Commission, TikTok and Meta have failed to provide adequate access to public data for researchers, which is a key requirement under the DSA. This lack of access limits researchers’ ability to analyze online content and its impacts, especially concerning misinformation and user safety. Such transparency is essential for understanding how these platforms function and for assessing their influence on public discourse.
Moreover, the Commission highlighted that Meta has not offered users straightforward mechanisms to report illegal content. In today’s digital age, where harmful or illicit content can spread rapidly, easy reporting tools are crucial. Users should be able to navigate the process of reporting content that violates platform guidelines without unnecessary complexity. Additionally, the absence of clear procedures for users to contest moderation decisions raises concerns about the fairness and accountability of content moderation practices employed by the platform.
Meta representatives have responded to these allegations by asserting that they have made significant changes to comply with the DSA regulations. They argue that they are committed to improving transparency and user safety. On the other hand, TikTok contends that it has provided researchers with access to necessary data through various tools, although the Commission’s assessment suggests otherwise.
In light of these findings, the European Commission has invited both companies to address these preliminary conclusions. The importance of this dialogue cannot be overstated, as it will determine the next steps for TikTok and Meta in their compliance efforts. Should the Commission ultimately decide that the companies are not in compliance with the DSA, they could face substantial fines — potentially amounting to 6% of their global annual revenue. This serves as a strong signal to digital platforms about the serious implications of non-compliance with European regulations.
This scrutiny of social media giants is part of broader efforts by the European Union to regulate digital platforms and enforce stricter compliance measures aimed at protecting users, including minors, from online risks. The DSA represents a comprehensive approach to digital regulation, seeking to ensure that platforms take responsibility for the content shared on their sites. By holding companies accountable, the EU aims to foster a safer and more transparent online environment.
As the digital landscape continues to evolve, the importance of responsible practices by major platforms cannot be overstated. Users deserve assurance that the services they engage with are safe, transparent, and fair. The ongoing challenges and discussions surrounding these regulatory frameworks will likely influence the future operations of tech companies, shaping how they manage content moderation, user reporting, and overall transparency. The outcome of this situation will be closely watched, as it may set important precedents for digital regulation not only in Europe but globally.