The European Commission has taken a consequential step in the early enforcement of the Digital Services Act (DSA). On 5 December 2025, the Commission imposed a €120 million fine on X (formerly Twitter), concluding that the platform’s paid “blue checkmark,” its sparse and unwieldy advertising repository, and its barriers to researcher access together formed a pattern of conduct at odds with the DSA’s core principles of transparency and accountability.
X has been subject to the DSA’s most stringent requirements since its designation as a Very Large Online Platform (VLOP) in April 2023. The Commission’s inquiry unfolded over two years, following a familiar investigative arc of risk assessment reports, responses to information requests, and, ultimately, the opening of formal proceedings in December 2023. By the summer of 2024, preliminary findings already hinted at the likely contours of the case, namely, concerns over deceptive design practices (Article 25 DSA), inadequate advertising transparency (Article 39 DSA), and barriers to researcher access to public data (Article 40(12) DSA).
The Commission’s final decision confirms those early signals. The paid verification badge, it found, projected an aura of credibility and identity-checking that simply did not exist, leaving users exposed to impersonation and manipulation. The advertising repository, conceived as a window into the platform’s political and commercial currents, was judged instead to be incomplete, labyrinthine, and incapable of supporting meaningful scrutiny. And for vetted researchers, access to public data remained elusive, discouraged by terms and technical processes that blunted the very purpose of the DSA’s transparency framework.
X now faces a pair of deadlines: 60 working days to remedy the deceptive architecture of its verification system, and 90 working days to submit a detailed action plan addressing advertising transparency and researcher access. That action plan will be assessed in consultation with the European Board for Digital Services. Failure to follow through may invite periodic penalty payments.
Political and Transatlantic Reactions
The decision has reverberated far beyond Brussels. Elon Musk, the owner of X, responded with sweeping denunciations, calling for the dissolution of the European Union and portraying the decision as censorship dressed in regulatory garb. US political figures soon joined the chorus, framing the case as an assault on US technology giants.
The Commission, however, has declined to be drawn into this theatre of geopolitics. Its public statements have remained restrained, almost legalistic in tone, emphasising that the matter concerns consumer protection, transparency, and risk mitigation, not political speech. Executive Vice-President Henna Virkkunen reiterated that misleading trust signals, opaque advertising structures, and restrictive data access practices erode the fragile trust that sustains online environments. Public criticism, she noted, does not alter legal obligations.
Why the Decision Matters
As the first non-compliance decision under the DSA, the ruling is likely to become a touchstone for future enforcement. It signals how the Commission interprets several foundational provisions:
- Verification mechanisms that imply identity checks may be deemed deceptive if the underlying process is purely transactional.
- Advertising repositories must be genuinely usable, containing complete, structured information sufficient to support accountability.
- Researcher access must be real in practice, not hollowed out by restrictive terms of service or technical friction.
Equally important is what the case reveals about the Commission’s supervision of VLOPs. The progression from designation to information-gathering, to preliminary findings, and finally to a non-compliance decision, with post-decision monitoring, offers a road map that major platforms should expect to see replicated.
Key Compliance Takeaways
Several practical lessons emerge for platforms navigating the DSA’s maturing enforcement landscape:
- Trust signals must match reality. Verification badges and similar indicators cannot suggest identity or quality checks that are not undertaken.
- Advertising transparency must be functional, not symbolic. Repositories must be complete, searchable, and barrier-free, including information on the advertisement’s content and paying entity.
- Researcher access must be effective. Terms of service and API conditions must support, rather than stifle, vetted researchers’ ability to obtain public data.
- Expect ongoing oversight. Once proceedings begin, sustained supervision, information requests, preliminary findings, and follow-up obligations are likely. Internal governance should be calibrated to withstand prolonged regulatory engagement.
- Political noise will not blunt enforcement. The Commission’s steadfast response to criticism from Musk and US political figures suggests the DSA will be enforced with vigour, regardless of external pressure.
This informational piece, which may be considered advertising under the ethical rules of certain jurisdictions, is provided on the understanding that it does not constitute the rendering of legal advice or other professional advice by Goodwin or its lawyers. Prior results do not guarantee similar outcomes.
Contacts
- Stephen C. Mavroghenis, Partner (/en/people/m/mavroghenis-stephen-c)
- Athena Kontosakou, Partner (/en/people/k/kontosakou-athena)
- Hyunseok Doh, Associate (/en/people/d/doh-hyunseok)