Ireland’s media regulator has opened an investigation into Elon Musk’s social media company X, examining whether the platform has failed to comply with the European Union’s Digital Services Act. The regulator, Coimisiún na Meán, announced that the inquiry will focus on whether X is properly responding to reports of illegal or harmful content under the DSA’s content moderation obligations. The authority said it had received information indicating that users’ complaints about illegal content were not addressed promptly or adequately. The probe will analyze the company’s reporting systems, transparency procedures, and internal compliance structures. It will also determine whether X is meeting its responsibilities as a “Very Large Online Platform,” a designation that places it under heightened regulatory scrutiny within the European Union.
The investigation stems from ongoing concerns about X’s handling of content moderation since Musk’s 2022 acquisition of the platform. Ireland’s regulator said its action was supported by findings from the nonprofit HateAid, which previously represented a researcher in Germany who was repeatedly suspended after reporting hate speech on the site. The organization claims X has failed to enforce consistent policies or provide sufficient protections for users who flag abusive or illegal posts. The Digital Services Act, which took full effect for large platforms earlier this year, requires companies to remove illegal content quickly, carry out and share risk assessments, and submit to regular independent audits of their moderation practices.
European officials have increased enforcement efforts against major technology companies amid concerns about misinformation and harmful content online. Under the DSA, violations can carry fines of up to six percent of a company’s global annual revenue. For X, whose EU operations are headquartered in Dublin, the investigation could have far-reaching implications for its regional governance and compliance models. Regulators are expected to examine whether the platform’s current staffing, moderation technology, and reporting tools meet the DSA’s standards for transparency and user protection. Irish authorities noted that the investigation will proceed independently but may coordinate with the European Commission and other national regulators as part of a wider effort to ensure consistent enforcement across member states.
The probe places X among a growing list of digital platforms facing formal investigations under the new EU regime, alongside Meta, TikTok, and Google. Since Musk’s takeover, X has significantly reduced its global moderation staff and altered its verification and advertising policies, drawing repeated criticism from European lawmakers and watchdog groups. The company has said it is committed to respecting local laws and fostering “free expression,” though European officials argue that speech rights must be balanced against public safety obligations. Coimisiún na Meán said it would release further updates as the case develops, with findings expected to set a precedent for future enforcement under the Digital Services Act.
