The European Union on Monday announced a formal investigation into X, the social media platform owned by Elon Musk, over its failure to counter prohibited content and disinformation, a lack of transparency about advertising and “deceptive” design practices.
The inquiry is perhaps the most significant regulatory move to date against X, which loosened its content moderation policies after Mr. Musk bought the service, formerly known as Twitter, last year. The company’s new policies have led to an increase in incendiary content on the platform, according to researchers, causing many brands to pull back their advertising.
In pursuing X, the European Union used for the first time the authority obtained after the passage of the Digital Services Act last year. The law gives regulators sweeping new powers to force social media companies to police their platforms for hate speech, misinformation and other divisive content.
The European Commission, the executive arm of the 27-nation bloc, had signaled its intention to take a closer look at X’s business practices. In October, regulators began a preliminary investigation into the spread of “terrorist and violent content and hate speech” on X after the start of the Israel-Gaza conflict.
The investigation highlights deep differences between the United States and Europe in policing the internet. While online posts are largely unregulated in the United States as a result of free speech protections, European governments, for historical and cultural reasons, have placed more restrictions on hate speech, incitement to violence and other harmful material.
The Digital Services Act is an attempt by the EU to force companies to put in place procedures to comply more consistently with the rules around such content online.
Monday’s announcement was the start of an investigation without a specified deadline. The inquiry is expected to include interviews with outside groups and requests for more evidence from X. If the company is found to have violated the Digital Services Act, it could be fined up to 6 percent of its global revenue.
EU officials said X may not be complying with rules that require online platforms to respond quickly after learning about prohibited and hateful content, such as antisemitism and incitement to terrorism. The law also requires companies to conduct risk assessments regarding the spread of harmful content on their platforms and implement mitigation measures.
Officials have also expressed concerns about X’s content moderation policies in non-English languages, particularly as continent-wide elections in 2024 approach.
In addition, the investigation will examine X’s efforts to address the spread of misinformation. The company relies on a feature, called Community Notes, that allows users to add context to posts they believe are misleading, an approach that EU officials have said may not be enough. Regulators will also look at ways in which posts by X users who pay to be authenticated, symbolized by a blue check mark, are given more visibility.