The European Union (EU) has opened an investigation into X (formerly Twitter) over lackluster moderation of illegal content and disinformation in the wake of the Israel-Hamas war. The move, first reported by the Financial Times, comes two days after EU Commissioner Thierry Breton sent an “urgent” letter to X owner Elon Musk asking the billionaire about the company’s handling of misinformation. The formal probe is the first under the newly minted Digital Services Act (DSA), which requires platforms operating in Europe to police harmful content — and can levy fines significant enough to give it teeth.
EU officials sent a series of questions to X that the company has until October 18 to answer. The commission says it will determine its next steps “based on the assessment of X replies.” The DSA, which passed into law in 2022, requires social media companies to proactively moderate and remove illegal content. Failing to do so could lead to periodic fines or penalties that, in X’s case, could total up to “five percent of the company’s daily global turnover,” according to the FT.
Researchers and fact-checkers have warned about widespread misinformation circulating on X following the Hamas attacks on Israel. Tuesday’s letter warned Musk about harmful content on X, signaling that Breton was prepared to use the DSA’s full muscle to enforce compliance. “Following the terrorist attacks carried out by Hamas against Israel, we have indications that your platform is being used to disseminate illegal content and disinformation in the EU,” Breton wrote. “Let me remind you that the Digital Services Act sets very precise obligations regarding content moderation.”
Musk’s response appeared to contain at least a whiff of snark. “Our policy is that everything is open source and transparent, an approach that I know the EU supports,” the X owner and Tesla CEO wrote. “Please list the violations you allude to on X, so that that [sic] the public can see them. Merci beaucoup.” Breton retorted, “You are well aware of your users’ — and authorities’ — reports on fake content and glorification of violence. Up to you to demonstrate that you walk the talk.”
X CEO Linda Yaccarino also responded to Breton’s letter, claiming the company has redistributed its resources and shuffled internal teams to address moderation issues surrounding the Middle East conflict. She said X has removed or labeled “tens of thousands of pieces of content” since the attacks commenced.
The CEO added that X has deleted hundreds of Hamas-aligned accounts from the platform and said the company works with counter-terrorism organizations. Yaccarino said X’s Community Notes, a crowdsourced moderation feature, is now supported on Android and the web (with iOS “coming soon”). She also claimed the company has “significantly scaled” a feature that sends notifications to people who liked, replied to or reposted something that later received a Community Note fact-check.
The EU’s newly opened probe also asks how X is prepared to react during a crisis and what procedures it has for handling associated misinformation. The company reportedly has until the end of October to respond to that line of questioning.
Breton isn’t focusing exclusively on X. The commissioner also sent letters to Meta CEO Mark Zuckerberg and TikTok owner ByteDance this week, reminding them of their obligations under the DSA in the wake of the Middle East bloodshed.