In a significant move, EU industry chief Thierry Breton has initiated an investigation into Elon Musk’s social media platform, X, marking the first such probe under the new EU tech rules. This action follows Breton’s earlier criticism of TikTok and Meta for their perceived inadequacy in addressing the surge of disinformation after the Hamas attack on Israel.
Digital Services Act Enforcement and Social Media Platforms Under Scrutiny Amid Israel-Hamas Conflict
The Digital Services Act, which entered into force in November 2022, requires large online platforms and search engines to take stricter action against illegal content, threats to public security, and manipulative techniques.

All three platforms have seen a rise in false content related to the Israel-Hamas conflict, with X reportedly the most affected, according to social media researchers. Thierry Breton’s decision to investigate X intensifies the pressure on TikTok and Meta to remove illegal and harmful content in compliance with the Digital Services Act.
X CEO Takes Action Against Hamas-Affiliated Content Amid EU Investigation
X CEO Linda Yaccarino announced on Thursday that the platform has already taken steps to remove hundreds of Hamas-affiliated accounts and label or eliminate tens of thousands of pieces of content since the attack, responding to Breton’s concerns.
“We have sent X a formal request for information, a first step in our investigation to determine compliance with the Digital Services Act,” said Breton in a post on X. However, X has chosen not to comment on the matter.
Elon Musk has also weighed in, saying the EU has not provided any examples of disinformation on the platform. A European Commission spokesperson has not yet responded to Musk’s remarks.
X Faced Critical Deadlines and Challenges in Crisis Response Amidst Musk’s Cost-Cutting Measures
The initial deadline for X to disclose details about its crisis response protocol was October 18th, with an additional cutoff of October 31st for addressing other issues. Musk’s decision to restrict free academic access to X’s data tool has made it harder for researchers to track keywords and hashtags, forcing them to sift through content manually.

In response to feedback from the research community and the EU, X has recently announced plans to reinstate free academic access to the data tool by the end of November. The platform commits to enhancing the tool’s data quality and transparency while providing increased guidance and support for researchers.
Simultaneously, the European Commission has confirmed the receipt of X’s response regarding the activation and functioning of its crisis response protocol. The Commission will now analyse the provided information and determine the next course of action, potentially involving formal proceedings and sanctions.
Musk’s Workforce Reductions at Twitter Echo Across X
Musk’s cost-cutting measures after taking over Twitter, which reduced the workforce from roughly 7,500 to 1,500 employees, have also affected content moderation and the identification of coordinated propaganda campaigns on X.
X has lost staff in its trust and safety division and risks fines of up to 6% of its global turnover if found in breach of the Digital Services Act.
Breton’s Ultimatum Follows Warning Letters to X and Meta
On a parallel track, Thierry Breton issued a 24-hour ultimatum to TikTok CEO Shou Zi Chew to intensify efforts to remove illegal content. Breton’s warning, conveyed in a letter, echoes similar messages sent earlier in the week to X owner Elon Musk and Meta CEO Mark Zuckerberg.
The letter to TikTok highlights indications of the platform being used to disseminate illegal content and disinformation after the Hamas attacks. Breton emphasises TikTok’s particular obligation to protect children and teenagers from violent and graphic content circulating without appropriate safeguards.
TikTok and Meta have been officially instructed to provide information to the EU regarding the potential spread of disinformation on their platforms concerning the Israel-Gaza conflict. While the previous 24-hour request lacked legal force, this latest demand carries more weight. Both companies now have a week to respond. Failure to comply could lead to a formal investigation under the new tech rules.
EU Escalates Pressure on Social Media Giants to Combat Terrorism-Related Content After Hamas Attack on Israel
The EU’s heightened concern about the potential spread of terrorist and violent content, along with hate speech, following the Hamas attack on Israel underscores the urgency of addressing disinformation on social media platforms.
A spokesperson from TikTok stated, “We’ll publish our first transparency report under the new law next week, where we’ll include more information about our ongoing work to keep our European community safe.” Meta also affirmed its commitment to platform safety and cooperation with third-party fact-checkers.
This latest demand from the EU comes a week after a similar inquiry into X, which said it had removed hundreds of Hamas-affiliated accounts. Social media platforms including X, Google, TikTok, and Meta have each received letters from EU Commissioner Thierry Breton, giving them specific deadlines to respond under the Digital Services Act.
The Future Effects of Digital Services Act Scrutiny
Both Meta and TikTok have promptly responded to the EU’s requests, affirming their commitment to fulfilling their obligations under the Digital Services Act. Meta detailed several proactive measures it has taken to mitigate the risks of illegal and harmful content on its platforms. These include the removal of terrorist and violent content, efforts to enhance transparency and accountability, and collaborative initiatives with industry peers and authorities.
Likewise, TikTok outlined various protective measures implemented to safeguard users, particularly minors, from inappropriate and harmful content. These measures include age verification, parental controls, robust content moderation practices, and educational campaigns.
The EU Commission has announced its intention to thoroughly analyse the responses from Meta and TikTok before determining the next course of action. Potential steps include opening formal investigations and imposing sanctions if the platforms are found to be non-compliant with the Digital Services Act.