EU Investigates Meta's Handling of Political Content and Disinformation
The European Commission has opened a formal investigation into Meta, the parent company of Facebook and Instagram, to assess its compliance with the EU’s Digital Services Act (DSA). The probe comes amid concerns about the company’s handling of political content and its efforts to combat disinformation on its platforms, particularly in the run-up to the European Parliament elections in June.
Concerns Over Meta's Content Moderation and Transparency
The commission has raised several concerns regarding Meta’s practices. One of the main issues is the company’s moderation of deceptive advertisements and disinformation campaigns. The investigation will examine whether Meta’s current approach to moderating these types of content meets the requirements set out in the DSA.
Another area of concern is the transparency of Meta’s content moderation procedures. The commission will assess whether the company is providing sufficient information to users about its moderation practices and the reasons behind its decisions to demote certain political content and accounts.
The EU has also expressed concern over the lack of an effective third-party, real-time tool for monitoring civic discourse and elections. This issue has come to the forefront as Meta plans to deprecate its CrowdTangle tool, which researchers, journalists, and civil society organizations have used to track potential misinformation on the company’s platforms.
Potential Consequences for Meta Under the DSA
As a Very Large Online Platform (VLOP) under the DSA, Meta is subject to stricter controls and potentially significant fines if found to be non-compliant with the regulations. The company could face penalties of up to 6% of its global turnover if it fails to meet the DSA’s requirements for combating disinformation and ensuring transparency.
In response to the investigation, a Meta spokesperson has stated that the company has a well-established process for identifying and mitigating risks on its platforms. The spokesperson added that Meta looks forward to continuing its cooperation with the European Commission and providing further details of its work in this area.
The commission has given Meta five working days to inform the EU about the remedial actions it has taken to address the concerns raised in the investigation.
EU's Efforts to Combat Disinformation Ahead of Elections
The decision to launch this investigation into Meta is part of the EU’s broader efforts to clamp down on disinformation ahead of the upcoming European Parliament elections. The commission has recently conducted a “stress test” to assess the readiness of various platforms to address manipulative behavior during the election period.
The probe into Meta follows similar investigations launched by the EU into other tech companies, such as X (formerly Twitter) and TikTok, as the bloc seeks to ensure that these platforms are taking adequate measures to combat disinformation and protect the integrity of the electoral process.