ill communications

Meta to face EU probe for not doing enough to stop Russian disinformation

Insufficient moderation of political ads risks undermining the electoral process.

Javier Espinoza, Financial Times
montage of EU flag and Meta logo
Credit: FT

Brussels is set to open a probe into Meta’s Facebook and Instagram as soon as Monday over concerns the social media giant is failing to do enough to counter disinformation from Russia and other countries.

Regulators suspect that Meta’s moderation does not go far enough to stop the widespread dissemination of political advertising that risks undermining the electoral process, the European Commission is expected to say on Monday, two people with knowledge of the matter said.

EU officials are particularly worried about the way Meta’s platforms are handling Russia’s efforts to undermine upcoming European elections. The commission, however, is not expected to single out Russia in its statement and will only make reference to the manipulation of information by foreign actors.

EU officials also fear that the company’s mechanism to let users flag illegal content is not easily accessible or user-friendly enough to comply with the EU’s Digital Services Act, the bloc’s landmark legislation designed to police content online.

The law, approved in 2022, includes measures to force platforms to disclose what steps they are taking to tackle misinformation or propaganda. If the EU finds Meta to be in breach of the Act, it could be fined up to 6 percent of its global annual turnover.

The move represents the latest regulatory action taken by the commission against Big Tech groups, as fears grow among member states that Russia is pushing disinformation on social media to undermine democracy ahead of Europe-wide elections in early June.

The commission is to start the investigation based on a report Meta sent in September on how it is handling disinformation risks on its platforms, as well as on the EU's own assessment.

The investigation will assess whether the way Facebook and Instagram place political content on their sites is compliant with the law.

Investigators will look into whether Meta has failed to mitigate risks as it moves to discontinue CrowdTangle, a tool that shows publishers how content spreads across its platforms, and will also outline concerns about how Meta tracks disinformation to support fact-checkers and journalists.

The commission is expected to give Meta five working days to say how it will address the concerns or face the threat of measures under the DSA, the people said.

There is no set deadline for the investigation to end, and its duration will depend on Meta’s willingness to cooperate, the EU is expected to say.

“We have a well-established process for identifying and mitigating risks on our platforms,” said Meta. “We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

The commission did not reply to a request for comment. The timing of the announcement can still shift, the people said.

The Meta probe follows a separate investigation into X over illegal content and disinformation, including violent and terrorist content, spreading on its platform after Hamas’s October 7 attacks against Israel.

It also comes after regulators imposed election safeguards aimed at countering online threats to the integrity of electoral processes. Under those guidelines, social media platforms such as X and Meta are required to scrutinize the risks of online disinformation across the bloc.

© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.

