EU Takes Aim at Big Tech
The European Commission has issued guidelines aimed at X, TikTok, Facebook, and other major online platforms, urging them to address election risks and combat voter disinformation. The guidelines, adopted on Tuesday, target platforms with more than 45 million monthly active users in the EU, which the Digital Services Act (DSA) designates as ‘Very Large Online Platforms’ and ‘Very Large Online Search Engines’. They outline potential measures to tackle election-related risks, harmful AI content, and misleading political advertising.
Specific guidance is provided for the pan-EU elections in June, reflecting concerns about increased interference and misinformation online. While the guidelines are not legally binding, the Commission retains the authority to open formal proceedings against platforms that fail to comply with the DSA’s provisions on elections and democratic processes, and it can fine non-compliant platforms up to 6% of their global turnover.
As part of a coordinated effort, Brussels is pushing to curtail the industry’s reliance on self-regulation, which has often been criticized as complacent and insufficient. The aim is to compel Big Tech to take more decisive action in upholding democratic values.
A senior EU official emphasized that the guidelines were prompted by the “threat” to the integrity of elections within the bloc, especially with the rapid emergence of generative AI and the dissemination of misleading deepfake content, both of which can exacerbate divisions within European societies.
EU’s Measures Against Deepfakes and Misinformation
For instance, last October a deepfake video circulated showing a candidate in the Slovak elections appearing to claim that he had manipulated the vote, posing a significant risk to the democratic process.
Under the newly established framework, platforms will be obligated to promptly identify and signal such high-risk situations through a new incident response mechanism. They are also mandated to cooperate with both European and national authorities, as well as independent experts and civil society organizations, to effectively tackle emerging threats.
Additionally, the Commission is concerned about recommender systems, which use machine learning to rank content and can amplify divisive, harmful, or misleading material with viral potential. The guidelines call on platforms to design such systems to give users “meaningful choices and controls over their feeds.”
The EU executive is also wary of AI-powered chatbots spreading inaccurate election information, commonly referred to as “hallucinations,” according to the official.
A 2023 study by the non-profit organizations AI Forensics and AlgorithmWatch raised concerns about Microsoft’s Bing Chat, since rebranded as Microsoft Copilot. It found that the chatbot answered one-third of election-related questions inaccurately, with errors ranging from wrong election dates and candidate details to fabricated controversies about candidates.
The guidelines were released ahead of the European Parliament elections, following consultations in which platforms were invited to give feedback on the draft.
Ahead of June’s ballot, several companies, including Google, Meta, and TikTok, have implemented election safeguards, such as election centers to counter misinformation. Starting next month, TikTok plans to send push notifications to its European users directing them to an in-app election center for trusted information and media literacy tips.
EU Platform Stress Tests for Multilingual Elections
The Commission plans to conduct stress tests on the rules with “relevant platforms” by the end of April, though it hasn’t specified which platforms will participate.
With 370 million eligible voters across 27 member states heading to the polls in June, Brussels is concerned about the strain on platform resources, particularly the need for content moderators fluent in the EU’s 24 official languages.
For example, X’s latest transparency reports reveal a scarcity of content moderators fluent in certain EU languages. This linguistic complexity renders the European elections “particularly vulnerable,” according to the senior official. These developments occur amidst the largest election year in history, with over 2 billion voters expected worldwide.
The official acknowledged that while DSA compliance carries costs, extending the same rules beyond the EU would add little additional expense, so platforms may consider applying similar safeguards globally.