Election Interference in the Age of Hybrid Warfare
- Isabel Rodenas
- Jul 8
- 6 min read
Updated: Aug 7

With half the world’s voting population having gone to the polls in 2024 and many more preparing for national elections in the coming years, democracy is facing its most complex and volatile period since the Cold War. Election interference has evolved into a multidimensional threat, ranging from AI-generated disinformation to authoritarian states deploying cyber operatives.
Hybrid warfare, which blends conventional tactics with cyber, information, and psychological operations, now routinely targets electoral integrity. Democracy, long considered resilient in Western nations, is now being tested by an ever-changing threat landscape that blurs the distinctions between foreign and domestic, digital and physical, and truth and fabrication. The next five years will be a critical period for democratic systems. This blog post examines the primary risks of election interference in 2025 and beyond, analysing the involvement of authoritarian regimes, digital oligopolies, and the misuse of emerging AI technologies, and offers policy responses for key stakeholders.
Dissecting the Multi-Vector Threat
Hybrid Warfare and Foreign Interference
The modern battlefield for democracy is multifaceted, unpredictable and deeply interconnected. By 2030, authoritarian regimes, such as those in China, Russia and Iran, are expected to deploy increasingly sophisticated strategies to disrupt democratic elections. These strategies include cyberattacks, digital espionage, economic coercion and deepfake campaigns, blurring the lines between overt aggression and covert manipulation.
A defining feature of these operations is their ability to masquerade as internal dissent. Proxy influencers, bot networks and compromised platforms enable foreign operatives to inflame domestic polarisation while remaining largely invisible. A recent EU study revealed that these tactics can affect even well-established democracies, exerting sustained and subtle pressure on electoral processes.
Authoritarian Regimes Exploiting Openness
Autocratic actors exploit the strengths of open societies — such as free expression, decentralised flows of information, and accessible media — to sow division and disinformation. Using tactics such as information laundering, adversarial regimes can insert false or altered information into legitimate news cycles and gradually shape public opinion without being detected. Russia’s interference in the 2016 US election is a stark example of this strategy. The combination of state propaganda and viral content creates a feedback loop that undermines consensus and is cloaked in plausible deniability.
Open societies will remain vulnerable to such subversion if they stay unguarded.
Although authoritarian regimes exploit openness for political gain, the ecosystem they operate in is largely built and maintained by private technology platforms that are neither adversarial nor politically aligned. Their business models, however, can inadvertently facilitate manipulation.
Polarisation and Internal Erosion
Perhaps the most troubling trend is the domestic adoption of narratives that originate abroad. Domestic political actors are increasingly amplifying misinformation originally spread by hostile states, thereby exacerbating societal divisions. This is reflected in the rise of extreme right-wing parties in Europe, such as the FPÖ in Austria and VOX in Spain, which echo the hateful messages of the American MAGA movement and portray foreigners as the cause of all of a country's economic problems.
By 2030, democracies may face the collapse of a shared reality, where election outcomes are dismissed not because of actual fraud, but because the electorate no longer agrees on basic facts. Slovakia's 2023 elections vividly illustrated this dynamic. Deepfakes and conspiracy theories circulated widely, with domestic actors exploiting them for short-term political gain and accelerating the long-term decay of democracy.
Regime Instability via Institutional Decay
Beneath these digital threats lies an even greater vulnerability: the weakening of democratic institutions themselves. Public trust in legislatures, the judiciary and independent journalism is declining, creating an environment where electoral outcomes can be openly contested and dismissed.
The insurrection at the U.S. Capitol following the 2020 election was a watershed moment. The fact that even one of the world's most established democracies can experience such democratic turmoil and decline should serve as a warning to us all. Without urgent efforts to restore institutional credibility, similar events are likely to happen again.
Digital Oligopolies and Platform Governance
Although technology companies do not operate like authoritarian states with formal geopolitical objectives, they significantly impact democratic processes. Through platform design, algorithmic curation, content moderation policies and selective enforcement, these companies have become powerful political actors in their own right, often shaping public discourse more profoundly than traditional institutions.
Algorithms optimised for engagement tend to prioritise the most divisive or sensational content, fuelling polarisation and misinformation. Meanwhile, inconsistent moderation and opaque internal policies raise concerns about ideological bias and selective censorship. The companies' lobbying efforts and resistance to regulation further complicate their role as stewards of the public sphere.
Although initiatives like the EU’s Digital Services Act and AI Act aim to introduce accountability, enforcement remains fragmented. Without deeper structural reform and transparency, tech platforms risk becoming entrenched as self-interested gatekeepers of political reality, whether they claim that role or not.
AI Misuse and Cognitive Warfare
One of the most urgent emerging threats is the use of artificial intelligence to manipulate perception. Deepfakes, automated disinformation and AI influencers are flooding the information landscape, rendering traditional fact-checking and verification tools ineffective.
For example, during India’s 2024 elections, AI-generated videos spread rapidly across social media, deceiving millions before any countermeasures could be mobilised. Without technological safeguards that evolve in tandem, democracies could descend into 'reality nihilism', where citizens dismiss inconvenient truths as readily as lies.
The Tech Governance Vacuum
Despite the growing number of threats, the global response remains fragmented. While existing frameworks such as the EU AI Act set essential precedents, there is currently no enforceable international regime governing digital election integrity.
This vacuum enables malicious actors to exploit regulatory gaps and jurisdictional inconsistencies. As long as governance remains fragmented, even the most robust democracies will be vulnerable to attack through the weakest links in the global chain.
Policy Recommendations to Fortify Democratic Infrastructure
These complex threats require a unified, cross-sector response. Governments must take the lead by introducing mandatory transparency standards for AI-generated content, such as clearly labelling and making traceable all synthetic media during election cycles. They should adopt zero-trust cybersecurity frameworks to protect the electoral infrastructure from sophisticated cyberattacks and insider threats. Furthermore, establishing bipartisan electoral defence task forces would enable a rapid, coordinated response to disinformation campaigns and digital provocations, ensuring public trust is maintained in real time.
Multilateral institutions also have a unique role in establishing and maintaining global norms. Negotiating new treaties focused on AI weaponisation and election interference, analogous to traditional arms control agreements, would signal clear red lines. These institutions must also enable cross-border coordination, including intelligence sharing and joint enforcement against transnational disinformation networks. A critical pillar of this global defence could be the establishment of a UN-style Electoral and AI Integrity Observatory, which would serve as a neutral hub for monitoring elections, sharing early warnings and offering best practices. This body could become the strategic backbone of democratic integrity in the years ahead.
Technology companies must assume greater responsibility as custodians of the digital public sphere. Implementing real-time content provenance systems, such as the C2PA standard, would help track and verify the origin of political media. Independent audits of algorithmic behaviour, particularly with regard to political content that influences public opinion, should become standard practice. Furthermore, companies must open their political advertising data to researchers and journalists, so that targeting and spending patterns that could expose covert influence operations can be scrutinised before they escalate.
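The core idea behind content provenance can be sketched in a few lines: a manifest binds a media file's cryptographic hash to metadata about its origin, and a signature over that manifest lets anyone detect later tampering. The sketch below is a deliberately simplified illustration, not the C2PA protocol itself: it uses an HMAC with a shared demo key in place of C2PA's certificate-based signatures, and all names (`sign_manifest`, `verify_manifest`, the key) are hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical shared key for demonstration; real provenance systems
# such as C2PA use public-key certificates instead.
SECRET_KEY = b"demo-signing-key"

def sign_manifest(media_bytes, metadata):
    """Build a provenance record binding metadata to the media's hash."""
    manifest = {"media_sha256": hashlib.sha256(media_bytes).hexdigest(), **metadata}
    payload = json.dumps(manifest, sort_keys=True).encode()
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"manifest": manifest, "signature": signature}

def verify_manifest(media_bytes, record):
    """Check that the signature is intact and the media is unaltered."""
    payload = json.dumps(record["manifest"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["signature"]):
        return False  # manifest was tampered with after signing
    return record["manifest"]["media_sha256"] == hashlib.sha256(media_bytes).hexdigest()

video = b"\x00\x01example media bytes"
record = sign_manifest(video, {"source": "Example Newsroom", "tool": "camera-v1"})
print(verify_manifest(video, record))         # True: media matches its manifest
print(verify_manifest(video + b"x", record))  # False: media altered after signing
```

Even this toy version shows why provenance helps against deepfakes: any edit to the media after signing invalidates the check, so unverifiable political media can be flagged or down-ranked during election periods.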
Civil society and academia also play vital roles in strengthening democratic resilience. Media literacy initiatives must be strengthened to help citizens identify deepfakes, bots, and algorithmic manipulation. Universities, think tanks, and NGOs should collaborate to establish watchdog consortia that can issue real-time alerts on disinformation trends. Similarly, digital campaign observatories could provide transparency on advertising and subtler techniques such as micro-influencer strategies.
Reclaiming Democratic Sovereignty
Election interference in 2025–2026 is not just a foreign policy or cybersecurity issue but an existential test of democratic cohesion. Technological disruption, geopolitical rivalry, and institutional fragility collide in ways that threaten the idea of a “free and fair” vote. Democratic actors must not only harden systems but also revitalise public trust.
The most urgent priority is achieving strategic alignment among stakeholders. Governments, tech companies, civil society, and multilateral bodies must move from reactive defence to proactive coordination. Transparency mandates, global norms, and technological safeguards are no longer optional; they are essential to preserving the legitimacy of future elections.
A UN-style Electoral and AI Integrity Observatory could provide the foundation for a coordinated global defence strategy. Reclaiming democratic sovereignty cannot be delayed; it must begin now. Vigilance alone is not enough. Proactive, systemic reforms are the only way to preserve electoral sovereignty in the digital age.