Amidst the Russian invasion of Ukraine and an unstable partnership with the United States, Europe is consumed by rearmament and the rebuilding of defence-industrial supply chains. In this determined quest for strategic autonomy, European Union member states are prioritising spending on military research and development, ammunition production, and military mobility.
While such investments are important, in this fervour of defence spending, European countries must not lose sight of the more insidious and piecemeal Russian-sponsored activities. These low-complexity operations aim not to defeat but to distract, diverting public attention away from uniting against Russia as a common adversary. Defending the information terrain and enhancing cognitive resilience is foundational to European security.
Russian subversion in Europe
Russian-linked sabotage operations have risen sharply since 2022. Prominent examples include tampering with cables on railway tracks in the Netherlands before June’s NATO Summit, burning down commercial outlets in Poland and Lithuania, coordinated operations targeting the French railway network ahead of the Olympic Games, and a possible arson attack on the house of the UK prime minister. Blending information manipulation and physical tactics, Russia sows chaos and discord within European countries.
While the primary effects of an arson attack and a cyber operation differ, we must understand them as part of the same Russian playbook of psychological manipulation. The intended secondary effects are the same: to distract and weaken Western democratic societies and erode international support for Ukraine.
Moreover, these operations are coordinated by the same actors. For example, the Russia-linked group UAC-0050 (also known as the ‘Fire Cells Group’) has been involved in theft, cyber espionage, bomb threats to Ukrainian institutions, and even violent sabotage operations.
After failures in Ukraine led to a 2022 shakeup in the Russian intelligence services, Russia resorted to recruiting ‘disposable agents’ using the Telegram messaging app. These agents, often tied to organised crime, perform low-complexity, high-visibility operations, consuming the attention of the media and ordinary citizens.
Attributing these activities is difficult: the agents have little to no formal training, murky chains of command, and rarely perform repeat missions. The details of Russian involvement in the French railway operation before the Olympics, for instance, remain unconfirmed. Moreover, publicising attribution risks amplifying the psychological effects of such operations and serving the interests of Russian strategy.

Cyber and information operations
Russia is also using malware and information operations to bolster the war effort and fracture European domestic publics. It engages in a patchwork of operations, often run by different units in competition with each other, seeking to sow confusion and elicit emotional reactions.
Multiple governments have accused the Russian hacking group known as APT 28 of using ransomware against Western logistics entities and technology companies involved in delivering aid to Ukraine, and of conducting ‘active measures’ against Czechia and Germany. Dutch intelligence services recently revealed that the Russian group ‘Laundry Bear’ had targeted the Dutch police, as well as multiple defence-related companies and military units. There are also strong indications that Russia is linked to an attack infrastructure known as DDoSia, a distributed denial-of-service toolkit that has hit over 486 websites, often in reaction to political developments supporting Ukraine.
Interference in democratic processes likewise remains high on the Russian agenda. For example, the ‘Lying Pigeon’ threat actor launched a highly personalised email campaign in the fragile democracy of Moldova ahead of the 2024 presidential elections and EU referendum, combining both disinformation and malware tactics. The result of the referendum was unexpectedly close, with just 50.46% of voters supporting EU membership.
Ahead of the presidential election in Poland, meanwhile, Russia engaged its well-known polarisation arsenal. Russian efforts focused on (mis)representing liberal candidate Rafal Trzaskowski as a keen advocate for welfare increases for Ukrainian migrants. Russian actors also targeted critical infrastructure in an attempt to spread doubt about the competence of local authorities.
In Romania, Russia harnessed TikTok to support independent candidate Calin Georgescu, who unexpectedly won the first round of the 2024 presidential elections. According to declassified documents from Romanian intelligence, Russia supported a highly curated and expensive social media campaign involving tens of thousands of fake TikTok accounts.
This year’s Hague TIX conference, which I coordinated, brought together researchers and practitioners to examine precisely these subtler threats: sabotage operations, AI-enhanced influence campaigns, ransomware with geopolitical intent, and the quiet embedding of malware into Europe’s digital infrastructure. However, these sorts of dialogues are just the beginning.
Europe needs an effective cognitive defence strategy
Russian psychological tactics, whether via malware, sabotage or information operations, require a psychological response. European governments should prioritise cognitive defence.
Public education campaigns on the workings of social media algorithms – which are instrumental not only for information operations but also for amplifying the psychological effects of sabotage and malware interference – ought to be a priority. TikTok and YouTube have become the foremost news sources for young people across Europe. TikTok users in the UK spend an average of 49 hours and 29 minutes per month on the app, more than any other social media app. Infinite scrolling, short videos, and highly personalised ‘For You’ streams prompt dopamine release in the brain and keep users hooked.
The greater a person’s algorithmic knowledge, the more likely they are to act against manipulation. People also object to the algorithmic personalisation of political campaigning and news sources. Yet growing general awareness of social media deception does not translate into people recognising when they have been presented with one-sided information on controversial issues, leaving them susceptible to influence campaigns.
Devising effective media literacy and counter-disinformation strategies is no simple task. While the immediate focus is necessarily on preventing emotional manipulation by foreign-influenced narratives, there is a risk that engaging in continual debunking and attribution of cyber operations might eventually breed indifference and desensitisation to the threat – exactly what Russia wants.
Even without intentional manipulation, these social media platforms contribute to Russia’s end goal. The main utility of short-form video to Russian strategy is usually assumed to be the amplification of disinformation and the showcasing of sabotage’s effects. Less considered is how such video shrinks attention spans and increases distractibility by immersing viewers in a continuous stream of highly engaging twenty-second content. Distraction and indifference are precisely the conditions that Russian strategy seeks to engineer to weaken Western societies from within.
To inform public education campaigns, European governments should seriously invest in behavioural science and psychology and support research on the cognitive impact of social media, especially short-form video. Research in this area is still scarce. This knowledge will allow governments to pursue more informed conversations with social media platforms, devise specific education campaigns for schools, better understand the emotional triggers that make individuals vulnerable to manipulation and distraction, and devise appropriate countermeasures to cognitive tactics.
Russia is known to use insights from psychology and neuroscience in designing its influence campaigns. It is time Europe caught up.