The mechanisms of cyber-enabled information campaigning
Western interest in cyber-enabled influence operations (CEIOs) initially surged in the aftermath of Russian meddling in the 2016 US presidential election and has only gained momentum with the upcoming 2024 election megacycle. Revelations about the Russian Foreign Ministry’s plans to weaken Western adversaries, which emphasise ‘offensive information campaigns’, highlight the importance of looking beyond specific operations to cyber-enabled influence campaigns (CEICs) as the means of achieving strategic objectives. Campaigns are a series of linked, sustained operations tied to a broader objective.
In a recent article in Intelligence and National Security, we set out an analytical framework for understanding these cyber activities. An improved understanding of CEIOs and CEICs can help guide better policy and strategy design to counter them.
Exploiting division
The first step in countering ‘fake news’ or disinformation is understanding the mechanisms by which this information propagates. Much of the focus has been on remedies such as fact-checking and distinguishing ‘truth’ from ‘lies’. These efforts neglect the fact that CEIOs may use information that is reasonably accurate but highly salient to opposing groups. This kind of divisive information requires a different counterstrategy.
From a national security perspective, the focus must be on CEICs (linked, sustained operations) that seek strategic-level impact, for example by undermining trust in the democratic institutions of a target country. Here, the veracity of any single information claim is much less important than the way a series of claims is circulated and weaponised in specific communities over an extended period.
The ubiquitous and fluid nature of cyberspace creates an environment ripe for technical and informational exploitation. However, the structure of the cyber strategic environment, and the strategic choices states are making within it, open the door to a counterstrategy aimed both at weakening single operations and at limiting, frustrating, and disrupting adversaries’ capacity to cumulatively link information operations into campaigns with national security impact.
Russia’s leaked foreign policy concept lays out a campaigning orientation focused not on any single group or event but on sowing divisiveness. According to the Washington Post’s analysis of these classified documents, Russia proposed to “continue to facilitate the coming to power of isolationist right-wing forces in America” and to “enable the destabilization of Latin American countries and the rise to power of extremist forces on the far left and far right there”. The goal here is not to achieve direct ideological alignment with Russia, an element core to Soviet-era influence campaigns, but to mount a concerted and sustained campaign to inflame the extremes at both ends of the political spectrum.
This seems consistent with other published Russian documents calling for targeting a “coalition of unfriendly countries” by “finding the vulnerable points of their external and internal policies with the aim of developing practical steps to weaken Russia’s opponents.”
Russia’s greatest concern is the cohesion of effort across Western democracies. In military terms, a cohesive West could mean heavy Russian losses in Ukraine and expansion of the NATO alliance. In terms of economic coercion, a strong West might be able to sustain sanctions and weather their negative spillover effects, tipping the cost-benefit balance against the Russian economy over time.
With Moscow’s limited and diminishing capability to use direct military or economic levers for strategic gain, cyberspace has become an increasingly vital environment for competition and contestation. If Russia is to level the playing field, the country will need to exploit any fissures that exist in Western cohesion.
Identify, imitate, amplify
We argue that cyber-enabled information campaigns follow a core pattern comprising three steps, which we refer to as the Identification-Imitation-Amplification (IIA) loop. In the first step, a foreign adversary identifies divisive issues and communities that can be exploited and polarised. Second, facilitated by the anonymity achievable in cyberspace, the CEIC initiator hides their identity, poses as a member of the target community, and shares divisive messaging. In the final step, these messages are amplified within and across social media platforms.
Cyber-enabled divisive campaigns are most powerful in Western societies when they capitalise on core democratic ideals of freedom of speech and trust. These campaigns can exploit the differing opinions inherent in an open society and use them to undermine overall trust in the institutions and mechanisms built to allow those differences to coexist.
The House divided
We can use a recent case study from US politics to illustrate the power of CEICs: congressional support for supplying arms to Ukraine. In February 2022, when the Russian invasion commenced, Congress voted almost unanimously to provide aid. Two years later, in April 2024, most House Republicans voted against continued funding.
Two things are instructive about this shift. The first is that the Republican chairmen of the US House Intelligence Committee and the House Armed Services Committee explicitly accused fellow Republicans of spreading Russian propaganda on the floor of Congress.
Even more significant is that opposition to Ukraine funding is framed not in simple fiscal terms but in terms of trust: that the Biden administration is selling out American security. The divisiveness rests on questioning the loyalty of the president and of the “RINOs” (Republicans in name only) accused of selling out the country, rather than on any specific policy disagreement. Funding Ukraine is now a question of internal institutional trust rather than of military spending or external alliance-building.
However, this may not be as clear-cut a case of disinformation as the Republican chairmen implied. Our IIA framework suggests that the key to a successful divisive campaign is to identify information that is already salient to the target audience, imitate voices within that community to ground the salience, and then amplify the message. CEICs circulate information already present within a specific community rather than relying on planting and spreading new information. This is what makes them so effective as a source of divisiveness. For example, a recent article in Foreign Affairs shows that Russian disinformation actually “echoes talking points by the American far right, and not the other way around.”
Protecting trust
We are entering a strategic competition in which the battle is between divisiveness and democratic cohesion of effort, both internally at the national level and internationally among allies and partners.
Scholars and policymakers need to better understand the imitation and amplification mechanisms used to sow division, beginning with the role of information technology platforms. Through well-orchestrated campaigns on these platforms, adversaries can create or reinforce beliefs, which can then be leveraged to further widen schisms in Western society.
The recent US action against TikTok is an example of what happens when countries play catch-up in countering the potential of divisive cyber campaigns. This legislation is an important attempt to reduce opportunities for CEICs at the platform level. However, such action will fall short if better counter-campaigning against the IIA loop of CEICs does not emerge. Private sector companies will need to reexamine their roles and responsibilities in democratic societies. Simultaneously, governments will need to facilitate more direct engagement with citizens to renew and rebuild institutional trust.
The power to influence lies ultimately not in spreading (dis)information per se but in cueing specific individuals to receive the information one wants to spread. Acknowledging and correctly understanding what we are up against is a necessary first step towards any effective counterstrategy for maintaining democratic resilience in a dynamic, cyber-enabled information space.