Fighting fake war imagery

Online misinformation from military simulation games can be mitigated
This image was created with the assistance of DALL·E 2

The resurgence of violence in Gaza and Israel has led to the online diffusion of several video game clips that have been passed off as real war footage. One popular video on X incorrectly claimed to portray an Israeli helicopter being fired upon by a Hamas militant. Others purported to show rocket fire launched by Hamas or the downing of two Israeli helicopters.

The videos in question were derived from Arma 3, a 2013 military simulation (MilSim) released by Czech video game developer Bohemia Interactive. While doctored footage has also emerged from other commercially available games, such as Digital Combat Simulator, Arma 3 remains the most prominent example.

Arma and misinformation

The war in Gaza is not the first time Arma gameplay has been mistaken for or misused as actual combat footage. Among the earliest examples was a 2011 documentary featured on Britain’s ITV1, which mistook an Arma 2 clip for an Irish Republican Army propaganda video.

More often, though, footage has been shared widely during conflicts or flashpoints. The Russia-Ukraine War has been a focal point. Circulated clips have depicted—amongst other episodes—a drone strike against Russian vessels, volleys of Ukrainian anti-aircraft fire, an attack upon Russian tanks, and a confrontation between Russian forces and NATO helicopters. Occasionally, such footage has even crept into conventional news media.

Earlier examples abound. One popular video allegedly portrayed an aerial bombing in Sudan; other posts linked the same footage to fighting in Mali. Footage also went viral during the 2021 Israeli-Palestinian crisis, while a different clip, amplified by Indian news media, circulated in September 2021, purporting to show a Pakistani plane under fire in Afghanistan’s Panjshir Valley. The same visual had spread in 2020, albeit then linked to violence over the contested Nagorno-Karabakh enclave.

Arma footage also circulated following Iran’s January 2020 ballistic missile attack on al-Asad Airbase in Iraq. Another 2020 video, which amassed millions of Facebook views and later resurfaced in 2021, showed an alleged Taliban shootdown of an American aircraft. Moreover, whether it was the online depiction of a fictitious Turkish drone strike or the erroneous broadcasting of Arma footage by Russia’s state-operated Channel One, similar incidents occurred during the Syrian Civil War.

Sandbox and clickbait

Arma has fuelled a litany of fake news largely because the game is highly conducive to user-generated modification (‘modding’). This includes allowing players to download a wide array of content created by others via the gaming platform Steam. Arma’s ‘sandbox’ feature is one of its greatest strengths, yet it is also at the core of why the game has been amenable to misuse.

Bohemia Interactive, Arma’s developer, suggested in January 2023 that it is difficult to remove deceptive footage featuring user-generated content not only due to the sheer volume of material, but precisely because such content does not necessarily violate the Arma 3 end-user license agreement. Even though doctored footage can be promptly debunked, its removal is therefore not a simple task.

This capacity for misuse, and the challenges of moderating it, have been compounded by social media, which incentivise ‘clickbait’ material through viewer metrics. Arma 3 videos commonly originate from gamers aiming to expand their online following, before such clips are then misappropriated for political or trolling purposes.

Frequently, though, efforts by users to increase their follower base have themselves fuelled the production of misleading content. Many Arma clips include disclaimers about the footage’s MilSim origins, but these are often displayed inconspicuously and overshadowed by misleading, attention-grabbing titles or graphics.

Much clickbait material includes nothing more than a sensationally absurd title, while the videos themselves cannot by any measure be mistaken for reality (e.g., “3 MINUTES AGO! Russia’s Crimean Bridge collapsed along with a billion liters of oil”). Sometimes, though, this sort of material presents comparable problems to those videos uploaded with no such disclaimer, blurring the already porous boundary between clickbait and flagrant misinformation. 

In August 2022, for example, a video supposedly capturing a Chinese attack on Taiwan gained millions of views. Although the original post declared that the content was derived from Arma 3, the attention it received forced fact-checkers to emphasise that the content was merely gameplay. A similar incident occurred immediately before the invasion of Ukraine when a Facebook livestream showed Arma 3 gameplay superimposed with a deceptive ‘Breaking News’ banner.


Mitigation and responsibility

Mitigating the spread of doctored footage requires a multi-pronged effort by game developers, fact-checkers, online users and gamers, and social media companies.

Bohemia Interactive does not have an employee dedicated to countering the misuse of its game, but the developer has shown initiative in debunking misinformation. Against the backdrop of renewed violence in Gaza and Israel, the company reposted its guide—previously developed in response to the Russia-Ukraine War—on how to identify gameplay-based misinformation. This guide calls for the responsible use of the game, warns against devising clickbait titles, and highlights telltale signs of doctored footage, such as low-resolution visuals, shaky camera motions, unnatural smoke, fire, or explosions, and unrealistic military equipment. 

Bohemia Interactive has also regularly cooperated with journalists and fact-checkers. However, as the developers make headway on Arma 4, a game promising improved graphics and greater moddability, the problem will likely worsen.

At present, social media companies have struggled to promptly flag and remove content. While some clips have been taken down, valid questions remain about time lags in content moderation and how to address the cross-posting of videos between multiple platforms. For instance, one clip uploaded by a ‘verified’ account on X, which had been masquerading as Israeli intelligence, depicted the alleged interception of projectiles with Israel’s ‘Iron Beam’. Before the video was taken down, it had accumulated 5.8 million views. Another Arma 3 clip showing the supposed shootdown of Israeli helicopters has amassed 224,000 views on X, yet it has neither been removed nor had a user-generated Community Note attached to contextualise the content.

User-led moderation may offer a partial corrective, especially as X has increasingly sought to outsource this task to its user base. Regular players of MilSims and first-person shooters can serve a valuable role. Through their familiarity with the game, they can promptly debunk misinformation via comment sections or promote contextual Community Notes. As one article published by the Wilson Center recently noted: “It wasn’t just the game’s developers who stepped forward [to debunk doctored footage] but also a community of online users intimately familiar with the game.”

As someone who has amassed hundreds of gameplay hours on the Arma series over the past decade, I agree with its developers: “It’s disheartening for us to see the game we all love being used in this way.” The successful flagging and removal of misinformation places considerable responsibility on social media companies. Stemming misinformation cannot, nor should it, be entirely delegated to users. Gamers can, however, do much to strengthen the imperfect system of community-led moderation.