Can a video game teach you to resist disinformation?
The U.S. government certainly thinks so: In May, the State Department’s Global Engagement Center (GEC), the government agency tasked with countering foreign disinformation, released a request for proposal offering $1 million for “an evergreen game in a sandbox platform, with an existing fan base, in which participants play a game that builds cognitive resilience to authoritarianism and promotes democratic norms and values.” The call for a sandbox platform refers to open, multiplayer game spaces such as Minecraft, Roblox, or Fortnite, which allow players to build forts, explore virtual worlds, experience short stories, and share experiences. This request is asking for proposals to use creative mode in Fortnite (or a similar platform) to design a custom game experience—only instead of being fun, it is meant to train people to resist Russian disinformation.
It’s an intriguing way to combat an existential challenge for democracy. Can play undermine lies more effectively than speech does? There is a lot about this idea that is compelling, but there are just as many reasons to be skeptical.
The GEC’s idea certainly has some validity. It wants to leverage the emerging field of prebunking—the art of making people aware of disinformation before they encounter it—to help build media literacy skills and contribute to online safety. This is a process that researchers call “inoculation,” which treats disinformation like a virus: You need to train your psychological immune system, so to speak, to learn how to identify and reject bad information. Researchers have suggested different methods for this, ranging from a very literal application of the metaphor—exposing people to “weakened” forms of common disinformation—to complex media literacy training intended to prepare people to identify disinformation on their own.
Using games as part of the battle over information isn’t new. The United Nations Office of Counter-Terrorism has an entire project devoted to understanding the role of video games in what the U.N. calls “countering violent extremism.” Late last year, the Swedish Psychological Defense Agency—which, like the GEC, is empowered to combat foreign disinformation—sponsored research into foreign political interference that uses video games. And the European Journalism Observatory has highlighted video games, specifically, as a vector for disinformation during Russia’s ongoing invasion of Ukraine.
So, the GEC is addressing a serious problem with global implications. And the sandbox anti-disinformation proposal is not the only video game program that the agency is funding. As Aftermath reports, it is also offering $250,000 for a program at the U.S. Embassy in Ukraine that will use the process of building an esports team and hosting an esports tournament to provide “counter disinformation/conflict resolution training to confront foreign propaganda and disinformation in competitive online gaming spaces.” While these sums may seem high, a typical “indie” game (one that is not developed by a major studio) can cost a million dollars or more, and so-called AAA games (such as Grand Theft Auto, Fallout, or Call of Duty) can cost hundreds of millions of dollars to develop.
One challenge that inoculation programs face is establishing success conditions. After all, how do you know when someone is successfully protected against disinformation? There is no good answer for this yet—we can design experiments and surveys to measure how messages are being accepted or rejected by a population, but—like other preventative measures—success is negative. You know the program worked if you don’t see people repeating disinformation, rather than knowing it worked because some tangible finish line has been crossed. It is a problem requiring constant vigilance. In that sense, the GEC’s call for an evergreen (permanent) game to counter disinformation is aligned with broad aspects of disinformation research.
But is a game the best way to do this? For decades, the field of game studies has adopted an argument put forth by Dutch cultural historian Johan Huizinga in the 1930s: Games and play are essential to civilization, because they (however unintentionally) teach children how to socialize and move within rules-based systems in a mirror of society.
Building on those ideas, media theorist Ian Bogost coined the term “procedural rhetoric” in the 2000s to argue that video games instruct players to view the world through a certain set of rules and to discard others—even when trying to “break” a game system, he argued, players are still learning how rules and games work. If one accepts this line of argument, then it would naturally follow that an effort to design a game to inoculate against disinformation has the potential to be highly effective.
There are some problems with this approach. The research into so-called serious games—games intended to do something other than entertain—suggests that they are most effective when they are also fun to play. This is a bit of a contradiction, since a serious game is not made with entertainment as its primary purpose, and that is reflected in the GEC’s call: There is no mention of the evergreen game being fun for its players. The agency, understandably, is focused on the outcomes of the game, not the game itself. But making serious games fun is a hard challenge that researchers are still working on, and without it, the effectiveness of any serious game will be limited.
The fun challenge has plagued efforts to use video games to achieve goals in foreign policy, statecraft, and human rights since the start of the 21st century. Games such as the International Committee of the Red Cross’s LifeRun (2020) or 11 Bit Studios’ This War of Mine (2014) try to cultivate in players a concern for civilians in warfare. The Lebanese militant group Hezbollah released Special Force (2003) so players can battle against Israeli soldiers in South Lebanon, and Fursan Al-Aqsa (2022) places players in the shoes of a Palestinian student who seeks revenge on the Israeli soldiers who tortured him in prison. Fursan is available on Steam, an online video game marketplace used by players around the world that (relevant for the GEC grants) also restricts sales in Russia and Belarus due to sanctions stemming from Russia’s 2022 invasion of Ukraine.
Militaries have used games for propaganda, too, from America’s Army (2002) to China’s Glorious Mission (2011). Some of these games went nowhere. (Hezbollah, for instance, did not make a fun game.) But others, such as America’s Army, endured for decades because they were fun—and that game became fun by abandoning some of its more serious pretensions as new editions were published.
While it is clear that the GEC is drawing on a large number of precedents, ideas, and projects, is there evidence that any of it works? After studying the Red Cross’s LifeRun game, which seems to be a close analog to the GEC’s call for proposals, scholar Jolene Fisher concluded that there are structural limits to what these games can be expected to do, given their small distribution and limited scale. In a recent report, the Carnegie Endowment for International Peace observed that initiatives to support local journalism and media literacy education were far more effective at undermining disinformation than statecraft or counter-messaging, but the former are also much more difficult to fund, implement, and scale.
Bogost, the media scholar, reflected in 2018 on his experience trying to make “persuasive games” and concluded the concept was more promise than delivery. “It was emotion and novelty that drove much of the interest in this work,” he wrote, not concrete or supportable projects. It could be that games are just an accessible channel to do this work compared to more effective methods.
There are broader issues with the GEC’s plans, too. I wrote my Ph.D. dissertation on the U.S. Army Esports (USAE) team, an effort launched in 2018 to use esports to bolster recruitment after years of flagging numbers. The U.S. Defense Department certainly seems convinced that the team has been effective in growing its recruiting pipeline and boosting morale, however controversial it may be. But it does not release data to support its claims of effectiveness, and in 2023, the Army announced a major overhaul of the recruiting process due to multiple consecutive years of missing enlistment goals. If the USAE is effective at growing recruitment, that growth is hard to see. (The service claims that it is on track to meet a much lower recruiting goal in 2024.)
I wasn’t alone in observing the limited effect that games have on influencing thinking. A couple of years ago, games scholar Philip Hammond observed that decades of U.S. military influence on video games has coincided with declining recruiting and less public trust. If games can persuade people, it’s hard to see how.
This does not mean that such programs are a failure, nor does it mean that the GEC’s program is futile. Rather, it indicates that, as Bogost cautioned, we should be clear about the gap between promise and delivery, and mindful of where that gap emerges.
The GEC’s success in persuading social media companies to moderate away Islamist extremist content on their platforms (the most effective way to counter disinformation, according to researchers) suggests that it sometimes can do this work effectively. After all, while the growing presence of extremists in video games is a real concern, it is the community and discourse around games where that extremism tends to emerge, not within the storylines and play of the games themselves.
Games scholar Sky LaRell Anderson calls these conversations “extraludic narratives,” and in studying them found that they form an important basis for building communities around sharing gameplay experiences. Such a dynamic leaves open the potential for the GEC’s sponsored esports team in Ukraine to influence some of those narratives about Russia, or even to cultivate a community of resistance against Russian narratives in Ukraine’s esports spaces. But researchers find this dynamic hard for outsiders to understand in real time, much less to intentionally shape beforehand. Governments just aren’t cool, and the USAE’s own engagement scandals point to the many scenarios where government sponsorship might be a poison pill.
The GEC has experienced this with its other efforts to counter disinformation. Its successful campaign to contain Islamist disinformation online, when applied to countering Russian disinformation, resulted in the center being subjected to unfair, partisan attacks by far-right politicians in the United States. Republicans in the House of Representatives tried last year to block the center’s budgetary reauthorization, falsely claiming that it targeted conservatives for censorship. Embattled Rep. Darrell Issa disputed the need for a counter-disinformation agency and claimed that the GEC had no successes to justify its budget despite the agency’s successful work countering disinformation.
The dishonest nature of these attacks points to a difficult political environment emerging for the agency. The GEC is a meaningful organization that treats the threat of disinformation with the appropriate seriousness. But it could be the case that sponsoring games and gaming events is all that the agency has left if platform governance has become closed off by toxic right-wing politics.
But if politics prevent the agency from responding effectively to disinformation in the venues where it can be the most effective, it is hard to blame it for trying something else. Still, we should be cautious and keep our expectations in check: As unfair as the right-wing attacks on the agency are, and as hard as it works to address disinformation globally, those same attacks will also be carried over to the teams and games the agency sponsors.
Even in an ideal environment, there would be modest expectations for such a small program, but those may be impossible to meet. Disinformation is ultimately a political challenge, not a technical one, and the politics of disinformation in the United States have already tied the GEC’s hands. It’s just not clear how this political problem can be solved with a video game.
The post There’s No Dodge Button for Disinformation appeared first on Foreign Policy.