A shortened version of this material was published on the DOU platform
“Russia understands the importance of preserving sports traditions, but at the same time is open to everything best, new, and advanced. Open to what is popular among young people, including in sports”. – President of the Russian Federation Vladimir Putin on esports at the “Russia – A Sports Power” forum (October 2023).
Over the two years of Russia’s full-scale invasion of Ukraine, the topic of propaganda in video games has been raised several times. It is usually discussed in two contexts. The first, more common, context is the use of video games as a platform for propaganda. A key material frequently cited by Ukrainian media is a New York Times article about the use of some of the world’s most popular games – Roblox and Minecraft – to recreate real events, such as the Battle of Soledar or the celebration of Russia Day.
Roblox is an online game platform that hosts numerous diverse games developed and published by users as virtual “worlds”. Players can create their own games using the built-in Roblox Studio tools and share them on the Roblox platform for other users. For instance, one of the most popular Roblox-based games, Brookhaven RP, consistently has several hundred thousand active players, with a peak of over one million active players in August 2023.
Minecraft is a video game that combines elements of survival, building, and adventure. Players in Minecraft are placed in an open world where they can gather resources, construct buildings, develop their character, and interact with the game environment and other players.
Attempts to understand the genesis of this phenomenon were made on the Ukrainian gaming industry resource Gamedev DOU (Parts 1 and 2). Game designer Andrii Petkevych addressed the issue more comprehensively in a lecture at the Games Gathering: Gamedev Under Bombs conference in the spring of 2022 (lecture theses are available in text format at the link).
According to Andrii Petkevych, Russian propaganda through video games received a strong impetus in 2003, following the U.S. invasion of Iraq and the deterioration of relations with Russia. He links the intensification of this propaganda to increased profitability in projects targeting the domestic market and a demand for stories of heroism (such as “heroic plots” from World War II or the Soviet invasion of Afghanistan, like 9th Company or the fabricated story of the 28 Panfilov Guardsmen), providing corresponding examples of games.
The second context concerns esports. Anton Khimyak of the Hybrid Warfare Analytical Group highlights Russia’s exploitation of esports in his review. Esports figures who had only niche popularity before the full-scale invasion were invited by traditional media for interviews to discuss Russia’s instrumentalization of esports. This was relevant to Russia’s hosting of the “Games of the Future” in February 2024 – an esports tournament opened by Russian President Vladimir Putin, who declared it a gift from Russia to the “global sports family”.
This material will address another, but no less important, aspect of the gaming environment that has not yet been explored in the Ukrainian segment. It concerns the infrastructure for game distribution and gamer interaction, specifically the most popular platform for this – Steam, a product of the American company Valve. Although Steam primarily serves as a marketplace where users can purchase games for personal use, it also possesses all the features of a social network. It includes personal pages, friends lists, private messaging, groups, the ability to publish user-generated content (UGC – content created by users such as text posts, images, game modifications, etc.), and more.
What is Steam?
Steam was launched in 2003 as a tool for downloading updates for games developed by Valve. Later, it evolved into a platform for the digital distribution of games from various developers, and by 2013 it controlled 75% of the online sales market. The platform is so large that in 2021, Valve was sued in a U.S. court for allegedly establishing a monopoly on the gaming market and hindering competition.
According to various estimates, Steam has 100–130 million active users per month. Moreover, according to data from the research company GameDiscoverCo, as of April 2023, Russian gamers made up nearly 10% of Steam’s audience. This is the third-highest figure after the U.S. (13.7%) and China (12.3%). This high share persists despite certain restrictions Steam operates under in Russia—while the platform is accessible, some developers’ games cannot be purchased.
However, that is only formally the case. The Russian segment of the internet is filled with guides on how to bypass such blocks (examples can be found on the website of the Russian bank Tinkoff, the popular forum Pikabu, or the online magazine Lifehacker). Commenters on such materials often state that they have no issues purchasing games using payment cards from Russian banks.
Comments on the guide “How to buy games on Steam in Russia quickly and easily”
So, in fact, Russian Steam users do not experience any difficulties using the platform, meaning that these 10% of the audience are still present. Such a high number of Russian users suggests the hypothesis that Steam may be used as a platform for Russian propaganda—primarily through UGC (user-generated content) such as nicknames, images, and posts. To confirm or refute this hypothesis, we will examine Steam’s content moderation rules and their effectiveness, as well as look for cases of Russian propaganda on the platform.
Within Steam’s ecosystem, there is a section where UGC content is widespread—Steam Community. This is a separate “world” created in 2007 to allow users to share content and interact in various ways around specific games. Every video game represented on the Steam marketplace has its own “hub,” through which developers can quickly receive feedback from their audience about their games, and gamers can exchange various types of game-related content, such as screenshots, videos, and forum-like discussions.
Another part of Steam Community relevant to this discussion is personal pages and groups. These do not require much explanation as they are similar to those found on other social networks.
Content Moderation on Steam: Regulatory Framework
When registering on Steam, a user agrees to three things:
- That they are at least 13 years old.
- The Valve Privacy Policy, a standard document outlining what data Steam collects and stores.
- Steam Subscriber Agreement—a document covering everything that happens on the platform, including community rules under the name Steam Online Conduct. These rules/prohibitions are quite general for a document regulating the behavior of users on a platform with over 100 million unique visitors per month:
- Engaging in illegal activities. For example, exploiting minors, theft, inciting real-world violence, fraud, or creating a fake identity to deceive other users.
- Uploading or publishing prohibited or unacceptable content. For example, realistic or graphic scenes of violence, any scenes of child exploitation, or sexually explicit content (though posting such content in a game hub is not prohibited).
- Violating the personal rights of others. For example, doxxing (publishing personal data), fraud, theft, impersonation, harassment, gaining access to other users’ Steam accounts, or publishing defamatory statements meant to disgrace others.
- Harassing other users or Steam employees. This includes trolling, provocation, threats, spam, intimidation, cursing, and insults.
- Disrupting Steam, harming the platform, or manipulating it. For example, attacking a Steam server, manipulating Steam reviews or other content, or automatically creating Steam accounts.
- Disrupting or harming other users’ computers. For example, uploading or linking to malware (viruses, Trojans, etc.) or attacking another user’s computer.
- Cheating. For example, using cheats, smurfing, or artificially boosting your rank in the matchmaking system.
- Violating the intellectual property rights of others. For example, uploading materials you do not own the rights to, using materials protected by someone else’s trademark or copyright, or violating patents, trade secrets, or other property rights.
- Illegally collecting or distributing information. For example, collecting data about other users (including email addresses) or distributing surveys, multi-level marketing materials, chain letters, or advertising emails.
- Engaging in commercial activity. For example, posting advertisements, conducting contests, gambling, buying or selling Steam accounts, selling content, gift cards, or other items, and begging.
A separate clarification contains a section on UGC, and it is even more concise:
- Do not upload unacceptable content, such as: realistic or graphic scenes of violence; any scenes of child exploitation; sexually explicit content. Publishing such content in the game hub is not prohibited (this refers to sexually explicit content as part of a certain game – author’s note).
- Do not bypass the rules regarding adult content on Steam.
- Do not upload content to which you do not have legal rights.
- Do not upload images of other people without their consent.
- Do not use content to bully, insult, or humiliate other users.
- Do not post advertising content.
- Post content only in appropriate places. For example, do not post Dota illustrations in the Counter-Strike game hub.
Overall, Steam’s code of conduct does not contain detailed rules related to hate speech or the use of political content on the platform. Perhaps the developers naively believed that disinformation would not affect them. Instead, the rules regarding “harassment of other users” seem abstract, leaving it unclear whether Steam moderation considers situations where a user’s nickname and profile picture have a chauvinistic or racist nature to be harassment. However, the same clarification notes that the rules are not “comprehensive”; rather, they “provide an understanding of what type of content is unacceptable.” This leaves even more room for differing interpretations by moderators.
There are two types of moderators on Steam: paid moderators hired as contractors by Valve, and community moderators, who are engaged by developers of specific games to moderate their hubs (discussed below). Although little information is available about Steam moderation and its mechanisms, it is known that paid moderators work exclusively with content reported by users. The exact number of moderators employed by Valve is unknown, but it is easy to imagine how long their complaint queue is. After all, content with hate speech based on race, nationality, or gender can easily be found even in English, Valve’s native language, not to mention the dozens of other languages used on the platform, such as Russian, Turkish, Polish, and German.
As for the second type of moderators, the so-called “community moderators”, Steam is interesting because environments dedicated to specific games (Steam Community Hubs, mentioned above) can be partially moderated by the developers themselves, bypassing platform moderation, which can intervene only after user complaints. This applies to images, game modifications, videos, game reviews, guides, and discussions.
Moreover, the rules for such moderation indicate that developers can effectively “disable” Steam moderators from participating in discussions in their hub. How the powers of different moderators intersect, who holds higher authority, and under what conditions Steam moderators can intervene in a hub if its content does not comply with platform rules remain unclear. We know that, “by default”, it is Steam moderators who review content complaints, but we do not know what situations exist beyond the “default” format.
Attention to Content Moderation in Gaming Environments
The opacity of moderation on Steam, and gaming communities more broadly, have drawn the interest of U.S. government agencies. Below are just a few examples of such interest:
- In January 2024, the U.S. Government Accountability Office published a report with recommendations for the FBI and the Department of Homeland Security (DHS) to develop strategies to counter extremism in gaming environments.
- In September 2022, DHS allocated approximately $700,000 in grants to research extremism in gaming environments.
- In December 2022, Senator Margaret Wood Hassan submitted an inquiry to Valve CEO Gabe Newell regarding far-right user-generated content (UGC) on Steam. The senator argued that it was easy to find personal pages and communities on the platform that used Nazi slogans or symbols, or names of figures from the Third Reich, in their names or images. The inquiry also contained four questions for Newell regarding content moderation on Steam, with the most telling question being: “Does the content listed in Appendix 1 (Nazi UGC content) violate the Steam Subscriber Agreement, Steam Online Conduct Rules, or any other company policies regarding user conduct?” In other words, it was unclear whether the company even considered the use of SS symbols or slogans like Blut und Ehre (“Blood and Honor” – the Hitler Youth slogan) to be a violation of its platform rules. Newell did not respond to this inquiry, and the content mentioned in the letter can still be found on the platform without much effort.
Problems with content moderation within games or gaming environments (particularly on platforms like Discord, the Twitch live streaming platform, Steam, etc.) can have direct real-world consequences in the form of societal radicalization, recruitment to extremist organizations, or even the organization of mass shootings.
For instance, in 2022, a mass shooting took place in Buffalo, New York, where 10 people were killed and three others were injured. NBC News journalist Ben Goggin cited excerpts from the accused shooter’s conversations. The accused, 18-year-old Payton Gendron, indicated that another gaming environment – Roblox – partially influenced his interests and political views. Gendron held far-right beliefs and subscribed to the “Great Replacement” conspiracy theory, which claims that a “genocide” of white people is taking place in modern-day America:
“…playing Apocalypse Rising on Roblox piqued my interest in survival and weapons, which led me to hunting and shooting. The result – tinnitus and a fascination with firearms.”
“I probably wouldn’t be as nationalist as I am if I hadn’t played Blood and Iron on Roblox.”
It is important to note that this does not indicate a direct cause-and-effect relationship between gaming and real-world violence. Moreover, such cases should not be grounds for strict regulation of games or for blaming developers for criminals’ actions. Games can be one of many factors in an extremely complex chain of individual radicalization, but the issue here is one of content interpretation rather than game content itself. Returning to gaming environments as a whole, the U.S. National Strategy for Countering Domestic Terrorism recognizes that such platforms can be recruitment and mobilization tools for terrorist activities, similar to messaging apps and social media.
Outside the U.S., the topic of gaming environments also draws attention from European and international actors. For instance, in 2021, the European Commission published two reports on the relationship between gaming environments and radicalization. Both reports focused on mapping extremist infrastructure in gaming environments, including Steam. One report provided an overview and noted that despite efforts to update moderation protocols in 2018, problematic content remained easily accessible on the platform.
The other report aimed to develop recommendations for counter-extremism and counter-terrorism bodies, emphasizing the need for further research into the topic (thus justifying general recommendations without concrete steps). Another example is a report from the UN Counter-Terrorism Office, which also addressed the link between radicalization and gaming environments.
One of the main conclusions of the UN report was that counter-extremism bodies should collaborate with platforms like Steam and offer moderators training on recognizing extremist content. However, the report also emphasized the importance of maintaining balance and avoiding overly strict regulations. Efforts to change content moderation should “incorporate input from gaming communities” rather than being one-sided.
Public attention, media scrutiny, and pressure from journalists, civil society, and researchers regarding gaming environments are yielding results. For example, Roblox is the only popular gaming environment (with over 200 million monthly active users) that has comprehensive community rules similar to Facebook’s.
The creation of these rules was a response to close attention to the game and high-profile cases associated with it: from ISIS propaganda to recreations of Nazi concentration camps from World War II. The current version of Roblox’s community rules prohibits content that promotes extremist/terrorist organizations or individuals associated with them, as well as hate speech based on various traits (such as race, nationality, religion, etc.). The rules also strictly limit political content – the platform bans any campaigning for political parties or public office candidates, desecration of political symbols (such as virtual “burning” of national flags), and content related to “real-world national borders”.
A case relevant to Ukraine involved the removal of two games related to the Russia-Ukraine war from the Roblox platform: War on Larkiv: Ukraine and Battle for Ukraine. The first game took place in the fictional city of Larkiv, which resembled the real city of Kharkiv. The second game recreated battles from Russia’s full-scale invasion of Ukraine, with the Russian Armed Forces and “DNR armed forces” among the factions users could play as. The platform removed these games after BBC journalists contacted the developers.
Russian Propaganda on Steam
It is important to note that the cases listed below are unlikely to be the result of “bot farms” or any other organized attempts at information influence. Most likely, this is organic behavior by regular Russian users operating under conditions of anonymity and a lack of adequate platform moderation. Given the ease with which the pages and communities listed below can be found, we assume that there is no moderation of Russian-language or other foreign-language content that promotes Russia on the platform. Below are examples of Steam user profiles categorized by their nicknames:
Names of Individuals Involved in Russia’s Armed Aggression Against Ukraine
Encroachment on the territorial integrity of Ukraine
Mention of groups fighting on the side of the Russian Federation
Slogans or symbols of the invasion of Ukraine
Xenophobia, hate speech
Communities
It is also easy to find publicly accessible groups on Steam where users post content dedicated to glorifying Russia. The number of such communities is unknown, so we will examine their content using a few examples.
Community “Za Россию!”
The main page of the group greets users with a message stating that it was created “to support and unite the Russian population.” The group features a comment thread where participants can interact with one another. The thread is filled with pro-Russian slogans and symbols related to Russia’s invasion of Ukraine:
In the “useful links” section, the community administrators added templates of the above symbols and a link to the community server in Discord called “Я Русский!”:
“RUSSIAN FEDERATION” community
The group’s description includes a link to a Discord server, the text of the Russian national anthem, and links to other communities that are allegedly “friendly” to this one (all of which are similar in content and name: Forever ZoV, ZOV, PMC WAGNER GROUP). The group’s comments contain content similar to the previous case:
In the “discussion” section there are two posts: one glorifies Russia, and the other contains veiled hate speech directed at Ukraine:
“Forever ZoV” community
The description of the community includes a link to a Discord server with imperial Russian symbols, and a separate note “caringly” states that the community administration “does not support xenophobia and discrimination”:
It is likely that the note on xenophobia does not always work, as in the community’s discussion section, there is a hostile complaint from a user about their profile being raided (a slang term for mass posting of identical messages on a user’s page) by representatives of the Ukrainian segment of Steam:
The comments in the community have the same character as in the previous two examples: the mass publication of symbols of Russia’s invasion of Ukraine:
Other
Below are other examples related to UGC content that do not fall under the above categories.
Creation of Xenophobic Images
While researching personal pages, we encountered a user profile that actively creates caricature-like images about the Russian-Ukrainian war. As of the time of writing, the author has over 240 images. Here are just a few examples that either have a xenophobic nature or violate Ukraine’s territorial integrity:
A link to so-called “DPR” assistance
A user with the nickname “Вперёд Россия Z,” who actively supports the Russian Federation in its aggression, includes a link to a “humanitarian fund” located in the United States in their profile description. This fund is said to provide “aid to the people of Donbas.” The website’s color scheme reflects the colors of the “state flag of the DPR,” and the “Links” section on the site includes websites such as “Православие.ру” (“Orthodoxy.ru”), “Русская Вера. Христианский Ренессанс” (“Russian Faith. Christian Renaissance”).
Using game modifications to reproduce real events
The user “winston,” whose profile picture shows an actual Russian soldier, actively uses the game Arma 3 to recreate real events of the Russian-Ukrainian war. Arma 3 is a shooter that simulates large-scale military operations in fictional locations and allows user-generated modifications, including new locations, characters, and weapons. The user’s screenshot collection includes numerous abstract “sketches” related to Russia’s invasion of Ukraine, some of which depict real locations. For example, one screenshot shows “game versions” of Russian soldiers at a location resembling Chornobyl.
The use of photos of the Russian military in the temporarily occupied territories of Ukraine
Some Steam users upload real photos to their pages. As on traditional social networks, this feature exists to “showcase” one’s personality, so that other users can see one’s interests. In general, Steam profiles offer wide customization options: users have a whole arsenal for creating a unique page. The user “x64 spinbot” decided to “decorate” their profile with photos of Wagner PMC soldiers in the temporarily occupied territories of Ukraine.
Recommendations
The moderation issues on Steam have repeatedly been a topic of discussion among experts and governmental institutions. Unfortunately, Valve does not have a reputation as a company that listens to feedback and collaborates with various stakeholders.
For example, Gabe Newell or other Valve representatives have yet to respond to the aforementioned request from U.S. Senator Margaret Wood Hassan regarding content moderation. In a press release about a €1.6 million fine for Valve for violating EU antitrust laws, the European Commission noted that the company chose “not to cooperate,” whereas fines for other gaming companies mentioned in the text (Bandai Namco, Capcom, Focus Home, Koch Media, and ZeniMax) were reduced due to their cooperation with the European Commission.
Gaming journalists highlight the monopoly issues Steam faces in the market and the internal difficulties within Valve’s corporate structure, with the company doing little to correct them.
However, some examples of engagement and adequate moderation do exist, even though they are already 4-5 years old: in 2018, the company removed 179 games with explicit sexual content from Steam and stated it would remove games that Valve deemed “illegal or blatant trolling”. In 2019, at the request of the German media regulator, the company removed 30 pages with Nazi content from the platform.
Additionally, the European Commission may examine Steam under the Digital Services Act, which requires, at a minimum, transparency reports, user reporting mechanisms, and cooperation with public authorities. Furthermore, if the monthly number of Steam users in the European Union reaches 45 million (currently around 35-40 million), the same act would classify Steam as a Very Large Online Platform (VLOP), imposing additional compliance obligations on Valve under European legislation.
Ukraine, in turn, could follow Germany’s example of removing Nazi content on the platform and influence Valve in a similar way through relevant authorities. Especially since governmental bodies have shown interest in gaming topics: in March 2024, the Ministry of Culture and Information Policy announced it would reach out to game developers regarding Ukrainian localization; the Ministry of Digital Transformation has experience successfully collaborating with Valve in 2022 regarding the restoration of payments to Ukrainian developers, which were suspended at the beginning of Russia’s full-scale invasion.
Attention can also be drawn to the issue through specialized media such as Kotaku, Polygon, Eurogamer, and many others. Moreover, there is already an example of cooperation with Ukrainians on Steam: in 2022 and 2023, a Ukrainian Games Festival was held on the marketplace in honor of Ukraine’s Independence Day, created in collaboration with the Ukrainian Games platform.
Returning to Steam itself, the situation may change if Valve implements certain reforms on the platform:
- Update Steam Online Conduct. The platform’s community guidelines are general and open to interpretation by moderators. Valve should pay special attention to hate speech and references to real people or organizations involved in extremist or terrorist activities. A good example of such rules in gaming environments is Roblox’s community guidelines.
- Automate part of the moderation process. Currently, Steam lacks automated sanctions for pages using hate speech in their usernames or profile pictures. Valve could develop a “blacklist” of symbols, words, and images that the platform would automatically block.
- Review the community moderator policy. The idea of implementing autonomous moderators for hubs and groups is a good initiative, as moderators with game context knowledge can perform their duties more accurately. However, these moderators currently have virtually unlimited power, especially regarding “forum” sections with discussions and debates, where developers may entirely refuse to implement “higher” Steam moderation. The first step to revising the policy would be to remove this provision.
- Engage with civil society, experts, and government bodies. Valve often finds itself in a position of ignoring issues present on Steam. To improve the platform, the company could participate in events and research specifically related to moderation on Steam (e.g., the Extremism and Gaming Research Network). Furthermore, a body similar to Meta’s Oversight Board could be created for Steam, composed of researchers of radicalization through games, gaming journalists, and others.
- Increase transparency. Many aspects of Steam’s operations are currently unknown. This uncertainty could be dispelled by publishing transparency reports that address issues like extremism, bullying, hate speech, etc., and the company’s efforts to make the platform safer. Examples of similar reports in the gaming industry are from Xbox, Roblox, and game developer Electronic Arts.
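The automated “blacklist” recommendation above can be sketched in a few lines of code. The sketch below is a minimal illustration under stated assumptions, not a description of any real Steam mechanism: the `BLACKLIST` entries, the homoglyph table, and the function names are all hypothetical, and a production filter would need a far larger curated list, handling of Cyrillic look-alike characters, and image-based detection for profile pictures.

```python
import unicodedata

# Hypothetical blacklist for illustration; a real deployment would maintain
# a large, curated list of banned slogans, symbols, and slurs.
BLACKLIST = {"blut und ehre", "wagner"}

# Undo common look-alike substitutions used to evade simple filters.
HOMOGLYPHS = str.maketrans({"0": "o", "1": "i", "3": "e", "@": "a", "$": "s"})

def normalize(name: str) -> str:
    """Lowercase, strip diacritics, and reverse simple character swaps."""
    decomposed = unicodedata.normalize("NFKD", name)
    stripped = "".join(c for c in decomposed if not unicodedata.combining(c))
    return stripped.lower().translate(HOMOGLYPHS)

def violates_blacklist(name: str) -> bool:
    """Return True if any banned term appears in the normalized name."""
    normalized = normalize(name)
    return any(term in normalized for term in BLACKLIST)
```

Normalization before matching is the key design choice: trivial substitutions such as “W@gner” or “Blut und 3hre” would otherwise slip past an exact-match list.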
Such changes could happen if governments of countries with large Steam audiences pressure Valve (primarily the U.S. and, through the European Commission, EU countries). Meanwhile, the gaming community has only one lever: mass complaints about propagandistic content. While Steam moderation does not always respond quickly to such complaints, there have been some successes. For example, one of the personal pages mentioned in the case study above ceased to exist by the time this text was finalized.