
Tell Us How You Really Feel: Analyzing Pro-Kremlin Propaganda Devices & Narratives to Identify Sentiment Implications

Samy Amanatullah, Serena Balani, Angela Fraioli, Stephanie M. McVicker, and Mike Gordon. The Propwatch Project

Illiberalism Studies Program Working Papers no. 14 February 2023

Photo Cover by John Chrobak made using “Russia” by AMCharts.com licensed under CC BY-NC 4.0; “Ukraine” by AMCharts.com licensed under CC BY-NC 4.0; “Владимир Соловьёв – 09 (07-02-2022)” by Информационное агентство БелТА licensed under CC BY 3.0; “Double-headed eagle of the Romanovs, Winter Palace, St. Petersburg (71)” by Richard Mortel licensed under CC BY 2.0; “Shiny onion domes from a russian orthodox church” by Old Photo Profile licensed under CC BY 2.0; “Sunflower sky backdrop” by Fir0002/Flagstaffotos licensed under CC BY-NC 3.0; “Taras Ševčenko 20160702 5136” by Branko Radovanović licensed under CC BY-SA 4.0.

The contents of articles published are the sole responsibility of the author(s). The Institute for European, Russian, and Eurasian Studies, including its staff and faculty, is not responsible for any inaccurate or incorrect statement expressed in the published papers. Articles do not necessarily represent the views of the Institute for European, Russian, and Eurasian Studies or any members of its projects.

©IERES 2023


Almost a year since Russia’s invasion of Ukraine, initial outrage and incredulity towards the invasion have congealed into a resigned acceptance of ongoing uncertainty, punctuated by narratives stressing contradictory, region-specific, and incompatible worldviews. That initial outrage spoke to a widespread disbelief among North American and European countries that such an event could be carried out under pretenses so divorced from mainstream Western media messaging and common geopolitical understandings. From this perspective, it represented the most severe subversion of the status quo since World War II.

A particular source of fascination has been the Russian public’s domestic support for the so-called special operation and the extent to which the general public genuinely accepts the denazification and Western-hostility narratives that have provided much of the invasion’s justification. On the one hand, the invasion, and Putin, still enjoy widespread support among the Russian public; on the other, the lack of pluralism in the Russian media landscape, strict crackdowns on dissent, and the threat of consequences for speaking out against the war have led some analysts to suggest that the interior feelings of the Russian public are inaccessible.[1] The role of pro-Kremlin messaging—including state-sponsored and state-supportive messages regardless of connection to official channels—has also been cited for its efficacy in managing Russian public perceptions of geopolitics and anti-Kremlin information.

Indeed, the last decade has seen a spike in attention paid to the Kremlin’s use of non-conventional devices that have now coalesced under the banner of information warfare, including hybrid, asymmetric, and nonlinear warfare—all terms which speak to the supposedly innovative applications of propaganda and disinformation mobilized by the Kremlin (though, of course, not exclusively by it). Entrepreneurial, opportunistic uses of content—including of Western media—have been enabled by newly emerging technologies and resurgent opposition to a globalized, neoliberal economic system; they are being used to help promote narratives favorable to authoritarian systems. Many of these narratives draw on the importance of strong central leaders and the need for obedience at the expense of institutional transparency and personal freedoms. They often dismiss human rights and concepts of democracy as insincere, imperialist, and/or inefficient.

The polarization present in many Western political systems stands in stark contrast to the united front the Kremlin presents in the Russian public’s support for its leadership and, in particular, for the invasion. This has led many to wonder how and why Putin has been so effective at selling policies that, to Western powers, seem objectively false and detrimental to the public.[2] This is especially true of attitudes concerning the use of propaganda, which is the focus of this paper.

Given the perceived widespread support of the Russian people for the invasion of Ukraine, the blatant disinformation that has been used to justify it, and the efficacy with which the Kremlin’s narrative has withstood international pressure, there is much interest in identifying the strengths and innovations of the Kremlin’s propaganda usage. Through content and discourse analysis of pro-Kremlin media focused on Ukrainian and domestic Russian audiences, this study seeks to address the following questions:

  1. What propaganda devices have been employed to maintain support for Russia’s invasion of Ukraine?
  2. What factors have enabled the success of these devices?
  3. What aspects of the Kremlin’s narrative strategy and propaganda usage are innovative, new, or otherwise different from the past?

To address these areas, this study first describes its approach to defining and categorizing propaganda. This is followed by a brief summary of historical approaches to categorizing propaganda, and then a literature review of existing research on propaganda and disinformation devices used in pro-invasion messaging. A description of this study’s methodology leads into a summary and discussion of the narratives and propaganda devices identified by this research. An analysis of the narratives and techniques, rooted in Grice’s Cooperative Principle and Fairclough’s tenets of discourse analysis, is then used to identify the implications of Kremlin narratives, and based on those implications, assumptions of the Kremlin about broader geopolitical priorities.

Techniques & Typology in Propaganda Analysis

To define propaganda for this study, the authors use Jowett and O’Donnell’s definition: “Propaganda is the deliberate systematic attempt to shape perceptions and manipulate cognitions and direct behavior to achieve a response to further the desired intent of the propagandist.”[3]

Propaganda persuades in part by connecting the desired result with attitudes, symbols, or emotions that the audience already has strong feelings towards, using subtle, concealed suggestions to build a new attitude toward the desired result.[4]

Ellul describes the transition from information to propaganda as the distortion of public opinion, noting that it “employs psychological methods of influencing; it attempts to predetermine a decision; it involves one in a current of thought and violates both conscience and will.”[5] Propaganda can be a way to bind people together, often by the use of fear.[6]

Walton identifies propaganda as discourse wherein (a) the speaker’s goal is to have the audience take action, (b) the message is meant to persuade a mass audience in a given direction, whether using logical evidence or not, (c) the argument is partisan and advocates one side of an issue as strongly as possible, and (d) the actions that the speaker takes are justified by the results, even if those actions include dangerous activities.[7]

Propagandists can employ a wide variety of devices or techniques to achieve their goals. Techniques can be considered a set of cognitive devices influencing a receiver to bypass rational thought and prompt a response that furthers the desired intent of the propagandist, similar to a magician’s repertoire or “bag of tricks” that have been around for centuries.[8] Like the magician’s bag of tricks, there is no single collection or list of propaganda techniques that have been adopted by all.

For this work, the authors are referencing a list of techniques assembled by Propwatch Inc., an educational nonprofit started in 2018 and dedicated to raising public awareness of propaganda and disinformation. The organization uses a proprietary web platform that can catalog and cross-reference embedded video segments of propaganda techniques in authentic use. The list is based on the most common propaganda techniques that the organization has detected in authentic political content since its video analysis began in 2018. These techniques are often used during political speeches and debates, media appearances, and advertising campaigns.

Because of the wide variety of methods that can be used to achieve a propagandist’s intent, typologies are often used to group them into clusters, categories, or types based on a common set of behaviors or attitudes.[9] A typology is a “hierarchical system of categories used to organize objects according to their similarities and dissimilarities.”[10] The development of a typology helps researchers understand conditions or factors by grouping items with similar characteristics. This can help provide “a framework for processing, organizing, and understanding information.”[11] A typology is formed by grouping cases together according to their type based on features they have in common, while continuing to consider the ways in which each individual case remains particular.[12]

Propwatch has categorized these techniques under the umbrella of five distinct categories: (a) distractions and diversions; (b) fear, uncertainty, and doubt; (c) oversimplification; (d) transfer and association; and (e) falsehoods and distortions.[13] The typology table in Appendix A is the product of categorization based on common cognitive characteristics that certain techniques share. Propwatch took a thematic approach to the development of its typology, allowing for in-depth exploration of each technique’s shared characteristics, patterns, and perspectives.
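
To make the grouping concrete, the sketch below renders such a typology as a simple lookup structure in Python. The five category names follow Propwatch’s typology; the technique assignments shown are illustrative examples drawn from devices discussed later in this paper, not a reproduction of the full Appendix A table.

```python
# A minimal sketch of a propaganda-technique typology as a lookup
# structure. Category names follow Propwatch's typology; the technique
# assignments are illustrative, not a copy of the Appendix A table.

TYPOLOGY = {
    "distractions and diversions": ["whataboutism", "muddy the waters"],
    "fear, uncertainty, and doubt": ["FUD", "dog whistle"],
    "oversimplification": ["labeling", "false equivalency"],
    "transfer and association": ["honor by association", "guilt by association"],
    "falsehoods and distortions": ["reversal of reality", "reverse social engineering"],
}

def category_of(technique: str):
    """Return the typology category for a technique, or None if uncataloged."""
    for category, techniques in TYPOLOGY.items():
        if technique in techniques:
            return category
    return None

print(category_of("dog whistle"))  # -> 'fear, uncertainty, and doubt'
```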

A Brief History of Propaganda Classification

An early classification of propaganda dates to the late 1920s, when Edward Bernays published his seminal work, Propaganda.[14] Approximately a decade later, the Institute for Propaganda Analysis (IPA) released its list of seven original devices of propaganda.[15] From 1937–1942, the IPA’s goal was to help people recognize the types of propaganda increasingly used in the public sphere in the lead-up to, and early years of, World War II.[16] Its list of seven common devices found in propaganda materials was an effort to combat the propaganda that had spread as the war approached.[17]

The 1930s–40s also gave rise to a new classification, the Colors of Propaganda, which posited three classes of propaganda: white, black, and gray.[18] White propaganda has clearly labeled origins and a transparent purpose, e.g., army enlistment posters and advertisements for war bonds. In this way, white propaganda is truthful, or, at the very least, based on truth and facts. Gray propaganda is information of questionable origin that is never sourced and is of doubtful accuracy. This form of propaganda often presents information that seems legitimate, but the genesis of this information, including the names of the groups releasing it, is almost never accurately sourced. Black propaganda is information put out by an opposing government or institution and made to look as though it came from a friendly source. Black propaganda uses partial truths as they are more believable than blatant lies.[19]

The IPA’s ABC’s of Propaganda and the Colors of Propaganda, however, were just two of many models of propaganda classification. Cunningham argues that propaganda exploits information, poses as knowledge, and generates belief systems, leading to the conclusion that propaganda is entirely unethical.[20] Propaganda has the tendency to reinforce existing beliefs. It is for this reason that Cunningham suggests that a propagandee is complicit in the spread of disinformation: because the message flatters what its recipient already believes, one sets oneself up to willingly accept it, and one thus becomes complicit in one’s own susceptibility to disingenuous or opportunistic messaging.

Another key framework comes from Conserva’s Propaganda Techniques, which looks at propaganda more neutrally. Conserva notes that “propaganda causes the reader to suggest, imply and assume; propaganda discourages reflection, reason and understanding.”[21]

Cialdini (1984) identifies different forms of persuasion, suggesting six principles of influence: reciprocity, authority, scarcity, commitment & consistency, liking, and consensus.[22] A seventh principle, unity, was added in Pre-Suasion: A Revolutionary Way to Influence and Persuade.[23] By unity, Cialdini refers to a shared identity of which both the influencer and influencee are part. The more we perceive others as part of “us,” the more likely we are to be influenced by them. This fits with the central theme of Pre-Suasion: creating a favorable state of mind just before the actual persuasion effort occurs.

Participatory, or peer-to-peer, propaganda marks an emerging approach to propaganda, in which a person can replicate the indoctrinated message within their own networks, allowing them to feel involved, to identify with the message, and to affirm their beliefs as true. Asmolov posits that social media first builds the object of a conflict that can potentially divide people, and then provides the technological tools to propagate that divisive idea. This, in turn, becomes an instrument of polarization and disconnection by creating a bubble that validates the version of facts to which people are exposed.[24]

A Brief History of Pro-Kremlin Disinformation Classification

Russia’s ability to manage information through the promotion of strategic narratives has been a source of begrudging respect, even among its critics.[25] Domestic support for the invasion of Ukraine, even if that support exists in a black box the West can scarcely authenticate, provides the Kremlin a foundation of legitimacy, especially in contrast to openly politicized Western and Indo-Pacific democracies. Of particular concern is the question of how much of the Kremlin’s approach is novel and to what extent it is simply old wine in new bottles.

The Debate Over Technology’s Impact

Technology has advanced exponentially in the past decade, cultivating more opportunities in global and domestic governance than once thought possible. Fast-growing social media sites influence the transfer of information to a dizzying degree. Technological advancement has affected the dissemination of propaganda narratives for both the audience and the propagandist. The question of whether contemporary Russian propaganda is fundamentally different from its Soviet counterpart has resurfaced as reports of Kremlin interference in foreign elections have emerged consistently, reports the Kremlin typically dismisses as Russophobia.

A general observation across scholarship is that while Russian propaganda techniques may have grown out of Soviet-era strategy, the Soviets were more concerned with disinformation, while the Kremlin today concerns itself with misinformation.[26] However, the role that technology plays in this strategic growth is a point of divergence within the scholarly community, generally following two schools of thought.[27] Kilkenny provides a literature review that presents these two schools in the context of twenty-first-century propaganda.[28]

The first understands Russian propaganda campaigns as fundamentally unchanged with respect to Soviet-era propaganda, as the regime’s goals have persisted. For example, both Soviet and modern Kremlin information efforts seek to discredit, influence, and spy on others. Technology makes that work easier; it does not turn it into something different.[29]

The second school of thought considers the groundbreaking effect that technological advancement has had on propaganda as a means of disseminating information, and as a relationship between propagandee and propagandist, arguing that this advancement is so powerful that it will create new paths to enacting these campaigns at a much higher rate of success. This theory postulates that technology will facilitate change to such a degree that propagandists will now be able to achieve ends which were previously unimaginable.[30] While the former is concerned with the fundamental goals of a propagandist, the latter is focused on the methods the propagandist uses to achieve those goals and, through these new methods, the potential to create new goals along with it.

In support of the former perspective is the argument that Moscow’s tools of propaganda in the West are still fairly basic, relying on the same aspects of human psychology that propaganda has always relied on.[31] Additionally, the viral nature of social media for information-sharing and the public’s vulnerability to it are cited as contributory factors to technology’s ability to further the aims of propagandists.[32] However, even this perspective concedes that the fast-paced advancement of technology may provide for more effective, surreptitious infiltration of democratic states through the use of propaganda. This would be an acknowledgment of an amendment, growth, or maturation of methods, although this argument still holds that the goals of the propagandist remain the same.[33]

For example, computational propaganda, which is the programmed dissemination of propaganda narratives across social media and online platforms, is currently detectable, but technology is advancing at such a rate that soon this strategy will become virtually undetectable.[34] While it can be argued that many of the propaganda-centric goals of this new regime are shared by past ones, Polyakova and Boyer acknowledge that a greater emphasis has been placed on blurring the distinction between fact and fiction, and propagating division in foreign countries, in the past decade.[35]

This acknowledgement addresses what many scholars consider a fundamental change in propaganda campaigns as brought on by rapid technological advancement.[36] These scholars may find themselves in greater support of the latter school of thought surrounding propaganda and emergent technologies. Regardless, across the conversation, scholars have come to the general consensus that the digital tsunami of technology has paved the way for, not necessarily the creation, but at least the efficacy of, certain propaganda techniques.[37] For example, conspiracy theorists are the largest source of propaganda throughout Europe by a large margin, and it is the emergence of the online platform and its viral nature that has allowed these conspiracy theorists to infiltrate the forefront of mainstream news.[38] The speed with which information travels online has allowed these theories to spread like wildfire, while rendering them nearly impossible to debunk.

The virality of such sites and apps is most effective when paired with an audience that is devoid (whether forcibly or by choice) of alternative, less viral sources for news, such as private news channels without the popularity algorithms that many social media apps possess. One study found that nearly a quarter of US adults use TikTok as their first source of news and that, broken down generationally, almost one-half of Americans ages 10–25 use TikTok for daily news.[39] While the same study has not been done in Russia, the parallels for the future of propaganda usage are stark and may demonstrate some validity behind the latter school’s argument that a change in methods results in a change in goals.

From Disinformation to Misinformation

Given the nature of online and social media platforms, misinformation has never been easier to disseminate to a global population, replacing the disinformation aims of the past.[40] A study of young Russians’ attitudes toward the news found that when watching Channel One, the most common source of Russian television news, many young Russians were not blind to the traditional propaganda techniques being utilized by Russian newscasters and were not as easily manipulated as generations past.[41]

Without an ability to control the narrative due to the virality of social media and the Internet, some scholars have found evidence of a Russian strategy to stray from disinformation and lean into misinformation, employing a firehose of falsehood method.[42] A study comparing the vaccine narratives the Kremlin disseminated in Serbia and Ukraine found that pro-Russian Serbia was provided with clear and consistent narratives, whereas anti-Russian Ukraine was fed confusing and muddled ones.[43] This supports the claim that new goals, namely confusion, are developing out of the new methods available to propagandists.

Whereas clear but inaccurate disinformation campaigns have been used in the past, Russia is disseminating misleading, unclear misinformation campaigns to select audiences over which the regime has a weakened level of control. The ability to confuse through online platforms including social media is the greatest change that technological advancement has provided propagandists, and the mass migration of much of the public from traditional television news to online news viewing will only act as a catalyst in this new wave of propaganda.[44]

In a comparison with China during the outbreak of the COVID-19 pandemic, Foster looks at this scholarly consensus of confusion propaganda through a slightly different lens, using what she calls “epistemological nihilism.”[45] She defines this as “the form of philosophical skepticism that holds that knowledge either does not exist or that, if it does, it remains unreachable.” When disseminated to vulnerable populations, employing this tactic may “poison the well,” priming the population with a sense of overwhelming doubt towards any new information they receive, no matter its accuracy.[46]

Adapting old methods to fit modern levels of technological advancement is central to the contemporary Russian propaganda strategy. Miller describes Russia’s post-Euromaidan propaganda strategy as “incredibly flexible,” a necessity in the uncharted territory of online news and social media.[47] The Russian regime is able to implement multiple narratives at once to cultivate fear and confusion in their target audience. Additionally, Miller concludes that the regime has traded its responsive 2010s approach during Euromaidan for a more proactive approach in Ukraine in the 2020s. During the 2010s, the Kremlin controlled the narrative mainly through labeling after the fact; in the 2020s thus far, the Kremlin disseminated anti-Ukrainian propaganda well before launching its first attack on Ukraine, establishing the narrative rather than merely controlling it.[48]

Generally, Russian propaganda is understood to have developed in four ways since the Soviet era. First, pro-Russian narratives have been introduced into Russian cultural centers, think tanks, and social media. Second, the methods of propaganda have been updated in accordance with technological advancement. Third, Kremlin propaganda is now flexible enough to be tailored to the psychological makeup of each country receiving propaganda. This flexibility allows the Kremlin to adapt their message to be most effective in each state in which it is employed. Finally, the Kremlin has harnessed the “openness of Western media” to its advantage.[49]

Exploring Narratives and Techniques

These four developments have all influenced Russia’s creation of contemporary propaganda narratives, alongside its resurgence of old, powerful narratives. Viral narratives, such as conspiracy theories, are being pursued along with more emotionally driven propaganda. For example, a common narrative disseminated to both Russian and Ukrainian audiences is the “soft” narrative that encourages a nostalgia for the shared history of Russia and Ukraine, emphasizing the “brave past world” they have endured together.[50]

One of the most popular narratives circulating from the Russian propaganda machine in the 2020s is the “denazification” narrative.[51] While Russian leaders had invoked the presence of Nazism in Eastern European countries throughout the preceding decade, such references increased significantly after the 2014 annexation of Crimea and skyrocketed again on February 24, 2022, the day of the first documented attack in the invasion of Ukraine.[52] This narrative has been effective for domestic Russian audiences because Nazism is not as strongly linked to religion or antisemitism as it is in Western shared history, a fact that befuddled many in the United States and Europe and explains why many Russians would believe that a Jewish-Ukrainian president could be a Nazi. Russia’s history with and framing of Nazism allows it to function as a conflation or dysphemism for any enemy, particularly when accompanied by accusations of culture-based human rights violations. This has allowed President Putin to discredit the existing Ukrainian government and call for a new one to be installed, allowing him to control not only the narrative, but the potential of a new government as well.

Oates and Steiner contribute to the characterization of these narratives by arguing that, although the propaganda used in Ukraine and Syria is anchored in the style of the Soviet past, the narratives being explored are more “organic” than simply post-Soviet. Russia has pushed the “West against Russia” narrative from many angles, including the Russophobia lens, conspiracies around the Skripal poisoning, and the downing of Malaysia Airlines Flight 17.[53]

To achieve the plethora of contemporary narratives on the Russian propaganda docket, the techniques used by Russian leaders have expanded. A sister technique of the firehose of falsehood, asymmetric flooding, has been used by Russian leaders to confuse and misdirect their domestic audience.[54] Brandt compared the strategies used by both Russian and Chinese leaders and found overlap in techniques such as whataboutisms, conspiracy theories, and a myriad of anti-Western narratives.[55] Russian government officials have also developed a variety of foreign news sources to transmit Russian narratives on current events; this includes the use of online trolls, fake news, and even the use of deep fakes and artificial intelligence in some instances.[56]

Generally, the Russian agenda has strived to discredit opponents, especially amid the contentious war in Ukraine. A study comparing three separate Russian talk shows found commonalities in the use of “informational selectiveness,” or cherry picking, and the fabrication of equal representation of proponents and opponents of the regime in the studio audience.[57] This framing tactic was bolstered when audience reactions were framed as representing the united support of the nation in response to issues presented by Russian authorities.

One final aspect bearing discussion is the use of reflexive control. Today, Russian propaganda and disinformation campaigns can be categorized as either offensive or defensive. In an offensive campaign, reflexive control is a key component of the Kremlin’s persuasive strategy. It can be loosely defined as the presentation of strategically framed information to an opponent to induce that opponent to voluntarily make the decision favored by the propagandist. Reflexive control aims to shape the decision-making process while it is in progress, rather than waiting for the target to react to a decision and then managing their perception of it. Moreover, the goal is not to persuade the opponent, as that would be controlling the result of the process rather than the process itself, but to disrupt the process, pollute it with an excess of information, and paralyze decision-makers to the point of exhaustion.[58]

Methodology

This study collects and analyzes propaganda claims in headlines, memes, and similarly packaged content to identify strategic narratives used or invoked by the Kremlin and Kremlin-supporting actors to justify aggression against Ukraine and to maintain this aggression. An initial set of content was observed and collected from fact-checking and debunking sites from January 1, 2018, to June 30, 2018. Additional data was collected from periods leading up to the invasion and after. Similarly packaged content includes memes, video titles, social media posts, news report snippets, and any other content where the text accompanying the content serves the function of a headline.

Content Selection

Headlines and content with equivalent or adjacent functions to headlines—including memes, titles and captions for audiovisual materials, hashtags, and any other content that leverages overt signals and concise packaging against an issue’s depth of nuance and the heuristics needed to interrogate it—have been shown to be extremely effective at perpetuating disinformation. On social media, headlines are often the only part of a story that is read. News organizations change content titles based on the content’s placement and medium, including sensationalizing headlines on social media “in ways that wouldn’t be accepted on their own websites” to optimize them for maximum social media engagement.[59] Headlines used in disinformation are increasingly designed to grab attention and be amplified through engagement-driven algorithms, just as they are “designed to be liked and shared based on headlines and photos.”[60] Bot accounts often include only headlines, retweets, or memes when they reshare content.[61]

Disinformation is typically designed to present unverified claims as true or credible, instilling a sense of urgency and encouraging re-sharing. Such methods include hedging language and declarative sentences that assert the authority of dubious claims. Implications, even in the form of a question, have lasting links in readers’ minds and persist even if accompanying text offers rebuttals or clarifications.[62] Content analyses of demonstrably false or misleading news content suggest that such content is shorter and uses more hedging language and hyperbolic linguistic features.[63] These features support the notion that intentionally misleading content functions through heuristics rather than argumentation.[64]
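
As an illustration of how such surface features can be operationalized, the following Python sketch scores a headline on brevity, hedging, and hyperbolic cues. The word lists are assumed examples for demonstration only; an actual content analysis would rely on validated lexicons and human coding.

```python
import re

# Assumed, illustrative cue lists; a real study would use validated lexicons.
HEDGES = {"reportedly", "allegedly", "may", "might", "could", "possibly"}
HYPERBOLE = {"shocking", "destroys", "exposed", "outrageous", "unbelievable"}

def headline_cues(headline: str) -> dict:
    """Score a headline on surface features associated with misleading
    content: brevity, hedging language, and hyperbolic language."""
    tokens = re.findall(r"[a-z']+", headline.lower())
    return {
        "length": len(tokens),                           # misleading items tend to be shorter
        "hedges": sum(t in HEDGES for t in tokens),
        "hyperbole": sum(t in HYPERBOLE for t in tokens),
        "is_question": headline.rstrip().endswith("?"),  # implication via question
    }

print(headline_cues("SHOCKING: Kiev allegedly staged the attack"))
# -> {'length': 6, 'hedges': 1, 'hyperbole': 1, 'is_question': False}
```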

Framework

This study considers disinformation to be intentionally disingenuous content disseminated with the intention of changing stances and/or fortifying existing ones. In analyzing a subject that uses hybrid strategies, this study adopts a hybrid framework that combines aspects of linguistics, discourse studies, and content analysis.

Cooperative principle

The Cooperative Principle (CP), conceptualized by Paul Grice, holds that within verbal communication “each participant recognizes in them, to some extent, a common purpose or set of purposes, or at least a mutually accepted direction.”[65] From this, Grice outlined Kantian maxims that dictate effective discourse. These are:

  1. Quantity:
    1. Make your contribution as informative as is required (for the current purposes of the exchange).
    2. Do not make your contribution more informative than is required.
  2. Quality: Try to make your contribution one that is true.
    1. Do not say what you believe to be false.
    2. Do not say that for which you lack adequate evidence.
  3. Relation:
    1. Be relevant.
  4. Manner:
    1. Avoid obscurity of expression.
    2. Avoid ambiguity.
    3. Be brief (avoid unnecessary prolixity).
    4. Be orderly.

To some degree, these are subjective and dependent on the standards of users, and they are not equal in effect of violation: “a man who has expressed himself with undue prolixity would, in general, be open to milder comment than would a man who has said something he believes to be false.”[66] Flouting these conventions is also a means of communication. Satire, for example, is the intentional disregard of the quality maxim.

While the CP was designed for conversation, it has been applied to text by, for example, Dor; Al Kayed and Kitishat; and Ecker, Lewandowsky, Chang, and Pillai.[67] As argued by Kheirabadi and Aghagolzadeh,[68] media functions as a conversation between publishers and audiences. Publishers are expected to conform to these maxims, and audiences respond with attention and financial support. The interactive nature of online media has strengthened this aspect of the relationship. Of course, flouting and violating these maxims is a reality of the news media. Clickbait, for example, is the intentional flouting of the manner and quality maxims. Violations, which are intentional and covert, are antithetical to the stated function of independent, balanced media.

Applied to the analysis of both propaganda and disinformation, Grice’s maxims presuppose the widespread, systematic violation of discourse norms. “Examining on-line behavior, not content, in order to detect deception is most important,” and the CP bridges content and behavior.[69] Disinformation is the mobilization of violations, and violations are behavior. Thus, this study analyzes media content as discourse that is actively deceptive.
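
To show how maxim violations can be treated as codable behavior, the sketch below records suspected violations for a single claim. The enum mirrors Grice’s four maxims; the example headline and its annotation are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum

class Maxim(Enum):
    QUANTITY = "quantity"
    QUALITY = "quality"
    RELATION = "relation"
    MANNER = "manner"

@dataclass
class ClaimAnnotation:
    """One coded claim and the Gricean maxims it is judged to violate."""
    headline: str
    violations: set = field(default_factory=set)
    note: str = ""

# Hypothetical annotation: a debunked claim asserted as fact (quality)
# with deliberately vague sourcing (manner).
example = ClaimAnnotation(
    headline="Observers hint the attack was staged",
    violations={Maxim.QUALITY, Maxim.MANNER},
    note="Asserts a debunked event; sourcing left deliberately vague.",
)
print(sorted(m.value for m in example.violations))  # ['manner', 'quality']
```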

Critical discourse analysis

With this understanding, critical discourse analysis (CDA) is applied to disinformation. Discourse analysis is an interdisciplinary approach frequently described “as the analysis of language ‘beyond the sentence.’”[70] Initially, it was conceived of by Zellig Harris “as the discovery of patterns of formal equivalences across the sentences in a text,” and was adopted by Noam Chomsky “to focus it more exclusively on the constituent relations within the sentence itself.”[71] The work of linguist Roger Fowler, in particular in Language and Control, provided a foundation on which much of CDA has been established.[72] CDA is the application of analysis to power structures and their rhetorical means. Building on this, van Dijk defines CDA as the study of social power and discourse, and specifically, how “abuse, dominance, and inequality are enacted, reproduced, and resisted by text and talk.”[73] Ramanathan and Tan note several features of CDA when applied to media: it is problem-oriented, interdisciplinary, rooted in social research, interested in unethical issues, and recommends spontaneous assessment and constant reevaluation.[74]

CDA has been applied to news media for various purposes. van Dijk’s News As Discourse and New(s) Racism are influential applications of CDA to news media, and Fairclough’s Language and Power has been influential in applying CDA to media discourse.[75] CDA has been applied to social media, for example by Farkas, Schou and Neumayer on social media hate speech.[76]

This study primarily uses Fairclough’s social theory with a limited application of van Dijk’s ideology theory. van Dijk’s ideology theory conceptualizes four principles framed around the “us versus them” dichotomy:

  1. emphasis of positive things about us;
  2. emphasis of negative things about them;
  3. de-emphasis of negative things about us; and
  4. de-emphasis of positive things about them.[77]

This is also referred to as an ideological square. Fairclough’s social theory proposes a three-dimensional model that includes description (analysis of text), interpretation (discourse in text production and interpretation), and explanation (situational, institutional, and societal effects).[78]
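
Read as a coding scheme, the square reduces to a two-by-two lookup: who a proposition is about (“us” or “them”) and its valence determine whether aligned discourse foregrounds or backgrounds it. The Python sketch below is a minimal rendering of that mapping; the input labels are our assumptions for illustration.

```python
# A minimal rendering of van Dijk's ideological square as a 2x2 lookup:
# (who a proposition is about, its valence) -> the expected discursive move.
# The string labels are assumptions made for illustration.

IDEOLOGICAL_SQUARE = {
    ("us", "positive"): "emphasize",
    ("them", "negative"): "emphasize",
    ("us", "negative"): "de-emphasize",
    ("them", "positive"): "de-emphasize",
}

def expected_move(group: str, valence: str) -> str:
    """Predict how discourse aligned with 'us' handles a proposition."""
    return IDEOLOGICAL_SQUARE[(group, valence)]

print(expected_move("them", "negative"))  # -> 'emphasize'
```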

Given the rhetorical challenges of disinformation—its goals of inciting outrage, its nebulous origins in power structures, its lack of concern for truth or logic—social theory allows for a multi-dimensional analysis of descriptive and discursive features. It incorporates means of production and context-based interpretation in analysis, which is crucial for understanding audience engagement with disinformation. Finally, aspects of explanation consider power relationships and transformative action, which are important features of the discourse within and about disinformation. 

The content that forms the data sets was collected from fact-checking and debunking sources. When a claim appeared on a website during the period specified, the headline as it appeared on the disinformation source was first copied and coded according to semantic topics. This research excluded claims that were deemed to be at least half-true, as determined by a fact-checking website. Each coded group was analyzed for commonalities of actor, object, stance, sentiment, and implication. Headlines were then interpreted through the CP and social theory and presented in terms of description, interpretation, and explanation.
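
A minimal sketch of the coding record implied by this procedure follows. The field names mirror the analysis dimensions named above; the sample values and rating labels are hypothetical stand-ins for the fact-checkers’ actual verdict taxonomies.

```python
from dataclasses import dataclass

@dataclass
class CodedClaim:
    """One collected headline coded along the study's dimensions."""
    headline: str
    topic: str        # semantic topic code
    actor: str
    target: str       # the 'object' dimension of the claim
    stance: str
    sentiment: str
    implication: str
    rating: str       # verdict assigned by the fact-checking source

def keep(claim: CodedClaim) -> bool:
    # Claims rated at least half-true were excluded from the corpus;
    # these rating labels are assumed, not the sites' actual taxonomy.
    return claim.rating not in {"true", "half-true", "mostly-true"}

sample = CodedClaim(  # hypothetical example record
    headline="Ukraine plans provocations against Russia",
    topic="ukraine-hostile", actor="Ukraine", target="Russia",
    stance="accusatory", sentiment="negative",
    implication="Ukraine is a threat", rating="false",
)
print(keep(sample))  # -> True
```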

Source Selection & Study Limitations

The initial data was collected from two sources. The first is StopFake. Started as a volunteer project by the Kyiv Mohyla Journalism School and the KMA Digital Future of Journalism to analyze Kremlin propaganda,[79] it has since expanded its scope. Data from StopFake is augmented by the European External Action Service East StratCom Task Force’s EUvsDisinfo website.[80] This site collects and catalogs Kremlin disinformation efforts, listing markets and publishing outlets. Information is included if it is “a) false, which is determined by the facts and b) originating and in line with identified pro-Kremlin disinformation messaging.”[81] For each claim, the website provides a summary of the disinformation, explains falsities, and provides links to explanatory information. The website has revised its wording to “not necessarily imply that the outlet concerned is linked to the Kremlin or pro-Kremlin, or that it has intentionally sought to disinform.”[82] In this sense, it might theoretically include misinformation.[83]

This study comes with limitations. The collected data have been disproven and shown to be false or otherwise disingenuous, but do not encompass the totality of narratives and are not statistically representative. It is important to note that discourse analysis has the potential to draw broad conclusions or posit widespread trends based on a limited corpus.

Finally, this study analyzes content translated from other languages, primarily Russian and Ukrainian. While StopFake and EUvsDisinfo operate or offer content in English, the original analysis is conducted in and based on non-English-language content. This has minimal effect on the results. The focus of this study is the identification and analysis of broad narratives and the devices undergirding them. While this limits analysis of lexical and grammatical features, the content analyzed is meant to display limited syntactic variation, utilize overt cues, and be as widely accessible as possible. Most of the material analyzed was originally released in Russian and/or Ukrainian and intended for Russian and Ukrainian audiences. In some cases, the claim originally appeared in a different language (including English, German, and Georgian) meant to communicate a certain message to a certain audience.

Kremlin Narratives & Pro-Kremlin Narratives

Following a spate of domestic reforms limiting civil society, the nonprofit sphere, and independent media, the Kremlin has enjoyed a substantial level of control over access to information. Polling suggests high domestic support for the Kremlin’s policies and for Putin; Western incredulity over this support was perhaps best epitomized by reports of Russian citizens dismissing and denying claims of Ukrainian civilian deaths and bombings even when those claims were made by relatives or close contacts in Ukraine actively witnessing the bombings.

What so many Russians falsely perceive as justification for the invasion of Ukraine, and the ensuing rational disconnect of the Russian domestic audience, may be explained in large part by a key set of false narratives the Kremlin has advanced since its 2014 annexation of Crimea. Analysis of 2018 content and additional analysis by Propwatch identified the following narratives:

  1. Russia is pure
  2. Russophobia is rampant in the West
  3. Ukraine is an illegitimate state
  4. Ukraine is weak, hostile, and/or a human rights violator
  5. Don’t trust the source (or The Truth F**k*)

The rest of this section summarizes each narrative, provides examples of pro-Kremlin/anti-Ukraine media that embody it, and identifies key implications concerning the narrative and its application.

1. Russia is pure

This narrative positions Russia as the defender of traditional values and history, a claim which usually operates in tandem with a siege mentality. This Russian traditionalism is framed as being in opposition to a decadent, immoral Europe, which itself is sometimes framed as being in the process of cultural erosion or decay due to the influence of the United States. This narrative frames Russian leadership as warrior-protectors of a grand tradition that is inherently and tacitly Christian and ethnic Russian, ignoring the plurality of ethnicities and religions that comprise the Russian Federation. This also posits an opposition to identity and gender politics, which can find sympathetic voices and amplification in transnational far-right messaging networks.

Sample media:

By positing its purity through the maintenance of historical norms and appeals to bygone values, Russian leadership can further project its own authority and validity to rule while simultaneously framing external actors as inherently hostile and ideologically imperialist. In tandem with the next identified narrative, this also instills a siege mentality, framing Russian culture as an imperiled, besieged, entity holding out against invaders and warmongers.

2. Russophobia is rampant in the West

Kremlin narratives promote the idea that Russia and Russians are targets of discrimination and fear. Dovetailing with narrative number one, this creates a sense that any criticism or hostility towards Russia is because of hostile actors’ animosity towards core Russian values. If opposition can be shown to be biased, then it is sufficiently discredited and does not need to be addressed.

Sample media:

Accordingly, the downing of Malaysia Airlines flight 17, poisonings of defectors and dissidents, and alleged war crimes in Syria and Ukraine are bluntly dismissed as Russophobia.

In this sense, this narrative has an insulating function. Such staunch denialism means that Russia never agrees to any agenda-setting involving these items, that international criticism of Russia is not accepted, that Russia acknowledges no substantial responsibilities to international law beyond those it chooses to, and that, in continuing its policies, the Kremlin has a steady stream of examples of Russophobia with which it can reinforce this narrative. With regard to the authoritarian playbook, the narrative also provides another reinforcing element. If authoritarian systems have a vested interest in reducing transparency and accountability by reducing plurality in media and personal expression, then this narrative creates opportunities for the Kremlin to reduce any externalities in messaging by holding that anything other than the party line is a lie meant to hurt the Russian people. Essentially, a successful mobilization of these first two narratives conflates, then binds, the fate of the Russian people with the Kremlin’s leadership by demanding that the public accept an existential threat to the leadership as an existential threat to the Russian people, culture, and history.

3. Ukraine is an illegitimate state

This narrative frames Ukraine as a puppet state, usually presented in three ways:

  1. Ukraine has always been illegitimate because it is Russian territory; its people, culture, and language are subsets of Russian ones
  2. Ukraine became illegitimate after the Maidan revolution
  3. Ukraine continues to be illegitimate because it is a pawn of the West and/or NATO

This narrative makes multiple assumptions: that the Maidan, and other so-called color revolutions, are Western-backed coups; that Ukraine is an artificial state; that a sort of transnational deep-state conspiracy engineered public discontent in Georgia, Estonia, Ukraine, and other states that have expressed pro-NATO, pro-EU, or anti-Kremlin sentiment; and that some cabal of powers, involving or adjacent to NATO and/or the EU, is actively plotting against Russia.

Sample media:

Perhaps most alarming is the steadfast denial of Ukraine’s existential legitimacy and the validity of its culture. Those familiar with the history of both nations and cultures may be less surprised by attempts to completely negate the agency of the Ukrainian people, but this narrative also presents an example of an authoritarian trend to cover all bases in messaging.

Kremlin supporters may disagree as to why Ukraine is illegitimate, but this narrative provides multiple reasons. These reasons may contradict each other and further alienate critics, but they cast a wide net, enabling the narrative to convince parties sympathetic to one argument over others and providing flexibility in argumentation. If someone contradicts Ukraine’s historical illegitimacy, then a sympathetic party can counter with another approach to this illegitimacy.

Another opportunity created by this narrative is the ability to deflect criticisms in other nations and regions—Kazakhstan’s 2022 unrest is a recent example—where Russia’s sphere of influence is fading. By insisting that any civil action against governmental institutions is caused by Western-backed influence, authoritarian actors have more space and flexibility to navigate domestic and international messaging.

4. Ukraine is weak and/or hostile

While narrative three insists on Ukraine’s illegitimacy, implying that it is the property of Russia, this narrative posits that Ukraine is a bad-faith actor. As such, this narrative has drawn the most attention and confusion in the West in its manifestation of Ukraine as a Nazi state and the invasion as a “special operation” focused on “denazification.” Accordingly, the Kremlin offers splintered reasons: Ukraine is an aggressor against Russia in the sense that it fiendishly exists to facilitate a NATO invasion of Russia; it is a Nazi state that persecutes its Russian population and/or plans to attack Russia via neo-Nazi militias; or Ukraine is an inept state incapable of handling national duties and thus lacks the ability to defend itself (and therefore belongs under Russia), or its negligence (as with nuclear power) presents a threat to its neighbors and the world.

Sample media:

As with the previous narrative, this one offers a multitude of uses and interpretations that can superficially deflect criticism but can easily be exposed as contradictory. Either Ukraine was never a legitimate state and should be absorbed by Russia, or Ukraine is a delinquent state incapable of governance and should be managed by Russia.

The “Ukraine as aggressor” stance held here reinforces sentiments of Russian purity and protecting a grand tradition. This invocation of World War II rhetoric and sentiment has been well-documented and discussed in the media. The befuddlement of mainstream Western media over the salience of the claim that Ukraine is a neo-Nazi state despite having a Jewish head of state has perhaps revealed limitations and differences in discourse on World War II.

This last notion prioritizes one particular sub-narrative, namely that Ukraine violates human rights, in particular against ethnic Russians.

Sample media:

As a subset of the weak/hostile narrative, this narrative insists that Ukraine is either engaging in human rights violations or unable to prevent them within its territory. Ultimately, the narrative positions Ukraine as indefensible by Western moral standards. If Ukraine engages in human rights violations, then Russia has a pretext to intervene, and others are discouraged from defending Ukraine.

The invocation of this narrative suggests a willful attempt to game the parameters of public discourse using Western values. U.S.-led messaging about World War II tends to reflect a good-vs-evil struggle against fascism and racial/cultural supremacy, recognizing the Holocaust as a red line. The refrain of “never again” in the post-war era is largely seen as a rallying cry for the human rights regime that has been enshrined in the Universal Declaration of Human Rights as well as the two overarching human rights treaties, the International Covenant on Civil and Political Rights and the International Covenant on Economic, Social, and Cultural Rights.

The idea that human rights, epitomized by this regime, are culturally biased, and their promotion is cultural imperialism, has become a common refrain of authoritarian actors, but such claims are not necessarily novel and have achieved considerable validation because of the hypocritical, opportunistic, and inconsistent approaches to them by the powers that claim to uphold them. Because of this, it is easy for authoritarian actors to dismiss accusations of human rights violations as attempts to attack a nation’s sovereignty, and this narrative resonates with post-colonial societies.

Therefore, a narrative that Ukraine is a human rights violator, particularly a genocidal, supremacist one, means that any state that supports Ukraine must be willing to weather accusations of being a human rights offender. Anyone who argues against this narrative must face a deluge of easily digestible Russian messaging and must mount a more detailed and nuanced response against a counterargument that is much simpler. Propagandists can generally count on a wide swathe of the public tuning out the more complex answer in favor of a more immediately comprehensible one.

5. Don’t trust the source (or The Truth F**k*)

Disruption and destabilization have long presented opportunities for propagandists, so the facilitation of these in a given media environment is typically seen as advantageous for controlling an information ecosystem. Russia has long been an innovator in disinformation, with the 2008 invasion of Georgia and the 2007 cyberattacks on Estonia seeming, among Western scholars at least, to signal an evolution in its methods.

This might best be manifested in opportunistic interpretations of what is true, notably discussed in Pomerantsev’s Nothing Is True & Everything Is Possible and This Is Not Propaganda.[84] Paul and Matthews’ (2016) “firehose of falsehood” framing cited “high numbers of channels and messages and a shameless willingness to disseminate partial truths or outright fictions.”[85]

Kremlin disinformation has been noted for being rapid, continuous, repetitive, and without commitment to consistency, but a key aspect of it—and other pro-illiberal messaging—is its need to create an artificial reality that undermines international human rights norms by reorienting standards for truth and accuracy. In this narrative, there is a need for the propagandist to terraform their information environment through information and narrative laundering techniques such as reverse social engineering, false flag fact-checking, crisis actor narratives, and reflexive control.

In reverse social engineering, the propagandist creates fake content, debunks it as fake, and uses that as evidence that people should distrust their information ecosystems. First, Russia creates a fake story, including doctored images. Then their “fact-checkers” claim to debunk the story, showing that the images were altered. Then, they use this to claim evidence of widespread disinformation being peddled by Ukraine or their allies. Moshirnia refers to this as “false flag fact-checking,” the trick being that the image was never used except in their fact-checked story. This is then picked up by the Russian state media, claiming that it is evidence that Russians shouldn’t believe any social media about the war in Ukraine.[86]

Accusations of crisis actors and staged events are another attempt to accomplish this goal. Russian media notably amplified claims that the guerrilla humanitarian organization the White Helmets were crisis actors, the added implication being that the crisis they were responding to, the medical injuries they were treating, the suffering they were seeking to alleviate, all were faked. Within this report’s framework, where the use of devices and narratives can be interpreted as utterances complete with choices to be parsed, the projection of a widespread conspiracy can be another way of instilling a siege mentality—such a big lie must be in the service of an enemy with fierce and disingenuous motivations after all.

The downing of Malaysia Airlines Flight 17, which was destroyed over eastern Ukraine in 2014, is another prominent example. Many Russians were convinced that it was faked, justified, or done by Ukraine.[87] Igor Girkin, a rebel leader in eastern Ukraine, furthered a popular conspiracy theory when he said that the bodies on the plane showed no signs of blood, suggesting that the passengers had been dead long before the crash. Photos were shared online to further this theory. Others interviewed were convinced the plane was shot down by the Ukrainian government, or that it was President Putin’s plane, or that it was intentionally steered into a war zone by Ukrainian flight dispatchers who wanted it shot down.[88]

More recently, reporting on the deaths, mass graves, and credible evidence of atrocities in Bucha was denounced by Russian Minister of Foreign Affairs Sergei Lavrov, who notably took to Twitter to dismiss it as “another #hoax by the Kiev regime for the Western media” and a “fake attack.”[89]

Just as with the White Helmets example, where unrelated images of a film set were used to further the crisis actor narrative, Russia is deflecting allegations of war crimes and crimes against humanity by utilizing images that claim to show the war with Ukraine but are, in fact, from other events. One such example of this in action shows footage from an Australian climate change protest with live people in body bags who are not completely still. This has been recaptioned and mislabeled as happening in Ukraine with crisis actors.[90] This strategy is used often: unrelated content can be repurposed to serve denials of visual proof in order to deflect or, more often, project blame onto the victims and survivors. The dehumanization that comes with calling your victim a scheming, manipulative liar is presumably a welcome side effect.

This narrative, which we alternatively label “The Truth F**k*,” encapsulates the cynical approach to information and the opportunistic approaches that undergird pro-Kremlin narratives. They suggest that aspirationally objective and accurate information is unimportant, while manufactured narratives that can be recreated and redistributed and reinforced with the latest memes, soundbites, and informational opportunities of the latest news cycle are as valid as truth. The implications of this device, and the techniques that undergird it, will be explored in the next sections. But they point to a time-honored refrain: the ends justify the means, the strong rule with muscle and bone, the only thing that matters is victory—the strong do what they can, the weak endure what they must.

Devices As Narrative Building Blocks

This section discusses the propaganda techniques, as outlined by Propwatch, that are present in each narrative. For each narrative, the discussion of device use and implication is preceded by a review of the narrative and the sample media.

Among the most common techniques identified and discussed below are the following (where “N” indicates the narrative(s) in which the technique was used; a cross-reference sketch follows the list):

  1. Projection / reversal of reality (N2, N3, N4): accusing an opponent of using the same underhanded tactics or committing the same misdeeds the accuser is guilty of doing.
  2. Character assassination / demonizing (N2, N3, N4): characterizing a group or those who support an opposing viewpoint as threatening, immoral, or less than human.
  3. Appeal to tradition / honor by association (N1, N3): suggesting that a long-standing practice must be good or better than a newer alternative because it correlates with past or present tradition/championing public symbols that carry respect, authority, sanction, and prestige to assume the respect, authority, sanction, and prestige of those symbols.
  4. Appeals to false authority / muddy the waters (N2, N4): insisting something is true because someone posing as or being framed as an expert says it’s true/bringing up irrelevant facts to confuse or complicate an issue, which may otherwise be relatively simple and easy to understand.
  5. False equivalency / whataboutism (N1, N4): implying that two things are essentially the same, when they only have anecdotal similarities/throwing an irrelevant fact into an argument to divert attention from the real issue at hand/discrediting a criticism by alleging hypocrisy, in order to shift the focus away from oneself and onto others.
  6. FUD / dog whistle (N1): making dire warnings or raising doubt about an issue, while providing little or no specifics or evidence to support the claims/ambiguous messaging used to stoke racial fear and anxiety and/or to covertly signal allegiance to certain subgroups of an audience.
  7. Poisoning the well (N4): discrediting your opponent to an audience in advance, in order to encourage dismissing any future claims they may make against you.
  8. Appeal to pity (N2): portraying oneself as a victim in order to gain sympathy and manufacture justification for attacking your opponents.
  9. Crisis actors (N5): accusations of any evidence showing Ukrainian civilian casualties as staged, in order to instill the sentiment that you can’t believe anything you see.
  10. Reverse social engineering (N5): creating fake content only to later debunk it as fake, in order to instill the sentiment that no sources can be trusted.
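To make this mapping of devices to narratives easier to scan and reuse, the following is a minimal, illustrative Python sketch; the dictionary keys and the helper function are our own shorthand for the labels above, not an official Propwatch schema.

    # Illustrative sketch only: the paired devices listed above, mapped to the
    # narratives (N1-N5) in which this study found them. Names are shorthand
    # for this paper's labels, not an official Propwatch data structure.
    DEVICE_NARRATIVES = {
        "projection / reversal of reality": {"N2", "N3", "N4"},
        "character assassination / demonizing": {"N2", "N3", "N4"},
        "appeal to tradition / honor by association": {"N1", "N3"},
        "appeals to false authority / muddy the waters": {"N2", "N4"},
        "false equivalency / whataboutism": {"N1", "N4"},
        "FUD / dog whistle": {"N1"},
        "poisoning the well": {"N4"},
        "appeal to pity": {"N2"},
        "crisis actors": {"N5"},
        "reverse social engineering": {"N5"},
    }

    def devices_in(narrative: str) -> list[str]:
        """Return the devices recorded for a narrative label such as 'N4'."""
        return sorted(d for d, ns in DEVICE_NARRATIVES.items() if narrative in ns)

    # Example: devices_in("N4") returns the five devices tagged N4 above.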

1. Russia is pure

These are the devices that presented most prominently in this study's analysis of this narrative:

  • Appeal to tradition
  • Whataboutism
  • Labeling/scapegoating
  • Transfer, through
    • Honor by association
    • Guilt by association
  • FUD/dog whistle

Appealing to tradition allows the Kremlin to frame its policies and legitimacy around a struggle between traditional values (itself a dog whistle to Christian ethnic Russians) and identity politics typified by liberal messaging on gender politics, immigration, asylum policies, feminism, and other civil society movements. Whataboutism enables Russian actions to be perpetually justified by Western aggression. In essence, Putin and the Kremlin are always reacting to aggression; as reluctant defenders, forever on the defensive, they are forever justified in their actions.

Through the device of transfer, the Kremlin can engage Russian pride through honor by association and smear Ukraine via guilt by association. Kremlin propaganda champions public symbols that carry respect, authority, and prestige, transferring those virtues to the Russian leadership, which in turn insists that it passes them on to the Russian people. For example, by claiming to defend Christian values, the Kremlin frames itself as having Christian values and thus as worthy of leading a Christian nation. Similarly, anti-Ukraine narratives are built on guilt by association, which allows the Kremlin to posit that Ukraine's engagement with pro-Western forces demonstrates that Ukraine is a confederate of the foreign elements instigating color revolutions. Both of these devices rely on labeling (Putin's Russia as hero, pro-EU Ukraine as villain) with an intrinsic scapegoating: purity is defined by the absence of (perceived) dirtiness; if Russia is pure and noble, it is so only in comparison to things that are not.

The uses of FUD (fear, uncertainty, doubt) and the ensuing dog whistling instill a siege mentality; indeed, the former utilizes the latter to create a conspiratorial mindset that insulates the audience from conflicting information and, when effective, unites it against a shared perceived enemy.

2. Russophobia is rampant in the West

  • Appeal to pity
  • Ad hominem
  • Demonizing
  • Appeals to [dubious] authorities to muddy the waters
  • Reversal of reality
  • Passing the buck

The Russophobia narrative benefits greatly from appeals to pity: by portraying Russia as a victim manipulated and abused by the West, the Kremlin gains sympathy and manufactures blanket justification for attacking its opponents. Much like the preemptive security rhetoric used by the Bush administration to justify the 2003 invasion of Iraq, this device posits that feeling threatened is the same as being threatened.

Through its use of ad hominem, the Kremlin deflects all criticism by accusing victims, accusers, and any other opposition of being, at the very least, prejudiced, which devalues their claims. In a geopolitical system where the Kremlin can no longer be competitive, its main capability lies in devaluing the narratives of others, partly by insisting that the opposition's bias makes its claims irrelevant. The entirety of the Russophobia narrative requires wide-scale demonization. When the Kremlin says it is being unfairly criticized, it tacitly asserts that something is persecuting Russia. After all, if Russia is not at fault, then someone else must be; the narrative inherently blames others for malign activities and insinuates (or outright accuses) that they are malign actors.

Appeals to compromised or otherwise dubious authorities work in tandem with another device, muddying the waters, to launder alternative and otherwise non-credible accounts into wider information environments. Like the wide-scale demonization in this narrative, the appeal to dubious authorities to muddy the waters is not so much a new device as a larger-scale mobilization of so-called “useful idiots.” Conspiracy theories, fringe media in predominantly anti-invasion countries, and fictionalized or manipulated sources are used to amplify narratives that otherwise wouldn't be seriously reported. As with aspects of US extremist networks, content is sometimes laundered by having a junk news site report on a rumor, followed by more credible outlets reporting on the existence of the rumor.

Reversal of reality, which shades into The Big Lie, is deployed in the face of criticism: Malaysian Airlines flight 17 was downed by Ukrainians. The Skripal poisoning was carried out by MI6. The “little green men” who seized control of Crimea were first patriots, then Russians. The use of this technique suggests an instrumental disposition toward truth and a desire to take advantage of Western nations' lingering uncertainties about Putin's sincerity.

Passing the buck, like reversal of reality, allows Russia to divert criticism and responsibility. Like the whataboutism of the first narrative, it removes agency from Russia and positions it as a reluctant defender. Ukraine's actions, Putin insists, forced Russia's hand: Ukraine brought this on itself, and Russia is merely doing what any power would do to protect itself, just as the United States insisted it was acting within its rights to prevent Saddam Hussein from achieving nuclear capabilities.

3. Ukraine is an illegitimate state

  • Character assassination
  • Projection + trolling = reversal of reality
  • Poisoning the well
  • Red herring
  • False equivalency
  • Muddy the waters
  • False dichotomy
  • Guilt by association
  • Proof by anecdote/appeal to dubious authorities (deployed at scale)

The narrative itself is a form of character assassination: a deliberate attempt to destroy Ukraine's reputation both domestically and internationally. This relentless criticism mobilizes labeling, which then transitions into demonizing, seeking to isolate Ukraine externally while eroding resistance to the Kremlin internally.

In its messaging about Ukraine, the Kremlin employs a form of projection so rooted in reversal of reality that it is tantamount to trolling. It holds Ukrainian democracy to be illegitimate. The projection is that democratic activity (the Maidan and subsequent democratic elections) is a farce, whereas widespread, credible reporting has shown that it is Russia that goes to great lengths to host heavily telegraphed elections meant to simulate democracy.

Kremlin messaging on Ukraine suggests a scorched-earth propaganda campaign, in which devices like poisoning the well, false equivalency, red herrings, and muddying the waters are all sprayed out to add to the confusion, inducing uncertainty and ultimately isolation. In these narratives, Ukraine isn't even a genuine state, so anything said about it is appropriate; its existence is an aggressive action against Russia. Alleging foreign interference by calling the Maidan a coup is a form of conflation meant to confuse, addle, and instill a siege mentality, using conspiracy theories as red herrings to suggest that support for Ukraine is support for an overarching globalist conspiracy. In this framing, Ukraine's democratic elections and pro-EU civil society are the equivalent of a NATO-led coup. This also contributes to a false dichotomy: Ukraine is either Russian or within Russia's sphere of influence; otherwise, it is a tool of NATO. There is no meaningful in-between. Guilt by association furthers this: Ukraine wanting to align with the EU is the same as it being a puppet state. Finally, appeals to compromised or otherwise dubious authorities, including through the use of proof by anecdote repeated ad nauseam, weaponize the strategic repetition of claims rooted in Russian paranoia narratives that are mostly inaccurate and at times outright fiction (claims about Georgian snipers present at the Maidan, for example).

4. Ukraine is weak/hostile

  • Character assassination and demonizing
  • The Big Lie and reversal of reality
  • Projection

As seen in the previous narrative, anti-Ukraine messaging tends to use character assassination extensively, which effectively becomes demonizing when repeated ad nauseam. Specific to this narrative, Ukraine is framed either as hostile (see the sub-narrative below) or as too irresponsible and/or corrupt to engage in statecraft. Devices like these coalesce to form more expansive, dramatic narratives, chiefly Big Lies and reversals of reality. Ukraine is framed as a danger to Russia and, at times, to its Western neighbors. This operates on a subtext: the Kremlin is signaling to Ukraine's Western neighbors that Ukraine is more trouble than it is worth. Judging by the initial reaction from Western countries, however, this message will be realized for the Kremlin only by playing the long game. This may serve as another indication that the Kremlin's propaganda playbook itself is not so new and different, but that the pages have been rearranged for maximum efficiency.

The Kremlin's use of projection is similarly telling in that it highlights many issues related to the rights of ethnic Russians without ever invoking indigenous rights frameworks. This presents a messaging vulnerability: the topic could draw attention to how the narrative of a traditional Russia excludes the quarter of its population who are not ethnic Russians, while Russia's claims to Crimea are undermined by its historical treatment of the Tatars.

The sub-narrative of Ukraine as a human rights violator, in particular against ethnic Russians, warrants a deeper look at some of its techniques:

  • The Big Lie and straw man
  • Contextomy (context mining) + dubious (misleading and/or baseless) claims
  • Guilt by association
  • Character assassination/demonizing
  • Dysphemism
  • Appeal to tradition
  • Projection – as seen by the 2014 example of the “little green men”: accusations are often also confessions (even inadvertently)

The Big Lie and straw man reflect the importance of seeing the use of devices and narratives as choices: most human rights issues in Ukraine are also relevant to Russia; however, Russia does not engage with the actual reasons for Ukrainian disenchantment with Russia, instead finding reasons to discredit Ukrainians' priorities on governance and, ultimately, their existence.

Contextomy and the use of dubious (misleading, baseless, or false) claims are wielded throughout all narratives, but their use within the human rights rhetorical space tends to be particularly entrepreneurial. Any spare piece of information can be used, twisted if need be, to support a given narrative. Since many of these narratives are based on Big Lies, reversals of reality, and projection, almost everything can be useful. In this sense, the presence of far-right militias such as the Azov Battalion provided an opportunity to discredit all of Ukraine. This also doubles as an appeal to tradition, allowing Kremlin messaging to exploit Russia's valiant history in World War II to rail against an enemy that is decidedly different in scope, aggression, and capability. Opportunism and scalability again emerge as novel aspects of Kremlin propaganda efforts.

Similarly, while character assassination and demonizing have already been discussed within this narrative, the characterization of Ukraine as a Nazi state, as it relates to Russian World War II sentiment, is reinforced by consistent and comprehensive dysphemism. This can be understood as extreme, automatic conflation: every opportunity to insult or negatively frame Ukraine heightens the wall of logic Russia has built, a form of rhetorical fortification.

Projection, as well as reversal of reality, is also evident in the hostility narrative and its sub-narrative. The quintessential example may be the “little green men” in Crimea: the Kremlin first claimed that their actions stemmed from Russian patriotism in the face of Ukrainian aggression rather than from Russian annexation, a claim Putin eventually corrected. In this sense, we can say that accusations are often, even inadvertently, also confessions.

5. Don’t Trust the Source (The Truth F**k*)

  • Reverse social engineering
  • Crisis actors
  • Muddy the waters
  • Poison the well
  • Red herring
  • Reflexive control

Devices like reverse social engineering create fake content so that it can later be debunked as fake, in order to instill the sentiment that no sources can be trusted. Narratives on crisis actors, like those lobbed at the White Helmets and the victims of Bucha, do the opposite by positing actualities as fake news. Claims that the Ukrainian Maidan was staged operate along the same pathway, which holds that authentic, coordinated, grassroots action is actually telegraphed, astroturfed manipulation by scheming international players.

Perhaps there was a time when propagandists had to choose between poisoning a well and muddying the waters, or had to wait for one before attempting the other. The disruption of information flows and the degradation of information delivery systems are both at play in this narrative: waters can be muddied and then poisoned, and the same is true of metaphorical wells in the modern era. These devices serve to erode confidence, further confusion, and inhibit effective decision-making, starting with the voter and creeping up into the political elite tiers that balance election fundraising with the optics necessary to achieve reelection. An ill-informed electorate dogged by confusion, increasing inequality, and political dysfunction will have more unrealistic, disjointed expectations of its leadership, and its leadership will undertake more performative, less meaningful political action to engage that electorate.

The Kremlin's use of conspiracy theories, reverse social engineering, and crisis actors is tantamount to red herrings. The ability to exploit political dysfunction, and the very real grievances created by it, reflects a shrewd use of reflexive control. For example, Kremlin disinformation in Western spheres often plays up issues at the professed heart of Western values, particularly freedom of expression. By playing up debates on censorship, Kremlin propaganda can manufacture and guide a response while obscuring the more prominent pathway it is targeting: access to information. As such, while the Western world continues to debate one of the defining civil-political rights issues of the last century, Kremlin messaging exploits the one scholars have identified as likely to be the defining issue of our time.

Hence the alternative, profane narrative title. It is perhaps a more accurate description of the Kremlin's approach to information quality and dissemination, as well as of the importance the Kremlin projects onto information for its audience. Truth, as a concept and in practice, is something to be manipulated and transformed into a tool. Its value is primarily utilitarian and almost always performative; like everything else in a hyper-commodified society, it has no value beyond the needs of the leadership. People are encouraged to believe that their feelings are correct even if the information that prompts those feelings is itself not true.

Comorbidities between Frameworks

In interrogating the extent to which Kremlin messaging is novel, this study has both summarized existing categorizations and frameworks for propaganda devices and identified propaganda devices within Kremlin narratives. This section provides a brief analysis of how the techniques identified by Propwatch in Kremlin narratives correspond to prior frameworks of propaganda classification. More detailed charts of this synthesis can be found in the appendices.

When compared to the list created by the IPA, the devices identified by Propwatch correspond explicitly to at least five of the seven devices noted. Propwatch's findings of “reversal of reality,” “false equivalency,” and “poisoning the well” correspond to the IPA technique of “card stacking,” while “honor by association” and “FUD” present as both “transfer” and, more specifically, “glittering generalities.” “Name calling” under the IPA framework corresponds to “dog whistle” and “demonizing,” while the Kremlin's “appeals to false authority” are a form of the IPA's “testimonial.” Two IPA techniques, “bandwagon” and “plain folks,” are less explicitly visible in the devices identified by Propwatch but are more diffuse in Kremlin messaging. Both tend to be at least latently present in Kremlin narratives about Russophobia or in any messaging that assumes a united Russian front. While these areas are more blatantly activated through devices more clearly linked to “glittering generalities” and “transfer,” it can be argued that both “bandwagon” and “plain folks” operate at earlier levels of persuasion, possibly acting as more rudimentary building blocks in narratives.

According to the Color Propaganda framework, white propaganda corresponds to Propwatch devices categorized as “distractions & diversions,” “transfer & association,” and “falsehoods & distortions.”[91] Gray propaganda is a mixture of “distractions & diversions” and “fear, uncertainty, & doubt,” while black propaganda engages “falsehoods & distortions.” Cunningham also employs a framework of color designations, with Propwatch's analysis suggesting that some devices can operate across multiple categories. The identified devices conform to Cunningham's categories of white, black, and gray propaganda, as well as to disinformation and the following types of propaganda: agitation, integration, bureaucratic, counter, hate, and “propaganda of the deed.”[92]

Across all of these comparisons, the dispersion of devices suggests that the techniques themselves are not novel, but the ways in which they are combined, arranged, adapted, or reoriented, coupled with advances in technology, allow propagandists to access mental pathways, interact with heuristics, and build more resilient narratives in ways that mark a departure from, or progression of, past behavior. Further suggesting a procedural effect, the Propwatch devices correspond to facets of reflexive control, according to both the four basic components identified by White and the eleven elements identified by Colonel Komov in the 1990s.[93] Across the four components of dismiss, distort, distract, and dismay, multiple devices operate individually, and in conjunction with others, to induce the desired effects. Across Komov's eleven elements, the procedural aspects are even clearer: distraction, overload (of information), paralysis, exhaustion, deception, division, pacification, deterrence, provocation, suggestion, and pressure.

This study now moves to explore the implications and alternative ways of understanding this process.

Implications of Kremlin Device & Narrative Usage

The focus of this research was discerning insights from the Kremlin’s use of propaganda narratives. A central question of this study was what, if anything, is new about Russian propaganda, narratives, and disinformation mobilization. A comparison of Propwatch’s analysis with prior frameworks suggests that many of the devices and narratives used in Kremlin messaging are not novel so much as their application is. As such, the propaganda is not new, but aspects of how it is constructed and used are.

Based on the analysis of propaganda devices and the narratives, a number of implications have been identified.

Recreation of a battlefield in digital space

During their rise in popularity, various social media outfits touted themselves as new digital public squares. Russian propaganda echoed this ethos in a different way. While Kremlin approaches are not entirely unique, Russia's approach suggests the recreation of a battlefield: a never-ending, movable No Man's Land or demilitarized zone for the digital era. Digital spaces are new ideological battle zones in which cascades of activity interact with those spaces to posit positive framings, negate opposing ones, launder narratives, and amplify favorable viewpoints. This has been enabled by technologies that emerged in the late 1990s and early 2000s, particularly social media. Public space can no longer be neutral. In Western societies, where similar tactics have been deployed, this is manifesting physically in formerly neutral or low-priority theaters (school board meetings, etc.). An approach that the previous generation's propagandists could only dream of is now a reality.

At least publicly, liberal democracies are still fighting a battle of narratives along the digital DMZ model identified above. Authoritarian blocs, on the other hand, control domestic access to information and, increasingly, the modes and contexts by and in which content is produced and internalized. Reporting and arguments stressing the importance of appealing to emotions and of questioning the validity of facts to sway opinion are likely touching on this issue without registering its root cause. The control and gaming of heuristics in authoritarian/illiberal cycles subvert public discourse such that narratives become lines to toe and hold rather than vehicles for discussion, compromise, and the formation of policy. This lack of reciprocity, and its weaponization, is one of the defining aspects of asymmetrical warfare.

Creating plausible narratives to be believed is less important than having narratives that maintain a front or provide sufficient cover for dissent. Domestically, this means that Russian and pro-Kremlin publics are, at the very least, coaxed and addled into compliance and, at most, inspired and mobilized to support the leadership. Internationally, such narratives can provide sufficient cover for national and multinational entities to remain neutral or to abstain from outright condemnations.

Finally, the use of propaganda devices indicates that Putin's actions at least tacitly invoke the Bush Doctrine of preemptive strikes, suggesting that Russia's motivations are more concerned with what is currently permissible in the international order.

The I’s have it

Based on this analysis, the authors propose a framework that can be described as “The I's Have It,” composed of five elements across four steps (a schematic sketch follows the list).

  1. Isolate: control access to information and funding for it.
  2. Intimidate & Intensify: cow the public (for example, through crime enforcement or domestic chaos) to create insecurity and existential dread.
  3. Insulate: some plurality can be tolerated to give the illusion of equivalencies and balance, but narratives should be domestically produced, with external ones edited and tailored to domestic audiences, so that outside and conflicting narratives are easily dismissed or are reoriented to entertain the public.
  4. Incite: society should be conditioned to support initiatives and respond strongly to perceived insults and threats; it should also be conditioned to move in waves so that no one person can make substantial change; the public should feel that disruptions will result in crackdowns, not change.
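The framework's shape, five I's distributed over four steps because the second step bundles two elements, can be made explicit in a short, purely illustrative sketch; the step summaries are abbreviations of the descriptions above, not additional claims.

    # Illustrative sketch only: the "I's Have It" framework as an ordered
    # structure. Step 2 bundles two elements (Intimidate & Intensify), which
    # is why the framework counts five elements across four steps.
    THE_IS_HAVE_IT = [
        ("Isolate", "control access to information and funding for it"),
        ("Intimidate & Intensify", "cow the public to create insecurity and existential dread"),
        ("Insulate", "keep narratives domestically produced so conflicting ones are easily dismissed"),
        ("Incite", "condition society to expect that disruptions bring crackdowns, not change"),
    ]

    elements = [e for step, _ in THE_IS_HAVE_IT for e in step.split(" & ")]
    assert len(THE_IS_HAVE_IT) == 4  # four steps
    assert len(elements) == 5        # five I's: Isolate, Intimidate, Intensify, Insulate, Incite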

As much a cookbook as a playbook

Although the Kremlin's propaganda apparatus is sometimes referred to as a playbook,[94] a cookbook may be an equally apt analogy. Narratives are composed of propaganda devices and techniques, which can be used in isolation but are more often used in tandem, interacting to game heuristics, manage conflicting sentiments, intensify reactions, and amplify reach. They can be combined to create different effects. Rather than a game in which different devices serve distinct functions in the hope of creating an anticipated result, the devices that form narratives are mobilized for given contexts and audiences, with elements alternating and changing to fit the sociopolitical dispositions of a given audience. This is reflected in the widespread scaling of certain techniques using recently emerged technological capabilities to reorient and disseminate narratives in ways that were unthinkable for previous generations.

Many of these devices and their ensuing narratives use human rights issues as a sort of skeleton key. Strategic uses of rights-based language and concerns are meant to trigger responses that trick the opposition into highlighting its own hypocrisies and inconsistencies, creating more opportunities for propagandists. The “5 I's” above form a broad playbook: the narratives are akin to dishes or meals, the devices and techniques are analogous to ingredients, and propaganda campaigns are recipes.

Assumptions of Putin & authoritarian messaging

The strategic use of these narratives, and of the devices that compose them, suggests that Putin and similar authoritarian actors hold certain assumptions about Western priorities and willingness to act. Among these is the implication that Western power structures are as cynical, corrupt, and disingenuous as those of the authoritarian bloc.

This is typified by the exploitation of human rights rhetoric. In this view, liberal democracies are primarily motivated by economic interests; everything else is rhetoric. Support for Ukraine will fade as the war drags on and Western economies suffer more. Liberal democracies are only willing to accept so much disruption to the status quo. Voters will not tolerate extreme shocks to their economies, and politicians will pivot accordingly. The West is in denial about the lack of competitiveness of its economic ideology in the face of new technologies, asymmetric warfare, and stances to which it has bound itself philosophically.

Moreover, the West has wrongly assumed the inevitable victory of its system in the wake of the USSR's collapse. The un- or under-examined belief that it was the economic system of the West that won out over the USSR may have provided a false sense of security about the efficacy of globalized, neoliberal systems to deliver the stability and growth they promised. Rather than a victory for the US-led order, the collapse of the USSR may have been more a product of decadent governance and the pursuit of ideology at the expense of practical policies. Western narratives reinforce the victory of consumer capitalism without acknowledging the roles that cronyism, institutional rot, and the degradation of information delivery systems crucial to maintaining a sustainably fed and housed society played in the USSR's dissolution. These narratives also often downplay the fact that the intelligence gap that left Western agencies unable to foresee the Soviet collapse is one of the most substantial intelligence failures in history.

In this sense, Putin seems to accept this Western narrative: his invasion may now finalize the defeat of the USSR by consumer capitalism because the man who controls Russia has decided that the USSR needs to be avenged. This strategic use of narratives suggests that Putin only knows how to define Russia through its reactions to the West, and particularly through the opportunities it can glean from nostalgia, which can be interpreted as the dissemination of appeals to pity. Melnikau suggests that the troll factory is “a purely Russian invention,” with Putin's true lasting contribution to the world order perhaps being the digitization of a form of serfdom.[95] That contribution may therefore be the innovation of applying advertising methodologies to the geopolitical sphere in the age of digital media, essentially taking China's early-2000s approach to content moderation and scaling it to international relations.

Conclusions

  1. The “5 I's” suggest a logical progression for seizing, maintaining, and weaponizing narrative control. This process facilitates control over a person's access to information and, ultimately, over their heuristics or thought process.
  2. Propaganda techniques are used to build strategic narratives, which in turn engender the “5 I's.” Technology facilitates new ways for techniques to be combined and to build more nuanced narratives.
  3. Propwatch’s methodology identifies the main functions of Kremlin propaganda as distraction, distortion, creating insecurity/instability, then providing solutions that play on mental associations.
    1. The ability to engage pre-existing attitudes is crucial.
    2. Analysis through Propwatch’s techniques suggests a double stream: enhance opportunities for Russia; enhance vulnerabilities for Ukraine; this corresponds with the functions of propaganda (convince the neutral; dissuade opposition; encourage supporters).
    3. The propaganda devices being used aren’t new, but the technology is, and the way the Kremlin uses these devices is rooted in the history of other innovations.
  4. Accusations are often confessions. Authoritarian actors tend to accuse the opposition of doing what they intend to do as a way to build consensus for actions that would otherwise be unacceptable among their own public. Bad faith actors don’t just tell us who they are, they tell us what they need to do. But only if we’re listening at the right frequency.

Appendix A

Categorization of Propwatch's propaganda techniques under an umbrella of five distinct categories: (a) distractions and diversions; (b) fear, uncertainty, and doubt; (c) oversimplification; (d) transfer and association; and (e) falsehoods and distortions (The Propwatch Project, n.d., paras. 1-66). The categorization is based on common cognitive characteristics that certain techniques share. Propwatch took a thematic approach to the development of this typology, allowing for in-depth exploration of each technique's shared characteristics, patterns, and perspectives.
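As a purely illustrative sketch, this five-category typology can be represented as a simple mapping; the technique lists are abbreviated to two examples per group here, and all names come from the full table below.

    # Illustrative sketch only: Propwatch's five groups with two example
    # techniques each; full membership appears in the table that follows.
    TYPOLOGY = {
        "Distractions & Diversions": ["Ad Hominem", "Whataboutism"],
        "Fear, Uncertainty, & Doubt": ["Demonizing", "FUD"],
        "Oversimplification": ["False Dichotomy", "Proof by Anecdote"],
        "Transfer & Association": ["Guilt by Association", "Labeling"],
        "Falsehoods & Distortions": ["Reversal of Reality", "The Big Lie"],
    }

    def group_of(technique: str) -> str | None:
        """Return the typology group for a technique, if listed."""
        for group, techniques in TYPOLOGY.items():
            if technique in techniques:
                return group
        return None

    # Example: group_of("Whataboutism") -> "Distractions & Diversions"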

Distractions & Diversions: Propaganda techniques that rely on distraction or diversion, by shifting attention away from someone or something under scrutiny.
  • Ad Hominem: Attacking the character or motive of the person making an argument, rather than attacking the argument itself.
  • Adding Qualifiers: Adding an extra word or phrase to a response, which makes it ultimately meaningless, but still leaves the desired impression. Also known as “inserting loopholes.”
  • Minimization: Characterizing something that you don't want to address as trivial or insignificant, in order to shift the focus away from it and onto “more important” things.
  • Muddy the Waters: Bringing up irrelevant facts to confuse or complicate an issue, which may otherwise be relatively simple and easy to understand.
  • Poisoning the Well: Discrediting your opponent to an audience in advance, in order to encourage dismissing any future claims they may make against you.
  • Projection: Accusing an opponent of using the same underhanded tactics or committing the same misdeeds the accuser is guilty of.
  • Red Herring: Throwing an irrelevant fact into an argument to divert attention from the real issue at hand.
  • Whataboutism: Discrediting a criticism by alleging hypocrisy, in order to shift the focus away from oneself and onto others.

Fear, Uncertainty, & Doubt: Propaganda techniques that heighten anxiety and/or raise doubt, making it harder to think rationally and easier to draw conclusions that might be counter to logic or common sense.
  • Appeal to Ignorance: Raising doubt by suggesting something is true because it has not yet been proven false.
  • Demonizing: Characterizing a group or those who support an opposing viewpoint as threatening, immoral, or less than human.
  • Dog Whistle: Ambiguous messaging used to stoke racial fear and anxiety and/or to covertly signal allegiance to certain subgroups of an audience.
  • FUD: Making dire warnings or raising doubt about an issue, while providing little or no specifics or evidence to support the claims.
  • Scapegoating: Placing unmerited blame on a person or group to channel societal resentment and frustration towards a common adversary or powerless victim.
  • Slippery Slope: Suggesting that major inevitable consequences will occur by permitting any incremental course of action.

Oversimplification: Propaganda techniques that take advantage of the tendency in human nature to prefer simple solutions or magical answers, regardless of how complex an issue might be.
  • False Dichotomy: Giving the impression that there are only two opposing choices or options, while ignoring any middle ground that exists between the two extremes.
  • False Equivalency: Implying that two things are essentially the same, when they only have anecdotal similarities.
  • Glittering Generalities: Vague words or phrases used to evoke positive emotional appeal, without presenting supporting information or reason.
  • Proof by Anecdote: Making a broad generalization, based on an individual story or stories that support that generalization.

Transfer & Association: Propaganda techniques that use certain words or mental imagery to instill positive or negative emotions associated with those words or imagery.
  • Bandwagon: Creating social pressure to conform by promoting a sense of inevitable victory.
  • Common Folk: Establishing a connection with an audience based on being just like one of them and being able to empathize with their concerns.
  • Dysphemism: Replacing neutral language with more derogatory or unpleasant terms, to instill a negative association.
  • Euphemism: Replacing accurate language that may be offensive with language that is more palatable, to instill a positive association.
  • Fault as Virtue: Presenting a weakness as a strength, by focusing on any positive aspect of it.
  • Guilt by Association: Using an opponent's links to another to assign the other's beliefs, misdeeds, or other unattractive qualities to the opponent.
  • Honor by Association: Defending or championing cultural sacred cows, which transfers the respect, authority, sanction, and prestige associated with those symbols.
  • Hyperbole: An extravagant statement or figure of speech used for effect, not intended to be taken literally.
  • Labeling: Pigeon-holing a person or group into a simple category and assigning names and/or beliefs to that category.
  • Slogan: A brief, striking phrase that people will remember, which typically acts on emotional appeals.
  • Virtue Words: Using words that are attractive to the value system of the target audience.

Falsehoods & Distortions: Propaganda techniques that attempt to fabricate the truth through lies, distortions, testimonials, repetition, or by focusing on just kernels of truth.
  • Ad Nauseum: Repeating something over and over again, until it forms a mental association and/or becomes perceived as truth.
  • Appeal to Anonymous Authority: Insisting something is true because an unnamed expert, study, or generalized group (like “scientists”) says it's true.
  • Appeal to Compromised Authority: Insisting something is true because an expert on the issue says it's true, when that expert has a vested interest in the outcome.
  • Appeal to False Authority: Insisting something is true because someone posing as or being framed as an expert says it's true.
  • Baseless Claim: A statement that is presented as accepted or established fact but is wholly undecided or unsubstantiated.
  • Cherry Picking: Presenting only evidence that confirms your position, while ignoring or withholding an often more significant portion that contradicts it.
  • Exaggeration: Stretching the truth, to make something seem more powerful or meaningful than it actually is.
  • False Claim: A statement that is directly contradicted by fact and can be easily proven untrue.
  • Half Truth: A statement that is essentially true, but lacking critical information and presented as the whole truth.
  • Misleading Claim: A statement with a few elements or kernel of truth, which can easily be proven deceptive or fundamentally untrue.
  • Out of Context: Removing a passage or quote from its surrounding context in such a way as to distort its intended meaning.
  • Post Hoc: Proclaiming that because something occurred after X, it was caused by X, when no causal relationship may exist at all.
  • Reversal of Reality: A statement that is not only verifiably false but is the exact opposite of the truth.
  • Straw Man: Misrepresenting an opponent's position or argument to make it easier to attack, usually by exaggerating, distorting, or just completely fabricating it.
  • The Big Lie: Telling and repeating a lie so bold and audacious that people will be inclined to think there must be some truth to it.


Appendix B

This appendix lists the propaganda claims by relevant narrative. In addition to each claim and its fact-checked link, the source of the claim and the date of publication are included. For claims in a language other than Ukrainian or Russian, the original language is noted in parentheses.

Narrative 1. Russia Is Pure

Narrative 2. Russophobia is rampant in the West

Narrative 3. Ukraine is an illegitimate state

Narrative 4. Ukraine is weak and/or hostile

Sub-narrative: Ukraine as aggressor

Appendix C

This annex features comparisons of Propwatch's devices with the frameworks mentioned in the paper: the IPA's classification, the color propaganda model, and the frameworks of Cunningham, Conserva, and Cialdini. Devices identified in these frameworks are compared to the propaganda devices Propwatch identified in its analysis of Kremlin propaganda claims.

IPA Classification for Propaganda → Our Findings

  • Bandwagon (convincing others that “everybody's doing it”): Diffuse and foundational; used to build consensus around other devices through implication.
  • Card Stacking* (the use of falsehoods or facts to give the best or worst possible case for an idea or person):
    • Reversal of reality – it is necessary to “stack the deck” against an opponent in order to make reality appear reversed.
    • False equivalency – by relying on anecdotal similarities, false equivalency shares the foundational purpose of card stacking.
    • Poisoning the well – in order to discredit an opponent in advance, “stacking the deck” against that opponent is required; card stacking techniques are needed to produce a poisoned well.
  • Glittering Generalities* (vague, broad statements that connect with the audience's beliefs and values):
    • Honor by association – the strategy is grounded in the use of sometimes vague but generally recognized symbols of respect and authority to transfer these values onto the content of the propagandist's speech; this relationship is exemplified by the use of honor by association in Narrative 1.
    • FUD – alternatively, this device can be interpreted as sharing a theoretical basis with glittering generalities, but for the purpose of producing fear and confusion rather than trust and agreement. The fear-mongering implicit in FUD draws on a lack of facts and specificity, finding a likeness with the vague and “glossy” language used in glittering generalities.
  • Name-Calling (labeling an idea as bad, therefore rejecting and condemning it without evidence):
    • Dog whistle – FUD's cousin, this device shares a strong theoretical basis with name-calling. Essentially labeling, name-calling's effects are bolstered when racial and cultural biases are roped into the device's execution; the dog whistle is a perfect example of this phenomenon.
    • Demonizing – although name-calling is not necessary for the success of the technique, it is implicit in the process of demonization. If name-calling is used to demonize, the repetition of such names may result in character assassination, a device found repeatedly in this study and best exemplified in Narrative 4 as relevant to demonization and name-calling.
  • Plain Folks (convincing an audience that an idea is good because its proponent is “of the people,” thereby connecting with their points of view): Diffuse and foundational; used to build consensus around other devices through implication.
  • Testimonial (having a well-known person reinforce a product or cause): Appeal to false authority – a person's title is the salient factor in the success of the persuasive device.
  • Transfer (associating something that is respected, such as an American flag, so that its respectability and authority transfer to the propagandist): See glittering generalities, honor by association.*
Colors of Propaganda → Our Findings

  • White Propaganda (propaganda whose origin is clearly labeled and which has a transparent purpose): Distractions & Diversions; Transfer & Association; Falsehoods & Distortions.
  • Black Propaganda (information put out by an opposing government or institution and made to look as though it came from a friendly source): Falsehoods & Distortions.
  • Gray Propaganda (information of questionable origin that is never sourced and whose accuracy is doubtful): Distractions & Diversions; Fear, Uncertainty, & Doubt.
Cunningham's Model → Our Findings

  • White Propaganda (propaganda that uses facts and truthful messages in a persuasive manner; although it relies on truth, it is presented in a biased manner):
    • Reverse social engineering – outrightly uses facts and truthful messages in a biased manner.
    • Appeal to tradition / honor by association – uses the truth in a persuasive manner by associating itself with it.
    • Poisoning the well – can use biased truth to discredit the opponent.
  • Black Propaganda (the reverse of white propaganda; black propaganda relies on lies or erroneous information):
    • Projection / reversal of reality – relies on lies to convey information.
    • Character assassination / demonizing – relies on false information to condemn a group.
    • Appeals to false authority / muddy the waters – may rely on sharing lies to complicate or confuse an issue.
    • FUD / dog whistle – no evidence or specifics need to be offered when using this technique.
    • Poisoning the well – can use erroneous information to discredit the opponent, who has no chance to correct it on the spot.
    • Appeal to pity – can use erroneous information to ask for pity from a large group.
    • Reverse social engineering – uses erroneous information to influence opinion.
  • Disinformation (information that is intentionally designed to be misleading and to propagate rumors and assumptions):
    • Projection / reversal of reality – information intentionally crafted to influence public opinion.
    • Character assassination / demonizing – false information crafted to influence people's opinion of a group.
    • Appeals to false authority / muddy the waters – the expert offering the “truth” may be fake.
    • False equivalency / whataboutism – potentially, fake information could be used in this method as long as there is some similarity.
    • FUD / dog whistle – fake information could be created for this technique, as no proof needs to be provided.
    • Poisoning the well – can use fake information to discredit the opponent, who has no chance to correct it on the spot.
    • Appeal to pity – can use fake information curated to move a large audience to pity the propagandist.
    • Crisis actors – fake experts used.
  • Agitation Propaganda (a form of propaganda that calls attention to a social or political problem, using a variety of emotional messages to generate outrage, fear, or anger):
    • Character assassination / demonizing; FUD / dog whistle; poisoning the well.
    • False equivalency / whataboutism – can cause agitation and fear, adding to the ongoing confusion.
    • Appeal to pity – has the potential to cause agitation and fear in people in order to garner pity from them.
    • Reverse social engineering – causing fear and agitation is part of its larger objective.
  • Integration Propaganda (a form of propaganda that calls for unity to a cause or group; integration propaganda calls for people to join a movement):
    • Character assassination / demonizing – outrightly calls for group involvement.
    • Appeal to tradition / honor by association – has the potential to call for grouping, since it asks people to believe something honorable is on the same side as “the people,” the “idea,” or the “person,” but does not outrightly call for it.
    • Reverse social engineering – can be used at a large scale to influence a group of people to take action.
    • False equivalency / whataboutism – has the potential to cause a group to act if a large number of people agree.
    • Poisoning the well – potentially influences a larger audience to discredit/dismiss the opponent.
    • Appeal to pity – a technique that calls for groups to unite in order to act.
  • Bureaucratic Propaganda (the use of reports and statistics to convey a point of view; bureaucratic propaganda masks itself as legitimate scientific findings):
    • Appeals to false authority / muddy the waters – has the potential to influence people by conveying irrelevant information full of facts and stats.
    • False equivalency / whataboutism – facts can be associated with the information used in this technique.
    • Poisoning the well – the information used in this technique could be backed by stats and numbers that are masked as legitimate.
    • Appeal to pity – can use reports masked with legitimacy to appeal to pity.
    • Crisis actors – claims to prove reports and stats incorrect.
    • Reverse social engineering – uses fake reports and stats to influence its target audience.
  • Counterpropaganda (counteractive propaganda designed to nullify or reverse an opponent's propaganda message; counterpropaganda inadvertently provides feedback to the original propagandist): Since it is defensive in nature, it does not apply to any of the devices.
  • Hate Propaganda (a form of agitation propaganda that assigns blame for a problem to a person, race, or nationality; perhaps the most prolific form of propaganda in the twentieth century, its major goal is to demoralize the enemy):
    • Character assassination / demonizing – assigns blame to a group.
    • Appeal to tradition / honor by association – has the potential to create hate by suggesting a group is not associated with the “honorable” factor.
    • FUD / dog whistle – used to stoke racial fear against a larger group; even aligning itself with a subgroup can suggest another group as an enemy.
    • Appeal to pity – can be used to get support from one group to attack another.
    • Crisis actors – in this case, used to hide evidence of attacks in Ukraine.
  • Propaganda of the Deed (symbolic acts that rely on media attention to convey the message; deeds follow the adage “actions speak louder than words” and, through the use of video and photography, transcend language barriers):
    • Appeal to tradition / honor by association – has the potential to use media (video and pictures) to portray the message.
    • Appeals to false authority / muddy the waters – has the potential to share irrelevant facts via media to complicate an issue.
    • FUD / dog whistle – can use media to convey its message.
    • Appeal to pity – messages can be conveyed via media.
    • Crisis actors – uses media to convey messages.
    • Reverse social engineering – uses media to convey its messages.
Cialdini's Principles of Persuasion → Our Findings

  • Reciprocity (the mutual expectation for an exchange of value or service): No corresponding devices were identified in this study.
  • Authority (trust is central to the purchase decision):
    • Appeal to tradition / honor by association – by associating with a tradition or honorable organization, you assume that form of authority and suggest you are on the same side.
    • Appeals to false authority / muddy the waters – fits covertly into this principle, as it uses false authority to convey a message to large groups.
    • Reverse social engineering – by catching false news that was originally shared by your own sources, you become a false source of authority for people to trust.
  • Scarcity (you want what you can't have, and it's universal):
    • Character assassination / demonizing – by suggesting that a group doesn't deserve certain rights that another group is entitled to, this can fall under scarcity.
  • Commitment & Consistency (oral communication can be slippery in memory): No corresponding devices were identified in this study.
  • Liking (safety is the twin of trust as a foundational element of effective communication; if we feel safe, we are more likely to interact and communicate):
    • Character assassination / demonizing – by promising safety once a group is demonized, this potentially falls under this category.
    • Appeal to tradition / honor by association – aligning yourself with an honorable association or tradition can convey a sense of safety to those who believe in them.
    • Appeals to false authority / muddy the waters – if information is conveyed by an authority that makes the audience feel safe, is physically attractive, or has similarities with the audience, people are more likely to listen.
    • Reverse social engineering – creates a false sense of trust by fact-checking false news propagated by “sources” that are actually your own.
  • Consensus (testimonials, or first-person reports on experience with a product or service, can be highly persuasive):
    • Projection / reversal of reality – has the potential to be highly persuasive if the account comes across as a first-person report or testimonial shared with the public.
    • Appeals to false authority / muddy the waters – by appealing to false testimonials and authority, audiences could potentially form a consensus in the propagandist's favor.
    • Poisoning the well – if testimonials and first-person reports are included in the argument, it could prove influential.
    • Appeal to pity – could come across as a first-person account or testimonial, thus justifying the reason for attack.
  • Unity (the more we perceive people as part of “us,” the more likely we are to be influenced by them):
    • Character assassination / demonizing – by grouping the audience into an “us” vs. “them” scenario, they will be influenced.
    • Appeal to tradition / honor by association – by appealing to tradition, you form a sense of unity with those who align with your message.
    • FUD / dog whistle – suggesting an alliance with a group or subgroup can incite a feeling of unity among people who align with those groups as well.
Conserva's Techniques → Our Findings

  • Faulty Logic (the use of simplifications, appeals to inappropriate authorities, condemning the origin, biased sampling, and faulty analogies):
    • Character assassination / demonizing – can be carried out using simplifications and biased sampling (e.g., the presence of Russian speakers in Ukraine means the land is Russia's).
    • Appeals to false authority / muddy the waters – faulty in nature because it appeals to a false authority; the information it sends out to cause confusion can contain false analogies and biased sampling.
    • False equivalency / whataboutism – faulty in nature, though it does not map directly onto the definition.
    • FUD / dog whistle – raises doubts about issues and origins and has the potential to share biased sampling, as no specifics or evidence are required by this technique.
    • Poisoning the well – condemns the origin, or in this case the opponent, in order to increase the chances of people dismissing any claims made against the propagandist.
    • Crisis actors – by suggesting that something like an attack is being staged by crisis actors, the propagandist condemns the origin of the argument.
  • Diversion and Evasion (ad hominem, accusing the accuser, satire, name-calling, choosing a scapegoat, and fear of the wicked alternative):
    • Projection / reversal of reality – fits in with fear of the wicked alternative.
    • Character assassination / demonizing – chooses a scapegoat to cause a diversion.
    • Appeal to tradition / honor by association – can cause a diversion by scapegoating a section of the people, suggesting that they are not associated with the same honorable sources as the propagandist.
    • Appeals to false authority / muddy the waters – evasive in nature and causes a diversion, though it does not fit this definition exactly.
    • FUD / dog whistle – evasive and diverts from the issue, as it can stoke racial fears using name-calling and scapegoating while causing fear by suggesting wicked alternatives.
    • Poisoning the well – overtly accuses the accuser in order to get public opinion on the propagandist's side.
    • Appeal to pity – accuses the accuser while putting itself in a position of pity in order to justify the actions being taken.
  • Appealing to Emotion (traditions, demand for special consideration, personification, and the use of “hot and cold” (overtly emotional) words):
    • Projection / reversal of reality – has the potential to use overtly emotionally triggering words while accusing the opponent.
    • Character assassination / demonizing – uses national personification to justify the takeover of Ukraine, while using overtly emotional words.
    • Appeal to tradition / honor by association – rooted in tradition and may use emotionally triggering words.
    • FUD / dog whistle – has the potential to appeal to emotion, as emotionally triggering statements can be made to raise doubt in the public's mind.
    • Poisoning the well – using overtly emotional words is potentially part of the messaging used in this device.
    • Appeal to pity – calls for the propagandee to give special consideration to the situation of attacking the target, as it calls for pity.
  • Falsehood and Trickery (quotes out of context, false dilemmas, exaggeration of consequences, appeal to ignorance, false urgency, and foregone conclusions):
    • Projection / reversal of reality – has the potential to exaggerate consequences and to ask its audience to appeal to ignorance.
    • Character assassination / demonizing – has the potential to create false dilemmas and exaggerate consequences.
    • Appeal to tradition / honor by association – has the potential to exaggerate consequences to suggest the opponent is straying from tradition.
    • Appeals to false authority / muddy the waters – creates false dilemmas and confusion.
    • FUD / dog whistle – overtly related to this technique, as false dilemmas, exaggeration of consequences, and a sense of false urgency can add to warnings and raise doubts in the mind of the propagandee.
    • Appeal to pity – could potentially exaggerate consequences in order to garner pity.
    • Reverse social engineering – creates a sense of foregone conclusions, as audiences come to believe that no source can be trusted.
  • Playing on Human Behavioral Tendencies (repetition, slogans, testimonials, using a bias, stimulating curiosity, and utopian or dystopian fantasies):
    • Projection / reversal of reality – has the potential to suggest a dystopian fantasy being developed by the opponent.
    • Character assassination / demonizing – uses a bias and creates a dystopian fantasy to show the negatives of the situation.
    • Appeal to tradition / honor by association – overtly creates a bias by suggesting the opponent is straying from tradition, thus setting up a dystopian vs. utopian fantasy.
    • Appeals to false authority / muddy the waters – uses false experts for opinions, which creates a bias.
    • FUD / dog whistle – the messaging in this device creates a dystopian fantasy for the propagandee to witness.
    • Poisoning the well – a bias is created in the audience's mind once the propagandist poisons the well.
    • Appeal to pity – could potentially create dystopian fantasies in order to garner pity.
  • Techniques of Style (shock, proverbs, emphasizing one point, the shotgun approach, and making statements that face no rebuttal):
    • Character assassination / demonizing – has the potential to shock its audience and make statements that receive no rebuttal by monopolizing where people get their information.
    • Projection / reversal of reality – has the potential to make statements that face no rebuttal.
    • Appeal to tradition / honor by association – emphasizes one point: straying from traditions and honorable forces like the church, religion, or the army.
    • Poisoning the well – by its nature, does not allow for rebuttal, as people dismiss any allegations made against the propagandist.
  • Techniques of Reason and Common Sense (metaphors, “yes, but,” portraying weaknesses as strengths, and the use of famous quotes):
    • False equivalency / whataboutism – uses “yes, but” in its communication.
    • Appeal to pity – rationalizes the attack by garnering pity with “yes, but.”

[1] Yasmeen Serhan, “Why Russian Support for the War in Ukraine Hasn’t Wavered,” Time, August 4, 2022; Peter Dickinson, “More than three-quarters of Russians still support Putin’s Ukraine War,” Atlantic Council, June 6, 2022.

[2] Shadi Hamid, “Why the Russian people go along with Putin’s war,” The Atlantic, April 23, 2022; Dina Smeltz and Lily Wojtowicz, “Russians think they’re engaged in a heroic struggle with the West,” The Washington Post, April 14, 2022; Adam Forrest, “Huge majority of Russians believe Putin propaganda ‘and cannot be reached’, says head of shut-down TV station,” Yahoo News, April 9, 2022.

[3] Garth Jowett and Victoria O'Donnell, Propaganda and Persuasion, 7th ed. (Thousand Oaks: SAGE Publications, 2019), 6.

[4] Hadley Cantril, "Propaganda analysis," The English Journal 27, no. 3 (1938): 217-221.

[5] Jacques Ellul, “Information and propaganda,” Diogenes 5, no. 18 (1957): 66.

[6] Marshall Soules, Media, persuasion and propaganda (Edinburgh: Edinburgh University Press, 2015).

[7] Douglas Walton, "What is propaganda, and what exactly is wrong with it?" Public Affairs Quarterly 11, no. 4 (1997): 396-400.

[8] Mike Gordon, personal communication, July 27, 2022.

[9] "Typology," The Association for Qualitative Research, n.d.

[10] Jelani Mandara, "The typological approach in child and family psychology: A review of theory, methods, and research," Clinical Child and Family Psychology Review 6 (2003): 132.

[11] Mary White, "Examples of typology: Definitions and use across different disciplines," YourDictionary.com, n.d.

[12] Emily Stapley, Sally O’Keeffe, and Nick Midgley, “Developing typologies in qualitative research: The use of ideal-type analysis,” International Journal of Qualitative Methods, 21 (2022).

[13] The Propwatch Project, "Propaganda techniques," The Propwatch Project, n.d., 1-66.

[14] Edward Bernays, Propaganda (New York: Horace Liveright, 1928).

[15] Michael Sproule, “Authorship and origins of the seven propaganda devices: A research note,” Rhetoric and Public Affairs 4, no. 1 (2001).

[16] The Institute for Propaganda Analysis, "Propaganda," Southern Methodist University, n.d.

[17] Michael Sproule, “Authorship and origins of the seven propaganda devices: A research note,” Rhetoric and Public Affairs 4, no. 1 (2001).

[18] Clyde Miller and Violet Edwards, “The intelligent teacher’s guide through campaign propaganda,” The Clearing House: A Journal of Educational Strategies, Issues and Ideas 11, no. 2 (1936): 69-73.

[19] Garth Jowett and Victoria O'Donnell, as cited in Truda Gray and Brian Martin, "Backfires: White, black and grey," Journal of Information Warfare 6, no. 1 (2007): 7-16.

[20] Stanley Cunningham, The idea of propaganda: A reconstruction (Westport: Praeger, 2002).

[21] Henry Conserva, Propaganda Techniques (1st Books, 2003).

[22] Robert Cialdini, Influence: The psychology of persuasion (Morrow, 1984).

[23] Robert Cialdini, Pre-suasion: A revolutionary way to influence and persuade (New York: Simon and Schuster, 2016).

[24] Gregory Asmolov, “The effects of participatory propaganda: From socialization to internalization of conflicts,” Journal of Design and Science 6 (2019).

[25] Tod C. Helmus et al., "Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe," RAND Corporation (2018); Jennifer Hansler, "US accuses Russia of conducting sophisticated disinformation and propaganda campaign," CNN, August 5, 2020; David Klepper, "Russian online propaganda? Meta says yes, shuts down network," The Christian Science Monitor, September 27, 2022.

[26] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018); Noel Foster in Felix Mölder et al., "Propaganda gone viral: A theory of Chinese and Russian 'COVID diplomacy' in the age of social media," The Russian Federation in Global Knowledge Warfare (Springer International Publishing, 2021): 123-145; Elizabeth Kilkenny, "Russian disinformation—The technological force multiplier," Global Insight: A Journal of Critical Human Science and Culture 2 (2021); Martin Innes and Andrew Dawson, "Erving Goffman on misinformation and information control: The conduct of contemporary Russian information operations," Symbolic Interaction 45, no. 4 (2022).

[27] Braeden R. Allenby, "The paradox of dominance: The age of civilizational conflict," Bulletin of the Atomic Scientists 71, no. 2 (2015): 60-74; Nicholas J. Cull et al., "Soviet subversion, disinformation and propaganda: How the West fought against it," LSE Consulting (2017); David Grimes, "Russian fake news is not new: Soviet Aids propaganda cost countless lives," The Guardian, June 14, 2017.

[28] Elizabeth Kilkenny, “Russian disinformation—The technological force multiplier,” Global Insight: A Journal of Critical Human Science and Culture 2 (2021).

[29] Ibid.

[30] Ibid.

[31] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018).

[32] Ibid.

[33] Ibid.; Heiko Pleines, "Media Control as Source of Political Power: Differentiating Reach and Impact," Russian Analytical Digest, no. 258 (2020): 2-7.

[34] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018).

[35] Ibid.

[36] Dan Ciuriak, "The role of social media in Russia's war on Ukraine," Ciuriak Consulting, Inc. (April 8, 2022); Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018).

[37] Jukka Ruohonen, "A few observations about state-centric online propaganda," in Proceedings of the ACM Conference 2017, USA (2021); Dan Ciuriak, "The role of social media in Russia's war on Ukraine," Ciuriak Consulting, Inc. (April 8, 2022); Elizabeth Kilkenny, "Russian disinformation—The technological force multiplier," Global Insight: A Journal of Critical Human Science and Culture 2 (2021).

[38] Jukka Ruohonen, "A few observations about state-centric online propaganda," in Proceedings of the ACM Conference 2017, USA (2021).

[39] Jaqueline Evans, “War in the age of TikTok,” Russian Analytical Digest no. 280 (2022): 17.

[40] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018).

[41] Anna Llanos-Antczak and Zdzislaw Śliwa, "Manipulation and propaganda in the Russian media: The case of the Vriemia news programme (2017-2019)," Contemporary Economics 15, no. 4 (2021): 511-523.

[42] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018); Christopher Paul and Miriam Matthews, "The Russian 'firehose of falsehood' propaganda model," RAND Corporation (2016).

[43] Katerina M. Keegan, “Clarity for friends, confusion for foes: Russian vaccine propaganda in Ukraine and Serbia,” Harvard Kennedy School Misinformation Review 3, no. 3 (2022).

[44] Alina Polyakova and Spencer P. Boyer, "The future of political warfare: Russia, the West, and the coming age of global digital competition," Brookings – Robert Bosch Foundation Transatlantic Initiative (2018).

[45] Noel Foster in Felix Mölder et al., "Propaganda gone viral: A theory of Chinese and Russian 'COVID diplomacy' in the age of social media," The Russian Federation in Global Knowledge Warfare (Springer International Publishing, 2021): 128.

[46] Noel Foster in Felix Mölder et al., "Propaganda gone viral: A theory of Chinese and Russian 'COVID diplomacy' in the age of social media," The Russian Federation in Global Knowledge Warfare (Springer International Publishing, 2021): 128-129.

[47] Noel Miller, “Adaptive Russian information warfare in Ukraine,” Russian Analytical Digest, no. 282 (2022): 2-5.

[48] Ibid.

[49] Catherine Owen, Tena Prelec, and Tom Mayne, "The illicit financialisation of Russian foreign policy," Serious Organized Crime & Anti-Corruption Evidence Research Programme (2022).

[50] Mykola Polovyi, “Exploitation of the right to freedom of expression for promoting pro-Russian propaganda in hybrid war,” Politeja 18, no.2 (2021): 171-182.

[51] Charlie Smart, “How the Russian media spread false claims about Ukrainian Nazis,” The New York Times, July 2, 2022.

[52] Rachel Treisman, “Putin’s claim of fighting against Ukraine ‘neo-Nazis’ distorts history, scholars say,” NPR, March 1, 2022.

[53] Sarah Oates and Sean Steiner, “Projecting Power: Understanding Russian Strategic Narrative,” Russian Analytical Digest, no. 229 (2018): 2-5.

[54] Alexandra Cirone and William Hobbs, "Asymmetric flooding as a tool for foreign influence on social media," Political Science Research and Methods (2022): 1-12.

[55] Jessica Brandt, "How autocrats manipulate online information: Putin's and Xi's playbooks," The Washington Quarterly 44, no. 3 (2021): 127-154.

[56] Catherine Owen, Tena Prelec, and Tom Mayne, "The illicit financialisation of Russian foreign policy," Serious Organized Crime & Anti-Corruption Evidence Research Programme (2022).

[57] Petr Gulenko, "Political discussion as a propaganda spectacle: Propaganda talk shows on contemporary Russian television," Media, Culture & Society 43, no. 5 (2020): 906–924.

[58] Jon White, "Dismiss, distort, distract, and dismay: Continuity and change in Russian disinformation," Institute for European Studies 13 (2016); Christian Kamphuis, "Reflexive control," Militaire Spectator (2018).

[59] Claire Wardle and Hossein Derakhshan, "Information disorder: Toward an interdisciplinary framework for research and policy making," Council of Europe Report (2017); Craig Silverman, "Lies, damn lies and viral content," Tow Center for Digital Journalism (2015).

[60] Lion Gu, Vladimir Kropotov, and Fyodor Yarochkin, "The fake news machine: How propagandists abuse the internet and manipulate the public," Trend Micro (2017).

[61] Sergey Sanovich, "Computational propaganda in Russia: The origins of digital misinformation," in Samuel Woolley and Philip N. Howard, eds., Computational Propaganda (2017).

[62] Craig Silverman, "Lies, damn lies and viral content," Tow Center for Digital Journalism (2015).

[63] Martin Potthast et al., "A stylometric inquiry into hyperpartisan and fake news," Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics 1 (2017): 231–240; Hannah Rashkin, "Truth of varying shades: Analyzing language in fake news and political fact-checking," Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (2017): 2931-2937.

[64] Benjamin D. Horne and Sibel Adali, "This just in: Fake news packs a lot in title, uses simpler, repetitive content in text body, more similar to satire than real news," Eleventh International AAAI Conference on Web and Social Media (2017).

[65] Paul Grice, Studies in the way of words (Cambridge: Harvard University Press, 1989), 27.

[66] Ibid.

[67] Daniel Dor, “On newspaper headlines as relevance optimizers,” Journal of Pragmatics 35, no. 5 (2003): 695-721; Murad Al Kayed and Amal Kitishat, “The violation of Grice’s maxims in Jordanian newspapers’ cartoons: A pragmatic study,” International Journal of Linguistics and Literature 4, no. 4 (2015): 41-50; Ullrich Ecker et al., “Effects of subtle misinformation in news headlines,” Journal of Experimental Psychology: Applied 20, no. 4 (2015): 323–335.

[68] Reza Kheirabadi and Ferdows Aghagolzadeh, "Grice's cooperative maxims as linguistic criteria for news selectivity," Theory and Practice in Language Studies 2, no. 3 (2012): 547-553.

[69] Thomas E. Nissen, “Social media’s role in ‘hybrid strategies’,” NATO Strategic Communications Centre of Excellence (2016): 7-8.

[70] Deborah Tannen, “Discourse analysis—What speakers do in conversation,” Linguistic Society of America, October 17, 2022.

[71] Henry Widdowson, Text, context, pretext: Critical issues in discourse analysis (Hoboken: Blackwell Publishing, 2004), 17.

[72] Roger Fowler et al., Language and control (Abingdon: Routledge, 2018).

[73] Teun van Dijk, News as discourse (Hillsdale: Lawrence Erlbaum Associates, Inc., 1988), 362.

[74] Renugah Ramanathan and B. H. Tan, "Application of critical discourse analysis in media discourse studies," 3L: Southeast Asian Journal of English Language Studies 21, no. 2 (2015): 57-68.

[75] Teun van Dijk, News as Discourse (Hillsdale: Lawrence Erlbaum Associates, Inc., 1988), 362; Teun van Dijk, "New(s) Racism: A discourse analytical approach," in Ethnic Minorities and the Media, ed. Simon Cottle (Buckingham, UK & Philadelphia, USA: Open University Press, 2000), 33-49; Norman Fairclough, Language and Power (Harlow: Longman, 1989).

[76] Johan Farkas, Jannick Schou, and Christina Neumayer, "Platformed antagonism: Racist discourses on fake Muslim Facebook pages," Critical Discourse Studies 15, no. 5 (2018): 463-480.

[77] Teun van Dijk, Ideology: A Multidisciplinary Approach (Washington: Sage Publications, 1998).

[78] Norman Fairclough, Discourse and social change (Cambridge: Polity Press, 1992).

[79] “About us,” StopFake, https://www.stopfake.org/en/about-us/.

[80] “About,” EUvsDisinfo, https://euvsdisinfo.eu/about/.

[81] Ibid.

[82] Ibid.

[83] Ibid.

[84] Peter Pomerantsev, Nothing is true and everything is possible: The surreal heart of the new Russia (New York: Public Affairs, 2015); Peter Pomerantsev, This is not propaganda: Adventures in the war against reality (New York: Public Affairs, 2019).

[85] Christopher Paul and Miriam Matthews, "The Russian 'firehose of falsehood' propaganda model," RAND Corporation (2016).

[86] Andrew Moshirnia, “Russia’s Diabolical New Approach to Spreading Misinformation,” Slate, March 31, 2022.

[87] Karoun Demirjian, “Russians have many theories about the MH17 crash. One involves fake dead people,” The Washington Post, July 22, 2022.

[88] Ibid.

[89] Tom Porter, “Russia claims it did not massacre civilians in Ukraine, citing conspiracy theories that evidence was manipulated or filmed with crisis actors,” Business Insider, April 4, 2022.

[90] Reuters Fact Check, “Fact Check-Video shows climate demonstration, not staged body bags in Ukraine war,” Reuters, 2022.

[91] Stanley Cunningham, The idea of propaganda: A reconstruction (Westport: Praeger, 2002).

[92] Ibid.

[93] Jon White, "Dismiss, distort, distract, and dismay: Continuity and change in Russian disinformation," Institute for European Studies 13 (2016); Christian Kamphuis, "Reflexive control," Militaire Spectator (2018).

[94] Jamie Dettmer, "Russia's disinformation playbook ripped apart," VOA News, March 15, 2022, https://www.voanews.com/a/russia-disinformation-playbook-ripped-apart/6486203.html; Ben Popken, "Factory of lies: Russia's disinformation playbook exposed," NBC News, November 5, 2018, https://www.nbcnews.com/business/consumer/factory-lies-russia-s-disinformation-playbook-exposed-n910316.

[95] Eduard Melnikau, “Towards transformation of communication strategy in dealing with Russian propaganda,” OPUS International Journal of Society Researches 18, no. 39 (2021): 727-749.
