Politologija ISSN 1392-1681 eISSN 2424-6034

2024/4, vol. 116, pp. 198–237 DOI: https://doi.org/10.15388/Polit.2024.116.5

Model of Strategic Disinformation Reconstruction Based on Analysis of Intentions*

Bohdan Yuskiv
AGH University of Science and Technology in Krakow (Poland),
Faculty of Humanities; Rivne State University of Humanities (Ukraine)
Email: yuskivb@ukr.net

Nataliia Karpchuk
Lesya Ukrainka Volyn National University,
Faculty of International Relations (Ukraine)
Email: Natalia.karpchuk@vnu.edu.ua

Serhii Fedoniuk
Lesya Ukrainka Volyn National University,
Faculty of International Relations (Ukraine)
Email: Sergii.fedoniuk@vnu.edu.ua

Abstract. Disinformation is a powerful means of manipulation in times of conflict and war, and fake news is a disinformation tool. The Russian Federation disseminates disinformation to support its military and foreign policy goals. This article offers a theoretical contribution to exploring the role of intentions in disinformation strategies. In this research, we developed the FIRS model for detecting hidden disinformation strategies in fake news and attempted to identify the Russian disinformation strategy regarding Ukraine during the first year of the RF’s full-scale invasion of Ukraine. The article addresses the question of whether it is possible to comprehend the overall strategy of disinformation influence by reconstructing the intentions of fake news. Relying on intent analysis, correlation analysis of intentions, and intention clustering, the article highlights the RF’s manipulative intentions, namely, discrediting Ukraine, discrediting the West, justifying the RF’s actions, building the RF’s positive image and criticizing opponents, demonstrating the RF’s power and intimidating opponents, and accusing Ukraine and the West of nazism. The results demonstrate the systemic nature of the RF’s disinformation strategies and their adaptation to the course of the war.
Keywords: disinformation, fake news, intention, clustering, correlation analysis, Ukraine, Russian Federation.

Reconstruction of a Strategic Disinformation Model Based on Intention Analysis in the Context of Russia’s War in Ukraine

Summary. In times of conflict and war, disinformation is a powerful means of manipulation, and fake news is a tool of disinformation. The Russian Federation (RF) uses disinformation to support its military and foreign policy goals. This article aims at a theoretical reflection on the role of intentions in disinformation strategies. In the study, we develop the FIRS model, designed to detect hidden disinformation strategies in the dissemination of fake news, and we attempt to identify Russia’s disinformation strategies towards Ukraine during the first year of the full-scale invasion of Ukraine. The article seeks to answer the question of whether the overall disinformation strategy can be grasped by identifying the intentions of fake news. Drawing on intent analysis, correlation analysis of intentions, and intention clustering, the article identifies the RF’s manipulative intentions, above all those discrediting Ukraine and the West, justifying Russia’s actions, building a positive image of Russia, criticizing opponents, and others. The analysis demonstrates the systemic nature of Russia’s disinformation strategy and its evolution during the war.
Keywords: disinformation, fake news, intention, clustering, correlation analysis, Ukraine, Russian Federation

_________

* This study was conducted in the framework of the Jean Monnet Module “EU Strategic Communications: Counteraction to Destructive Influences” (No.101047033 ERASMUS-JMO-2021-MODULE).

Received: 24/02/2024. Accepted: 01/06/2024
Copyright © 2024 Bohdan Yuskiv, Nataliia Karpchuk, Serhii Fedoniuk. Published by
Vilnius University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Introduction

State-sponsored disinformation has existed since the emergence of states. In the 21st century, the major difference is the ease, efficiency, and low cost of such efforts.1 Nowadays, people worldwide rely on the Internet and social media as their primary sources of news and information, which makes audiences an “ideal” target for disinformation attacks.

The disinformation surrounding Russia’s large-scale invasion of Ukraine in February 2022 marked an escalation in Russia’s longstanding information operations against Ukraine and open democracies. The spread of disinformation around Russia’s invasion of Ukraine reflects wider challenges related to the shift in how information is produced and distributed. Platform and algorithm designs can amplify the spread of disinformation by facilitating the creation of echo chambers and confirmation bias mechanisms that segregate the news and information people see and interact with online; information overload, confusion and cognitive biases play into these trends.2 A particular challenge is that people tend to spread falsehoods “farther, faster, deeper, and more broadly than the truth”; this is particularly the case for false political news.3

Analyzing the effects of disinformation in a military conflict, Clements4 claims that disinformation aims to distort the opponent’s perception of one’s capability, thereby creating a greater propensity for soldiers on the opposing side to surrender. However, the efficiency of such information activity greatly depends “not only on the degree of disinformation but also on whether and to what extent the sender of disinformation is superior to the receiver.”5 According to Ullman and Wade, in times of war, disinformation aims to produce shock and awe, that is, “to affect the will, perception, and understanding of the adversary to fight.”6

In warfare, state-sponsored disinformation campaigns can be exceedingly effective in manipulating public opinion, sowing discord, and achieving strategic objectives. The heightened emotional atmosphere and the urgency surrounding wartime events can make individuals more susceptible to false or misleading information. Moreover, the amplification mechanisms available in today’s digital age can rapidly disseminate state-sponsored disinformation to a vast audience. These campaigns often exploit preexisting divisions, cultural biases, and societal fault lines, exacerbating tensions and driving a deeper wedge between different groups. As a result, state-sponsored disinformation has become a formidable weapon, capable of influencing not just public perception but also geopolitical outcomes during times of conflict.

Disinformation is always intentional, unlike, for instance, misinformation,7 and its intention is malicious. Disinformation takes different forms, e.g., fabricated content, misleading content, accurate content shared with false contextual information, satire and parody.8 In the journalistic format, disinformation is presented as fake news.

The study of disinformation influences has become a subject of meticulous scientific analysis in social sciences from various theoretical perspectives. To date, numerous models have been developed to analyze disinformation from different scientific approaches, namely, models for text sentiment analysis, machine learning models for detecting fake news, discourse analysis and content analysis in media, sociological models of media influence, and psychological persuasion models.

Sentiment analysis models are natural language processing techniques that enable the detection of emotional undertones and hidden manipulations in propaganda texts, for instance, determining whether a text is positive, negative, or neutral, as well as identifying specific emotions such as joy, anger, sorrow, and so on.9 This methodology also involves revealing the degree of emotional expression or sentiments within the text.

Machine learning models for detecting fake news can track typical patterns, especially on social media platforms and websites.10 However, they may be limited in detecting fake news that does not conform to typical patterns or characteristics.

Discourse analysis and media content analysis are approaches used to analyze propaganda and disinformation, allowing for the identification of hidden meanings and propagandistic techniques in the media.11 The main characteristics of content analysis include examining the substance and form of messages, which makes it possible to discover the themes emphasized and the means by which they are amplified.12 Discourse analysis, on the other hand, investigates the linguistic devices employed to convey specific ideas and concepts.13

Sociological models of media influence represent an approach to understanding the mechanisms of manipulating public consciousness and to studying the impact of mass media on society and specific social groups.14 These models help identify techniques utilized to manipulate public opinion.

Psychological persuasion models can be valuable in understanding the impact of propaganda and disinformation on public opinion and identifying techniques for manipulating public perception.15 Key characteristics of these models include the audience’s motivation to engage with media,16 methods of influence in mass communication processes,17 and crucial categories of arguments used for persuasion.18

As we can see, none of the mentioned models, individually or collectively, fully addresses the general task of identifying disinformation strategies. However, such research remains pertinent for various audience categories. For ordinary citizens, these studies aid in better understanding the hidden mechanisms of manipulative media influence and contribute to the development of media literacy. For communication professionals, they offer opportunities to refine approaches to analyzing and countering disinformation. For government representatives, they provide insight for formulating policies on media and information security. From a military perspective, they assist in detecting information operations and an adversary’s manipulations aimed at demoralizing both military personnel and civilians. Additionally, they facilitate media discourse analysis to discern the intentions and plans of the rival side. From a scientific standpoint, these investigations delve deeper into the mechanisms of disinformation dissemination and develop new methodologies for analyzing media content. Therefore, they hold significant theoretical and practical value.

Our research represents an attempt to address this issue, at least partially, i.e., to develop a generalized model for detecting (reconstructing) the strategy of disinformation influence based on the analysis of intentions present in fake news disseminated by the Russian Federation amidst a full-scale war against Ukraine.

We believe that by reconstructing the intentions of fake news, one can comprehend the overall strategy of disinformation influence. In turn, this may serve as a further tool in predicting the enemy’s behavior both in the information sphere and on the battlefield.

The main goal of the research is to develop a methodology for reconstructing disinformation influence strategies through the analysis of fake news utilized to execute these influences. The foundation for our approach lies in the idea of synthesizing the intentions identified in fake news.

To achieve this goal, the following research tasks have been formulated:

RT1: to develop the methodological foundations of a model for reconstructing disinformation strategies;

RT2: to create a system of intentions serving as the basis for the intent analysis of fake news in texts reflecting conditions of conflict;

RT3: to develop a methodology and algorithm for detecting disinformation strategies;

RT4: to demonstrate the process of identifying disinformation influence strategies using fake news disseminated by the Russian Federation.

1. A Model to Detect the Hidden Strategy of Disinformation

Before 2017, disinformation seldom dominated analytical discussions. When the term did appear, it played a secondary role within the context of other research subjects. However, the events surrounding the 2016 US election and their subsequent repercussions sparked a surge in interest across various fields, encompassing communication, political science, and information science.19

The word “disinformation” made its way into English from the Russian dezinformatsiya, and although other governments and their agencies certainly deploy the technique,20 it is most strongly associated with the KGB. Romerstein cites a 1972 top-secret KGB dictionary where “disinformation data” are treated as “specially prepared data, used for the creation, in the mind of the enemy, of incorrect or imaginary pictures of reality, on the basis of which the enemy would make decisions beneficial to the Soviet Union.”21

UNESCO has also contributed to defining “disinformation” as deliberate (often orchestrated) attempts to confuse or manipulate people by delivering dishonest information to them with malicious intent.22 In 2018, the High-Level Expert Group on Fake News and Online Disinformation of the European Commission offered the following insight: “Disinformation […] includes all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit.”23 This definition unites three critical criteria: 1) deception, 2) potential for harm, and 3) an intent to harm. It thus excludes deceptive messages that may cause harm without the disseminators’ knowledge (misinformation)24 and nondeceptive messages intended to harm others (e.g., slurs based on racial, sexual, or gender identity). Disinformation messages under this definition are munitions in campaigns of information warfare, nonlethal weapons intended to subdue adversaries rather than reason with them.25

When disinformation wins hearts and minds, “genuine sources of truth are disregarded or met with scepticism, as trust erodes amid the rampant spread of deceptive and prejudiced information.”26 A continuous challenge in addressing disinformation lies in the intricate and uncertain process of distinguishing between false information and facts. RAND has characterized the rise of disinformation through trends of disagreement about data and fact, the blurring between opinion and fact, the increased volume of opinions, and the declining trust in institutional sources.27 Traditional news outlets have been supplanted by a social media landscape that is inherently more susceptible to disinformation, primarily due to the social nature of networks and the absence of robust editorial control. Consequently, a connection has emerged between individuals’ exposure to social media and their susceptibility to falsehoods.28 Bennett and Livingston claim that “[w]hile the origins of much, and perhaps most, disinformation are obscure, it often passes through the gates of the legacy media, resulting in an ‘amplifier effect’ for stories that would be dismissed as absurd in earlier eras of more effective press gatekeeping.”29

When a person or group distributes false or misleading information while concealing the true objective behind the campaign, a disinformation campaign occurs.30 Governments have used disinformation as a campaign tool to control online deliberations and influence public opinion.31 Generally, disinformation campaigns have been part of the tools used by governments to “stifle digital rights, distort the truth, advance propaganda, sway public opinion, manipulate the online sphere and consequently undermine the respect for human rights and democracy.”32 The key catalyst for disinformation has been the shift in the audience’s role, evolving from passive news receivers to active participants.33 This transformation in roles, coupled with the decentralization and distribution of networks, empowers users not only to create disinformation but also to amplify it across these networks through sharing, liking, or commenting.34 All of this “potential” of disinformation turns it into a powerful weapon in times of both hybrid and conventional warfare.

In our research, we follow two definitions of disinformation that, in our opinion, are useful for developing our model. The first belongs to Bennett and Livingston, who defined disinformation as “intentional falsehoods spread as news stories or simulated documentary formats to advance political goals […] systematic disruptions of authoritative information flows due to strategic deceptions.”35 The second, offered by Klinger (2022), holds that “disinformation is strategic communication – it is not just any piece of presumably ‘fake news’ that is circulating. It is the purposeful, deliberate, strategic, dissemination of falsehood or false information or misleading information.”36

Hence, three important characteristics of disinformation as a process can be highlighted:

– disinformation is based on the dissemination of false (misleading) information in the form of news or imitations of documentary formats,

– it is strategic communication that always involves the promotion of specific goals,

– it has a covert nature, concerning not only the false information itself but also the concealment of the true purposes of disinformation.

These very characteristics constitute the basis of our model for detecting disguised strategies of disinformation influence. Specifically, we assume that disinformation subjects use various tactics to achieve their goals and to influence others, inciting them to act and causing harm.

The primary steps to construct the model are the following:

1) we compile a set of fake news created and disseminated by the subject of influence,

2) each piece of fake news is examined for the intentions embedded in its text by the subject,

3) the identification of the disinformation strategy is accomplished by generalizing the intentions across the entire set of fake news.
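As a minimal illustration of the data format these steps produce, consider the following R sketch of a document–intention matrix: rows are fake news items, columns are intentions, and a 1 marks that an intention was identified in an item. The object and column names are hypothetical toy values, not the study’s actual corpus.

# A toy document-intention matrix: rows = fake news items, columns = intentions;
# 1 = the intention was identified in the item, 0 = absent.
doc_intent <- matrix(
  c(1, 1, 0,
    0, 1, 1,
    1, 0, 1),
  nrow = 3, byrow = TRUE,
  dimnames = list(
    paste0("fake_", 1:3),
    c("We_justification", "They_accusation", "Third_humiliation_of_Ukraine")
  )
)
doc_intent  # the representation used by the correlation and clustering stages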

When developing the model, we were inspired by the fundamental and widely known DIKW hierarchy/pyramid (“data – information – knowledge – wisdom”) as interpreted by Ackoff (1989),37 Rowley (2007),38 and Brahmachary (2019).39 The DIKW model served as a template for our academic search.

Hence, we have constructed the FIRS model which consists of four components: fake news (F), intentions (I), relations (R) and strategy (S). They represent a four-level hierarchy where each level adds new attributes and properties compared to the previous one.

The FIRS model for detecting concealed disinformation strategies based on fake news can be viewed from two perspectives: contextual and insight. According to the contextual perspective, we move from the level of collecting fake news (F) to the level of connecting raw parts (news) by identifying alleged intentions for each piece of news (I), then to the level of forming a holistic understanding of the content/connections between the intentions of the entire collection of news (R) and up to the level of conceptualization and unification of the content of related intentions as a form of concealed disinformation strategies (S).

From the insight perspective, the FIRS pyramid can also be seen as a process that begins with collecting and verifying fake news, determining the intention of each piece of fake news, generalizing the relations between the intentions, and making sense of them to identify a holistic disinformation strategy.

The FIRS hierarchy can also be represented in terms of time. For example, the levels of fake news, intentions and relations are seen as a look into the past (what is formed based on the past), while the last step – strategy – represents the future (how it can be used in the future).

For a better understanding, the described ideas are presented in Fig. 1.

Fig. 1. The FIRS pyramid

The first stage of the FIRS model is the collection of fake news. Without a sufficient number of such messages, it is impossible to achieve a meaningful final result. This data is collected en masse, although it does not necessarily include only content useful for the research. At the same time, quantitative analysis of fake news dissemination can be useful for understanding the implementation of disinformation strategies.

The second stage is to determine the intentions of the fake news. Each piece of fake news can contain one or more intentions. When determining intentions, we try to answer the following questions: why was the fake news created? What elements of disinformation did the subject try to include in it? In this sense, we give the fake news a meaningful character.

In the third stage, the entire set of intentions in each piece of fake news is generalized to the level of correlations between intentions. Correlations between pairs of intentions are determined by the frequency of their cooccurrence in the same documents. Correlated pairs of intentions form a network of intention correlations that characterizes the entire collection of fake news. This is useful knowledge from the perspective of disinformation strategy, since it answers the question: how are tactical techniques constructed in the course of implementing the disinformation strategy? Tactical techniques based on relations between intentions make it possible to increase the efficiency of disinformation, for example, to achieve a positive effect for the subject on the basis of negative intentions regarding the object.

The final stage in the FIRS hierarchy involves determining the underlying disinformation strategy. The final result is obtained by clustering the intentions in the same set of fake news. Each cluster represents hierarchically subordinated intentions that strengthen or weaken each other’s influence and determine a certain strategy of influence. Each cluster is a “brick” with which a disinformation campaign is built. The set of such clusters forms the set of possible strategic influences within the framework of a disinformation campaign. This generalized set of strategic influences can be considered a general scheme that allows decisions to be made based on the experience of the disinformation campaign under study. Such a scheme can also serve as a basis for improving the campaign.

2. Intentions and Reference Objects

“Intentionality is the power of minds and mental states to be about, to represent, or to stand for, things, properties and states of affairs.”40 In other words, intention is the directedness of consciousness toward an object in communication. In this process, the subject expresses his/her attitude towards the object. Communication intention as a mental representation has two aspects: marking the object and expressing an attitude toward it. The individuality of intentions is manifested in the choice of objects and the expression of attitude. This can be embodied in a single word, a phrase, or a text fragment. The main goal of the methodology is to reconstruct communication intentions based on the text of the message.

Typically, intentions are concealed, but in texts reflecting conditions of conflict, they become explicit through the use of means of verbal influence. This strategy aims to emphasize the positive imagery of “our” side and to construct an enemy image of “those” perceived as opponents. In the context of conflict discourse, the ‘“We” – “They” (us – them) – “Third party”’ scheme is applied. The “We” group usually receives positive characteristics, while “They” are perceived negatively as opponents or enemies. The “Third party” represents the public and officials, who may receive different evaluations depending on the specific situation.

In our case, we are dealing with texts that express specific extreme conditions – a military conflict and the explicit confrontations associated with it. The influencing subject is the Russian Federation, which utilizes fake news for deliberate, substantiated, and organized impact targeting diverse audiences to propagate Russian ideas. The triangle of influence encompasses the population of the Russian Federation and their adherents in occupied territories, along with pro-Russian forces in other countries, constituting the first object – “We”; the population, government, and Armed Forces of Ukraine represent the second object – “They”; the Western world stands as the third object – “Third party.” For each object, corresponding intentional profiles have been defined (Table 1). Direct characteristics and interpretations of intentions are provided in Annex A. When calculating the number of intentions in fake news, it was taken into account that each piece of news might contain several intentions; only one dominant intention was indicated per object. The share of intentions in the collection of the RF’s fake news and their dynamics are presented in Annex B.

Table 1. Intentions by groups of reference objects

Object 1: “We” (Russia) – intentions (mode*):
– successes/achievements of the SMO (special military operation) (+)
– self-presentation (+)
– deflection of accusation/criticism (+)
– justification (+)
– caution (+)
– self-criticism (–)

Object 2: “They” (Ukraine) – intentions (mode*):
– accusation (–)
– revelation (–)
– discredit (–)
– criticism/mockery (–)
– nationalism/nazism (–)
– Russophobia (–)
– dependence on the West (–)
– division of society (–)
– intimidation of Ukraine (–)

Object 3: “Third party” (West – other countries) – intentions (mode*):
– accusation (–)
– discredit (–)
– criticism/mockery (–)
– nationalism/nazism (–)
– Russophobia (–)
– support of the RF (+)
– refusal to support Ukraine (0)
– humiliation of Ukraine (–)
– encouragement to support the RF (0)
– intimidation of the West (–)

* Note: Mode of intention: “–” – negative, “0” – neutral, “+” – positive

3. Data, Methods and Research Procedure

Thus, according to the proposed research model, the investigation involved the following stages: 1) collecting the set of fake news, 2) defining and analyzing the intentions in the fake news, 3) studying the correlations between intentions, 4) identifying clusters of intentions and reconstructing the overall strategy of the RF’s disinformation campaign.

We test the FIRS model by identifying the RF’s disinformation strategy in fake news during the first year of the second stage41 of the Russian–Ukrainian war (February 17, 2022, to February 28, 2023).

The empirical data comprises a sample of fake news collected by the Department of the Information Space Security of the Ivan Cherniakhovskyi National Defense University of Ukraine. This data was sourced from two information and analytical collections42 based on official government information channels. These channels include the official website of the Office of Strategic Communications (https://www.facebook.com/AFUStratCom), the official site of the Countering Disinformation Centre (https://www.facebook.com/protydiyadezinformatsiyi.cpd), the fact-checking site Stopfake.org (https://www.stopfake.org) and the portal of the public organization Detektor Media (https://detector.media). Each piece of fake news in these collections has undergone verification and includes a detailed description, primary message, and refutation. We used these data to practically illustrate the application of our methodology. We are also aware that the data were collected from Ukrainian sources, which are one of the belligerent parties. This may create some bias in the data presented. To mitigate this impact, cross-validation was applied. Additionally, the aforementioned fact-checking resources have a clear methodology that is explained and publicly available on their websites.

The dataset chosen for analysis comprises 1275 information units (textual content exclusively) encompassing fake news attributed to the Russian Federation. This selection spans from February 17, 2022, to February 28, 2023.

To analyze intentions, we applied the political discourse intent analysis technique.43 This technique was also used by B. Yuskiv and colleagues regarding conflict discourse.44

The intent analysis procedure involves two stages: 1) identification and expert assessment of intentions within the text, and 2) their ranking and the description of individual characteristics of the subject based on these intentions. The reliability of assessing intentions in fake news was ensured by the collaboration of three experts.

The study of correlations between intentions is aimed at establishing relations between different intentions and revealing possible network connections between them. Correlation is a measure of how often intentions are used together, compared to how often they are used individually in documents in the entire collection. This allows for the identification of deeper motivations by uncovering common and opposing intentions that drive the spread of disinformation. Such information becomes the basis for developing strategies to detect and counter disinformation campaigns.

Intentions in documents can be described through binary variables: 1 – the intention is present in the document, 0 – it is absent. Therefore, if two intentions appear together in one document, both corresponding binary variables take the value 1. To measure such an association between two binary variables (here, intentions), the φ (phi) coefficient is usually used.45 This measure of association is equivalent to the Pearson correlation coefficient applied to binary variables.46 The greater the absolute value of φ, the stronger the relation between the variables.
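For reference, φ for a pair of intentions can be written out from the 2×2 table of their binary indicators across all documents (a standard formula consistent with the description above, not quoted from the article itself):

$$\varphi = \frac{n_{11}\,n_{00} - n_{10}\,n_{01}}{\sqrt{(n_{11}+n_{10})\,(n_{01}+n_{00})\,(n_{11}+n_{01})\,(n_{10}+n_{00})}}$$

where $n_{11}$ is the number of documents containing both intentions, $n_{00}$ the number containing neither, and $n_{10}$, $n_{01}$ the numbers containing only the first or only the second intention.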

To identify the network of interdependence between intentions, we visualized the positive association scores as a network graph.

The clustering of intentions is a continuation of the correlation analysis. Clustering makes it possible to group similar intentions and helps highlight common and typical scenarios of disinformation. This simplifies and deepens the analysis, as it helps to better understand the variety of disinformation and to analyze typical scenarios. In general, the information obtained contributes to improving filtering algorithms and content analysis for more effective counteraction to fake news.

When performing the clustering of intentions, we took into account the peculiarities of the source data. The number of intentions subject to clustering was small, only 24, whereas the number of fake news items was 1275, i.e., over 50 times larger. In addition, the data matrix had a binary format, where the presence of an intention in the news content was marked as “1” and its absence as “0”. The clustering results are groups of intentions that are similar to each other in a certain sense (they occur in texts with some consistency).

For the considered situation, hierarchical clustering appeared to be the most suitable method. In the R environment, several packages implement hierarchical clustering methods; however, we utilized the pvclust package. During the clustering procedure, hierarchical clustering with binary metrics and the ward.D2 algorithm (Ward’s method) was employed. Besides the clustering itself, the package provides the ability to evaluate the degree of uncertainty in the results of hierarchical clustering. For each identified cluster, two types of p-values are automatically computed: AU (approximately unbiased) and BP (bootstrap probability). The AU-p value serves as a more accurate approximation of an unbiased p-value compared to the BP-p value, which is calculated using a standard bootstrap sample. The use of the pvrect function enabled us to determine the number of clusters and identify clusters that meet the reliability requirements for clustering (with specified p-values).
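A minimal sketch of this step in R, assuming a binary document–intention matrix doc_intent as illustrated earlier (the object name, seed, and nboot value are illustrative assumptions, not the study’s actual script):

library(pvclust)  # hierarchical clustering with AU/BP p-values

set.seed(42)                              # bootstrap resampling is random
fit <- pvclust(doc_intent,                # the columns (intentions) are clustered
               method.hclust = "ward.D2", # Ward's method, as in the study
               method.dist   = "binary",  # binary distance for 0/1 data
               nboot         = 1000)      # bootstrap replications for AU/BP

plot(fit)                                 # dendrogram with AU/BP p-values (cf. Fig. 3)
pvrect(fit, alpha = 0.84, pv = "au")      # frame clusters with AU >= 84%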

All the described research stages are implemented in the R environment using the following package functions:

– data manipulation – functions of the dplyr, tidyr, and quanteda packages;

– data visualization – ggplot function from the ggplot2 package;

– data clustering and dendrogram construction with uncertainty assessment – the cluster and pvclust packages functions;

– calculation of the correlation dependence between intentions – the pairwise_cor() function from the widyr package.
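As a minimal sketch of the correlation step (the tidy table intent_long and its columns doc_id and intention are hypothetical names; with presence/absence data, pairwise_cor() yields exactly the φ coefficient):

library(dplyr)
library(widyr)

# intent_long: one row per (news item, intention) occurrence, with columns
# doc_id and intention, built from the coded collection of fake news.
intent_cors <- intent_long %>%
  pairwise_cor(item = intention, feature = doc_id, sort = TRUE)

# keep the positive associations used for the network graph (cf. Fig. 2)
strong_pairs <- intent_cors %>%
  filter(correlation >= 0.10)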

4. Findings

Identification and analysis of the RF’s fake news intentions

The generalized results of the analysis of intentions in fake news are given in Annex B.

Regarding the objects of intentional influence, the largest share of intentions throughout the studied period is directed against Ukraine (42.9%), followed by Russia (31.4%) and the West (25.6%). When examining the periods of the war, the share of intentions towards Ukraine and the West gradually increased to 55.4% and 36.2% respectively, while the share directed at Russia decreased to an extremely low value of 8.5%.

Regarding the objects of influence, the analysis shows the following distribution.

1) “They” (Ukraine): 14.5% – discrediting Ukraine (spreading false information to undermine trust and authority); 7.2% – intimidating Ukraine and spreading panic; 6.1% – accusing Ukraine of invasion, provocations, and crimes; 6% – revelation of allegedly hidden (secret) intentions or actions of Ukraine; 3.6% – Ukraine’s dependence on the West; 3.6% – humiliating criticism and mockery of Ukraine. This distribution of intentions reflects a strategy of deliberately undermining trust in (delegitimizing) Ukraine in the eyes of its citizens and the world.

2) “Third party” (West and other countries): 7.9% – fake news dedicated to the humiliation of Ukraine by Western countries, intended to discredit Western support for Ukraine; 4.6% – calls for the West to refuse support for Ukraine, intended to cast doubt on the trustworthiness of its allies; 3.6% – accusing Western countries of aggression against Russia; 3.2% – intimidation of the West and discrediting of Western countries; 2.2% – criticism and mockery of the West. A small share of other intentions completes the overall picture. The general strategy is thus to discredit Western support for Ukraine, to cast doubt on trust in the allies, to portray the West as an aggressor, and to undermine the unity of the West and its resistance to Russia.

3) “We” (Russia): 9.9% – denial/refutation of accusations and criticism against Russia; 9.6% – justification of Russia’s actions; 7.1% – demonstration of the successes and achievements of the Russian army; 5.3% – caution and restraint in statements; and 2.7% – self-presentation, creating a positive image. The general strategy is to justify and legitimize the actions of the Russian Federation in the eyes of its citizens.

Correlations between the RF’s fake news intentions

In analyzing intentions, their cooccurrence in the same documents carries a deeper significance than a simple statistical characteristic. From the influencer’s perspective (Russia), intentions have a greater impact when they appear more frequently in fake news. Simultaneously, if an intention has a high joint frequency (or shows a positive correlation) with another intention, it can serve as an additional argument in support of that other intention.

Table 2 presents the pairs of intentions that are most often found in the fake news of the Russian Federation.

Table 2. Frequency of cooccurrence of intentions (n ≥ 25)

Intention 1 | Intention 2 | n
We (Russia): successes/achievements of the SMO | They (Ukraine): discredit | 73
We (Russia): deflection of accusation/criticism | They (Ukraine): accusation | 70
We (Russia): deflection of accusation/criticism | Third party: humiliation of Ukraine | 59
They (Ukraine): discredit | Third party: humiliation of Ukraine | 50
We (Russia): deflection of accusation/criticism | They (Ukraine): revelation | 49
We (Russia): justification | They (Ukraine): revelation | 47
We (Russia): deflection of accusation/criticism | They (Ukraine): discredit | 43
We (Russia): justification | They (Ukraine): accusation | 43
They (Ukraine): accusation | Third party: humiliation of Ukraine | 42
We (Russia): justification | Third party: humiliation of Ukraine | 42
We (Russia): justification | Third party: accusation | 39
We (Russia): caution | Third party: humiliation of Ukraine | 37
We (Russia): successes/achievements of the SMO | They (Ukraine): intimidation of Ukraine | 35
We (Russia): caution | They (Ukraine): discredit | 33
We (Russia): justification | They (Ukraine): discredit | 31
We (Russia): justification | They (Ukraine): intimidation of Ukraine | 29
They (Ukraine): revelation | Third party: humiliation of Ukraine | 29
They (Ukraine): criticism/mockery | Third party: humiliation of Ukraine | 28
They (Ukraine): dependence on the West | Third party: accusation | 28
They (Ukraine): revelation | Third party: accusation | 27
We (Russia): justification | Third party: refusal to support Ukraine | 27
We (Russia): successes/achievements of the SMO | Third party: intimidation of the West | 25
They (Ukraine): intimidation of Ukraine | Third party: refusal to support Ukraine | 25

The table shows that pairs involving intentions in support of the Russian Federation prevail (16 out of 23). The RF combines positive intentions towards itself with exclusively negative intentions towards Ukraine and the West. That is, the RF (“We”) tries to justify itself, deflect criticism and accusations, and bolster its image against the background of discrediting, exposing, humiliating, and intimidating Ukraine (“They”) and the West (“Third party”).

Regarding Ukraine, the top frequencies contain various negative intentions, such as discrediting, accusations, criticism, ridicule, and intimidation. The Russian Federation tries to reinforce these intentions by pairing them with accusations against the West and the West’s humiliation of Ukraine.

Regarding specific pairs of intentions, the most common are: “successes of a special military operation of the Russian Federation” and “discrediting Ukraine” (73 times) and “deflecting accusations/criticism” from the Russian Federation with simultaneous “accusation of Ukraine” (70 times).

The picture looks somewhat different when the cooccurrence of intentions is assessed through the association coefficient φ. The top 20 pairs in terms of correlation are given in Table 3. Here, intentions relating to all three objects – Ukraine, the West, and the Russian Federation – are represented more or less evenly.

Pairs of intentions with the highest correlation coefficient φ include: “accusation of nazism/nationalism” of Ukraine and the West (φ = 0.31), criticism of Ukraine regarding “dependence on the West” with simultaneous “accusation” of the West (φ = 0.26), and “deflection of accusations/criticism” from the Russian Federation with mutual “accusation” of Ukraine (φ = 0.25).

Table 3. Correlation of intentions (φ ≥ 0.10)

Intention 1 | Intention 2 | φ
Third party: nationalism/nazism | They (Ukraine): nationalism/nazism | 0.31
They (Ukraine): dependence on the West | Third party: accusation | 0.26
They (Ukraine): accusation | We (Russia): deflection of accusation/criticism | 0.25
We (Russia): justification | Third party: accusation | 0.18
Third party: accusation | They (Ukraine): revelation | 0.16
We (Russia): caution | They (Ukraine): criticism/mockery | 0.15
We (Russia): self-presentation | Third party: encouragement to support the RF | 0.15
They (Ukraine): dependence on the West | Third party: discredit | 0.13
They (Ukraine): dependence on the West | Third party: refusal to support Ukraine | 0.13
They (Ukraine): revelation | We (Russia): deflection of accusation/criticism | 0.13
Third party: humiliation of Ukraine | They (Ukraine): accusation | 0.13
Third party: intimidation of the West | We (Russia): successes/achievements of the SMO | 0.13
They (Ukraine): criticism/mockery | Third party: humiliation of Ukraine | 0.12
They (Ukraine): division of society | Third party: encouragement to support the RF | 0.12
They (Ukraine): discredit | We (Russia): successes/achievements of the SMO | 0.12
Third party: humiliation of Ukraine | We (Russia): deflection of accusation/criticism | 0.12
We (Russia): caution | Third party: humiliation of Ukraine | 0.12
We (Russia): justification | They (Ukraine): revelation | 0.12
Third party: support of the RF | We (Russia): self-presentation | 0.10
We (Russia): self-presentation | They (Ukraine): division of society | 0.10

A network of paired intentions is presented in Fig. 2.

Fig. 2. The network of intentions based on the association coefficient φ

The analysis of the network of paired intentions shows that it is divided into four separate groups, which represent variations of the strategic components of the fake news disinformation effect.

The first subnetwork consists of the simultaneous accusation of Ukraine and the West of nazism and exhibits the highest level of correlation between intentions.

The second component is formed by a subnetwork that includes four intentions. At its center is the intention of “self-presentation” of the Russian Federation, which correlates with the intention of “dividing Ukrainian society” and with intentions related to the West, namely “support of the Russian Federation” and “encouragement to support the Russian Federation.” The correlations between these intentions are weak.

The third subnetwork has at its center the “successes and achievements in the SMO” of the Russian Federation, reinforced by “discrediting” Ukraine and “intimidating” the West. Here, too, the correlations between intentions are weak.

The fourth subnetwork is the largest and represents a chain of interconnected key intentions that correlate with several additional intentions. The first key element is the intention of “humiliation of Ukraine” by the West. It correlates with other intentions, such as the “caution” of the Russian Federation and the “criticism/mockery” and “accusation” of Ukraine. This key element also correlates with the next one – the “deflection of accusations/criticism” by the Russian Federation; the interaction between these two elements is very close. This element, in turn, correlates with the “revelation” of Ukraine, which serves to “justify” the Russian Federation. The third key element also correlates with the fourth key intention – the “accusation” of the West – which, on the one hand, serves to “justify” the Russian Federation and, on the other hand, correlates very closely with the last key element – accusing Ukraine of “dependence on the West.” For the Russian Federation, this intention provides another reason to “discredit” the West and to call on it to “refuse to support Ukraine.”

Clustering of intentions

Hierarchical clustering of intentions was carried out, as indicated in the methodology, using the functions of the pvclust package. The clustering results are presented in the dendrogram (Fig. 3), where p-values (%) are indicated at the edges of the clustering: AU p-values are shown in red, BP values in green. Clusters with AU larger than 84% are highlighted by rectangles. This significance level made it possible to cover all intentions with clusters and gives quite decent p-values for the clusters – two clusters have AU of 98% and 95%, respectively, and the other three 84%, 85%, and 86%.

Fig. 3. Cluster dendrogram with p-values

Our interpretation of the content of each cluster is the following.

The first cluster can be called “Discredit of Ukraine.” It includes the following intentions:

– “We” (Russia): caution,

– “We” (Russia): deflection of accusation/criticism,

– “They” (Ukraine): accusation,

– “They” (Ukraine): criticism/mockery,

– “Third party”: humiliation of Ukraine.

As we can see, this includes, on the one hand, various ways of discrediting opponents (Ukraine and the West) – criticism, accusations, ridicule, humiliation – and, on the other, the protection of Russia from criticism and accusations. With its fake news, the RF strategically uses caution and the deflection of accusation/criticism to shift blame and criticism onto Ukraine, thus contributing to the creation of an atmosphere in which the “Third party” (the West) must join in the humiliation of Ukraine.

The second cluster, “Justifying Russia and Discrediting Opponents,” resembles the previous one but has somewhat broader intentions. The main intentions of this cluster include:

– “We” (Russia): justification,

– “They” (Ukraine): dependence on the West,

– “They” (Ukraine): revelation,

– “Third party”: accusation,

– “Third party”: discredit.

Through this set of intentions, Russia justifies its actions by emphasizing Ukraine’s dependence on the West, by “revealing” Ukraine’s allegedly hidden intentions and weaknesses, and by accusing and discrediting the West. The main idea is to show that Russia has a right to its aggressive actions, while its opponents disguise their real evil goals.

The name of the third cluster is “Positive Image of Russia and Criticism of Opponents.” It includes the following intentions:

– “We” (Russia): self-presentation,

– “They” (Ukraine): division of society,

– “They” (Ukraine): Russophobia,

– “Third party”: criticism/mockery,

– “Third party”: encouragement to support the RF,

– “Third party”: Russophobia (of the West),

– “Third party”: support of the RF (by the West).

The Russian Federation engages in self-presentation, incites divisions in Ukrainian society, and exploits the theme of Russophobia. As for the West, the RF’s intentions here are twofold: on the one hand, criticism/ridicule, including over Russophobia; on the other, gratitude for support of the Russian Federation and encouragement to continue it. The main idea is to create a positive image of Russia and a negative image of its opponents, and to present Russia as a victim of Russophobia and a hostile environment.

The fourth cluster can be called “Demonstration of Power and Intimidation.” It covers the following intentions:

– “We” (Russia): successes/achievements of the SMO,

– “They” (Ukraine): discredit,

– “They” (Ukraine): intimidation of Ukraine,

– “Third party”: intimidation of the West,

– “Third party”: refusal to support Ukraine.

With the help of fake news, the Russian Federation highlights its military successes and achievements, actively discredits and intimidates Ukraine, and also tries to intimidate the West and convince it not to support Ukraine. The main idea is to demonstrate Russia’s strength (to create an image of an invincible Russia) and to intimidate opponents – to show that Russia is advancing, while Ukraine is doomed and abandoned by the West.

The fifth cluster can be called “Accusing Opponents of Nazism and Nationalism.” This cluster includes only two intentions, which are the key propaganda narratives of Russia in the war against Ukraine:

– “They” (Ukraine): nationalism/nazism,

– “Third party”: nationalism/nazism.

In its fake narratives, Russia portrays Ukraine and the West as harboring or promoting nationalism and nazism. The main idea is to portray both Ukraine and the West as followers of nazi ideology. This serves to delegitimize opponents and to represent Russia as a victim forced to fight back.

5. Discussion: The Disinformation Strategy in the RF’s Fake News Reconstructed from the Analysis of Intentions

Thus, the analysis of intentions in fake news reveals the hidden goals and methods of the Kremlin. The findings demonstrate the systemic and complex nature of Russian manipulative strategies. The distribution of intentions – 42.9% directed against Ukraine, followed by Russia (31.4%) and the West (25.6%) – confirms the strategic focus of Russian disinformation on discrediting Ukraine in the eyes of its citizens and the world. Correlation analysis of intention pairs revealed four distinct subnetworks reflecting differences in the strategic elements of fake news influence. This shows that the RF builds its tactical techniques of disinformation on the interrelations between intentions.

The five identified clusters of manipulative intentions testify to the complex and systemic nature of the Russian disinformation campaign. All clusters share a common feature: they contain manipulative intentions to discredit opponents and to justify Russia. At the same time, the differences lie in the manipulative tactics chosen for each goal: to discredit the main opponent, the intentions of criticism, accusation, and ridicule are used; to justify Russia and discredit both opponents, the intentions of revelation and dependence on the West are applied; to create a positive image of Russia and criticize opponents, the intentions of accusing them of Russophobia and of dividing Ukrainian society are utilized; to demonstrate the RF’s power and intimidate opponents, Russia showcases its military successes and Ukraine’s isolation; and to accuse the West and Ukraine, the RF primarily appeals to the alleged nazism and nationalism of its adversaries.

Hence, the clusters show various manipulative tactics used by the Kremlin in its disinformation campaign against Ukraine – from discrediting to intimidation and revelation. This gives an idea of a well-thought-out systemic strategy for influencing public opinion (Fig. 4).

Therefore, the conducted research makes it possible to reconstruct the following disinformation strategies embodied in the RF’s fake news: to deliberately undermine trust in Ukraine in the eyes of its citizens and the world; to discredit Western support for Ukraine, to portray the West as an aggressor, and to undermine the unity of the West in its opposition to Russia; and to justify and legitimize the actions of the Russian Federation, primarily in the eyes of its own citizens.

In fact, Russia has offered nothing new; everything fits the usual formula Divide et impera, widely used in Soviet times. On the other hand, such a strategy indicates that the Russian Federation is set on a long-term military and information-psychological campaign. Therefore, the most important task for Ukraine and the West is to maintain unity and support each other – not only to refute fake news but to act in advance and strengthen the information literacy of citizens, so as to decrease (ideally, prevent) the influence of disinformation on their consciousness.

Fig. 4. Components of the RF’s disinformation strategy in the war against Ukraine

Conclusion

A disinformation campaign is a powerful means to influence an audience, especially in a conflict situation like war. Such a campaign is thoroughly planned with predetermined intentions and fake news is the means of this influence. Fake news that the RF spread in the information space during the first year of the full-scale war against Ukraine served as empirical material. In this article, we assumed that by reconstructing the intentions in fake news, we can outline the strategic goals of the Russian Federation. As a methodological basis for the analysis, we developed the FIRS model, in which, when analyzing fake news, we move from the level of collecting and verifying fake news (F) to the level of determining intentions (I), then generalizing the relations between intentions based on correlation and clustering (R), and at the highest level, we define a general disinformation strategy (S).

The conducted intent analysis, correlation analysis of intentions, and intention clustering enabled us to conclude that, under the conditions of a full-scale war against Ukraine and a certain “ostracism” from the West, the general strategy of the Russian Federation involves undermining trust in Ukraine, primarily in the eyes of its allies; discrediting aid to Ukraine and destroying the unity of the Western coalition; and justifying Russia’s crimes while forming a positive perception of the RF, first of all among its own citizens (at least encouraging them to support aggression and hatred). This is a long-term goal whose achievement requires time and a constant strengthening of efforts, so we believe this is also a sign that the Russian Federation is set on a long-term conflict in Ukraine and on an attempt to “break” the West.

The findings of the study confirm the important role of fake news analysis in understanding disinformation strategies.

Regarding the prospects of further research in the area we have outlined, the analysis of a longer period (not just a year, as in our work) would make it possible to trace changes in the RF’s strategies. We analyzed disinformation from the perspective of the influencer’s intentions; alternatively, one could analyze the results of the influence of fake news on its object(s). Our selection of fake news is based on the data of Ukrainian state bodies; one could also consider fake news collected by relevant Western resources, for example, EUvsDisinfo.

References

Ackoff, Russell. “From Data to Wisdom.” Journal of Applied Systems Analysis 16, no. 3 (1989): 1–9. https://www-public.imtbs-tsp.eu/~gibson/Teaching/Teaching-ReadingMaterial/Ackoff89.pdf

Adjin-Tettey, Theodora Dame. “Combating Fake News, Disinformation, and Misinformation: Experimental Evidence for Media Literacy Education.” Cogent Arts & Humanities 9, no. 1 (2022). https://doi.org/10.1080/23311983.2022.2037229.

Adolphus, M. How to...Use Discourse Analysis. Emerald Publishing, 2023. https://www.emeraldgrouppublishing.com/how-to/research/data-analysis/use-discourse-analysis

Ahmed, Alim Al Ayub, Ayman Aljarbouh, and Myung Suh Choi. “Detecting Fake News Using Machine Learning: A Systematic Literature Review.” arXiv preprint arXiv:2102.04458 (2021). https://arxiv.org/pdf/2102.04458.pdf

Benkler, Y., R. Faris, and H. Roberts. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. New York: Oxford University Press, 2018.

Bennett, W. Lance, and Steven Livingston. “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions.” European Journal of Communication 33, no. 2 (2018): 122–139. https://doi.org/10.1177/0267323118760317

Berelson, B. Content Analysis in Communication Research. New York: Free Press, 1952.

Bhatia, V. Kiran, Mariam Elhussein, Ben Kreimer, and Trevor Snapp. “Internet Shutdown and Regime-Imposed Disinformation Campaigns.” Communicatio 49, no. 8 (2023): 53–71. https://www.tandfonline.com/doi/abs/10.1080/02500167.2023.2230391

Bittman, Ladislav. “The Use of Disinformation by Democracies.” International Journal of Intelligence and CounterIntelligence 4, no. 2 (1990): 243–261. https://doi.org/10.1080/08850609008435142

Brahmachary, Ayan. “DIKW Model: Explaining the DIKW Pyramid or DIKW Hierarchy.” Certguidance (2019). https://www.certguidance.com/explaining-dikw-hierarchy/

Bridgman, A., E. Merkley, P. Loewen, T. Owen, D. Ruths, L. Teichmann, and O. Zhilin. “The Causes and Consequences of COVID-19 Misperceptions: Understanding the Role of News and Social Media.” HKS Misinformation Review, 1 (2020). https://doi.org/10.37016/mr-2020-028

Chen, D., and C. J. Anderson. “Quantitative Research and Educational Measurement.” International Encyclopedia of Education (Fourth Edition) (2023), https://www.sciencedirect.com/topics/social-sciences/pearson-correlation-coefficient

Clements, M. T. “Shock and Awe: The Effects of Disinformation in Military Confrontation.” Policy Studies 35, no. 3 (2014): 211–220. https://www.tandfonline.com/doi/abs/10.1080/01442872.2014.886679

“Disinformation and ‘Fake News’: Interim Report.” House of Commons Select Committee on Culture, Media, and Sport, July 29 (2018).  https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/36304.htm#_idTextAnchor002

“Disinformation is not just any piece of “fake news” – It’s the deliberate dissemination of false or misleading information. Interview.” Heinrich Böll Stiftung (2022). https://il.boell.org/en/2022/03/04/disinformation-strategic-communication-its-purposeful-deliberate-strategic-dissemination

“Disinformation Pathways and Effects: Case Studies from Five African Countries.” CIPESA (2022). https://cipesa.org/wp-content/files/briefs/report/Disinformation-Pathways-and-Effects-Case-Studies-from-Five-African-Countries-Report-2.pdf

“Final Report of the High Level Expert Group on Fake News and Online Disinformation.” European Commission, March 12 (2018). https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation

Freelon, D., and Ch. Wells. “Disinformation as Political Communication.” Political Communication 37, no. 2 (2020): 145–156. https://doi.org/10.1080/10584609.2020.1723755

Gibbons, A., and A. Carson. “What is Misinformation and Disinformation? Understanding Multi-Stakeholders’ Perspectives in the Asia Pacific.” Australian Journal of Political Science 57, no. 3 (2022): 231–247. https://www.tandfonline.com/doi/abs/10.1080/10361146.2022.2122776

Hasa. Difference Between Content Analysis and Discourse Analysis. PEDIAA (2016). https://pediaa.com/difference-between-content-analysis-and-discourse-analysis/

Herman, E. S. “The Propaganda Model Revisited.” Monthly Review, January 01 (2018). https://monthlyreview.org/2018/01/01/the-propaganda-model-revisited/

Hoffman, E. “Changing Minds: 4 Scientific Models of Persuasion.” LIFE (2020). https://www.lifeintelligence.io/blog/changing-minds-the-science-of-persuasion

Hsieh, H. F., and S. E. Shannon. “Three Approaches to Qualitative Content Analysis.” Qualitative Health Research 15(9) (2005): 1277–1288. https://journals.sagepub.com/doi/10.1177/1049732305276687

Ireton, C., and J. Posetti (Eds.). Journalism, Fake News & Disinformation: Handbook for Journalism Education and Training. Unesco Publishing (2018). https://en.unesco.org/fightfakenews

Iyengar, Sh., and D. Massey. “Scientific Communication in a Post-Truth Society.” PNAS 116, no. 16, 16 April (2019): 7656–7661. https://doi.org/10.1073/pnas.1805868115

Kameneva, N. “Linguistic Techniques and Methods of Mass Media Influence on the Public Consciousness.” Scientific Research and Development Socio-Humanitarian Research and Technology 7, no. 1 (2018): 67–73. https://doi.org/10.12737/article_5ad9bac171a519.20927468

Kavanagh, J., and M. Rich. “Truth Decay, an Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life.” RAND Corporation (2018). https://www.rand.org/pubs/research_reports/RR2314.html.

Khanam, Z. et al. “Fake News Detection Using Machine Learning Approaches.” IOP Conference Series: Materials Science and Engineering (2021). https://iopscience.iop.org/article/10.1088/1757-899X/1099/1/012040/pdf

Korsun, Ch. “Exploring News Engagement among Young Adults through Motivational Core Drives.” DiVA (2022). https://www.diva-portal.org/smash/get/diva2:1715154/FULLTEXT01.pdf

Livingstone, S. “The Participation Paradigm in Audience Research.” The Communication Review 16, no. 1–2 (2013): 21–30. http://eprints.lse.ac.uk/49630/1/Livingstone_Participation-paradigm-in-audience-research_2013.pdf

Luo, A. “Critical Discourse Analysis: Definition, Guide & Examples.” Scribbr (2023). https://www.scribbr.com/methodology/discourse-analysis/

McKeon, M. W. “Argument, Inference, and Persuasion.” OSSA Conference Archive 22 (2020). https://scholar.uwindsor.ca/ossaarchive/OSSA12/Friday/22

Medhat, W., A. Hassan, and H. Korashy. “Sentiment Analysis Algorithms and Applications: A Survey.” Ain Shams Engineering Journal 5, no. 4 (2014): 1093–1113. https://www.sciencedirect.com/science/article/pii/S2090447914000550

“Media Effects Models: Elaborated Models.” Communication (2023). https://communication.iresearchnet.com/media/media-effects-models-elaborated-models/

Mykus, S. et al. “Boyovyy dosvid z pytanʹ informatsiynoyi bezpeky otrymanyy pid chas rosiysʹko-ukrayinsʹkoyi viyny [Combat experience in information security gained during the Russian-Ukrainian war], Chastyna persha [Part One] (lyutyy 2014 – berezenʹ 2022 roku) [February 2014 – March 2022]: zbirnyk informatsiyno-analitychnykh materialiv [a collection of information and analytical materials].” Kyiv: Ivan Chernyakhovskyi National University of Defense of Ukraine, 2022. (in Ukrainian)

Mykus, S. et al. “Boyovyy dosvid z pytanʹ informatsiynoyi bezpeky otrymanyy pid chas rosiysʹko-ukrayinsʹkoyi viyny [Combat experience in information security gained during the Russian-Ukrainian war], Chastyna druha [Part Two] (berezen 2022 – liutyi 2023) [March 2022 – February 2023]: zbirnyk informatsiyno-analitychnykh materialiv [a collection of information and analytical materials].” Kyiv: Ivan Chernyakhovskyi National University of Defense of Ukraine, 2023. (in Ukrainian)

OECD. Disinformation and Russia’s War of Aggression against Ukraine: Threats and Governance Responses. 03 November (2022). https://www.oecd.org/ukraine-hub/policy-responses/disinformation-and-russia-s-war-of-aggression-against-ukraine-37186bde/

Romerstein, H. “Disinformation as a KGB Weapon in the Cold War.” Journal of Intelligence History 1, no. 1 (2001): 54–67. https://doi.org/10.1080/16161262.2001.10555046

Rowley, J. “The Wisdom Hierarchy: Representations of the DIKW Hierarchy.” Journal of Information Science 33, no. 2 (2007): 163–180. https://doi.org/10.1177/0165551506070706

Shoben, E. J. “Review of: Communication and Persuasion.” Journal of Consulting Psychology 18, no. 2 (1954): 152–152. https://doi.org/10.1037/h0053111

Silge, Julia, and David Robinson. Text Mining with R: A Tidy Approach (1st ed.). O’Reilly Media, Inc., 2017. https://www.tidytextmining.com/

“Intentionality.” Stanford Encyclopedia of Philosophy (2023). https://plato.stanford.edu/entries/intentionality/

Taboada, M. “Sentiment Analysis: An Overview from Linguistics.” Annual Review of Linguistics 2 (2016): 325–347. Pre-publication version. https://core.ac.uk/download/pdf/85004137.pdf

Ullman, H., and J. Wade. “Shock and Awe: Achieving Rapid Dominance” (2003). http://www.dodccrp.org/files/Ullman_Shock.pdf

Ushakova, T., N. Pavlova, V. Latynov, V. Tseptsov, and K. Alekseyev. Slovo v deystvii. Intent-analiz politicheskogo diskursa [The word in action. Intent analysis of political discourse]. SPb.: Aleteyya (2000). (in Russian)

Vosoughi, S., D. Roy, and S. Aral. “The Spread of True and False News Online.” Science 359, no. 6380 (2018): 1146–1151. https://doi.org/10.1126/science.aap9559

Waltzman, R. “The Weaponization of Information.” RAND (2017). https://www.rand.org/pubs/testimonies/CT473.html

Yuskiv, B., N. Karpchuk, and S. Khomych. “Media Reports as a Tool of Hybrid and Information Warfare (the case of RT – Russia Today).” Codrul Cosminului 27, no. 1 (2021): 235–258. https://codrulcosminului.usv.ro/article-12-vol-27-1-2021/

Zasiekina, L., and S. Zasiekin. Psykholinhvistychna diahnostyka [Psycholinguistic diagnosis]. Lutsk: RVV ‘Vezha’ (2008). (in Ukrainian)

Annex A

Characteristics of intentions by groups of reference objects

Intentions (mode*): Reconstruction of intentions

Object 1: “We” (Russia)
• successes/achievements of the SMO (special military operation) (+): impressive results and efficiency of the SMO that demonstrate the high level of professionalism and military achievements of the Russian Federation
• self-presentation (+): presenting the Russian Federation in an attractive, favorable light
• deflection of accusation/criticism (+): denial of guilt attributed to the Russian Federation and of negative judgments about the RF or its actions
• justification (+): providing arguments and/or facts to prove the rightness of the Russian Federation and to show the absurdity of its condemnation by the international community
• caution (+): refraining from situations or actions to prevent possible negative consequences or problems
• self-criticism (–): the Russian Federation criticizes itself

Object 2: “They” (Ukraine)
• accusation (–): attributing any guilt to Ukraine
• revelation (–): detection of improper actions, intentions, or negative qualities of Ukraine
• discredit (–): presenting facts and arguments that undermine trust in Ukraine and diminish its authority
• criticism/mockery (–): a negative judgment or ridicule of Ukraine and its actions
• nationalism/nazism (–): the priority of national interests and cultural purity of Ukrainians, accompanied by extremism/discrimination based on racial, ethnic or ideological characteristics
• Russophobia (–): a negative/hostile attitude to everything associated with Russia and the Russians
• dependence on the West (–): Ukraine relies heavily on the West’s economic, political or cultural support, or the West exerts significant influence on its decisions and actions
• division of society (–): conflict or lack of unity in Ukrainian society, in particular concerning values, views, ideologies or interests
• intimidation of Ukraine (–): the RF’s acts/verbal statements intended to create fear, a sense of danger or the promise of trouble in Ukraine in order to control, punish or achieve certain goals of the Russian Federation

Object 3: “Third party” (the West and other countries)
• accusation (–): attributing some guilt to the West
• discredit (–): presenting facts and arguments that undermine trust in the West and diminish its authority
• criticism/mockery (–): negative judgment or ridicule of the West and its actions
• nationalism/nazism (–): the priority of national interests and Western culture, accompanied by extremism/discrimination on the basis of racial, ethnic or ideological characteristics
• Russophobia (–): a negative/hostile attitude to everything associated with Russia and the Russians
• support of the RF (+): the West’s support of certain actions of the RF or of the RF as a whole
• refusal to support Ukraine (0): the West’s refusal, or encouragement of refusal, to provide support to Ukraine
• humiliation of Ukraine (–): negative statements or actions of Western politicians or famous people aimed at accusing, insulting or humiliating Ukraine and the Ukrainians
• encouragement to support the RF (0): encouragement to take the side of the Russian Federation through the declaration of the RF’s “great mission” to fight any evil associated with Ukraine
• intimidation of the West (–): the RF’s acts/verbal statements intended to create fear, a sense of danger or the promise of trouble in Western states in order to control, punish or achieve certain goals of the Russian Federation

* Note: The mode of an intention is marked as “–” (negative), “0” (neutral), or “+” (positive).
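
For illustration, the Annex A scheme lends itself to a simple machine-readable representation: each annotated fake-news item carries a reference object, an intention label, and a mode. The sketch below is ours, not the authors’ software; all class and variable names are illustrative assumptions.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    NEGATIVE = "-"   # "–" in the annex; ASCII hyphen used here
    NEUTRAL = "0"
    POSITIVE = "+"

@dataclass(frozen=True)
class Intention:
    obj: str     # reference object: "We (Russia)", "They (Ukraine)", "Third party"
    label: str   # intention name from Annex A
    mode: Mode   # evaluative mode of the intention

# Hypothetical annotations for two fake-news items
annotations = [
    Intention("We (Russia)", "justification", Mode.POSITIVE),
    Intention("They (Ukraine)", "discredit", Mode.NEGATIVE),
]

# Counting records per reference object is the first step toward
# frequency tables of the kind reported in Annex B
per_object = Counter(a.obj for a in annotations)
print(per_object)  # Counter({'We (Russia)': 1, 'They (Ukraine)': 1})
```

Aggregating such records by object, intention and period yields the frequency distribution presented in Annex B.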

Annex B

Share of intentions in the RF’s fake news*

Object / Intention | Total n (%) | Period 1 n (%) | Period 2 n (%) | Period 3 n (%) | Period 4 n (%) | Period 5 n (%)

We (Russia) | 855 (31,4) | 17 (44,7) | 451 (36,5) | 287 (37,8) | 74 (19,4) | 26 (8,5)
  deflection of accusation/criticism | 244 (9,0) | 5 (13,2) | 133 (10,8) | 73 (9,6) | 25 (6,5) | 8 (2,6)
  justification | 238 (8,8) | 8 (21,1) | 116 (9,4) | 101 (13,3) | 11 (2,9) | 2 (0,7)
  successes/achievements of the SMO | 176 (6,5) | 0 (0,0) | 101 (8,2) | 48 (6,3) | 16 (4,2) | 11 (3,6)
  caution | 131 (4,8) | 3 (7,9) | 66 (5,3) | 54 (7,1) | 8 (2,1) | 0 (0,0)
  self-presentation | 66 (2,4) | 1 (2,6) | 35 (2,8) | 11 (1,4) | 14 (3,7) | 5 (1,6)
They (Ukraine) | 1168 (42,9) | 16 (42,1) | 511 (41,4) | 289 (38,1) | 182 (47,6) | 170 (55,4)
  discredit | 360 (13,2) | 3 (7,9) | 195 (15,8) | 67 (8,8) | 45 (11,8) | 50 (16,3)
  intimidation of Ukraine | 178 (6,5) | 0 (0,0) | 38 (3,1) | 52 (6,9) | 48 (12,6) | 40 (13,0)
  accusation | 152 (5,6) | 10 (26,3) | 74 (6,0) | 35 (4,6) | 18 (4,7) | 15 (4,9)
  revelation | 148 (5,4) | 2 (5,3) | 76 (6,2) | 43 (5,7) | 13 (3,4) | 14 (4,6)
  dependence on the West | 90 (3,3) | 0 (0,0) | 28 (2,3) | 33 (4,3) | 9 (2,4) | 20 (6,5)
  criticism/mockery | 89 (3,3) | 1 (2,6) | 41 (3,3) | 22 (2,9) | 16 (4,2) | 9 (2,9)
  nationalism/nazism | 76 (2,8) | 0 (0,0) | 30 (2,4) | 22 (2,9) | 15 (3,9) | 9 (2,9)
  division of society | 48 (1,8) | 0 (0,0) | 23 (1,9) | 9 (1,2) | 13 (3,4) | 3 (1,0)
  Russophobia | 27 (1,0) | 0 (0,0) | 6 (0,5) | 6 (0,8) | 5 (1,3) | 10 (3,3)
Third party | 697 (25,6) | 5 (13,2) | 272 (22,0) | 183 (24,1) | 126 (33,0) | 111 (36,2)
  humiliation of Ukraine | 196 (7,2) | 3 (7,9) | 116 (9,4) | 40 (5,3) | 16 (4,2) | 21 (6,8)
  refusal to support Ukraine | 113 (4,2) | 0 (0,0) | 48 (3,9) | 45 (5,9) | 9 (2,4) | 11 (3,6)
  accusation | 89 (3,3) | 0 (0,0) | 34 (2,8) | 21 (2,8) | 12 (3,1) | 22 (7,2)
  intimidation of the West | 79 (2,9) | 0 (0,0) | 25 (2,0) | 21 (2,8) | 17 (4,5) | 16 (5,2)
  discredit | 78 (2,9) | 0 (0,0) | 8 (0,6) | 19 (2,5) | 28 (7,3) | 23 (7,5)
  criticism/mockery | 54 (2,0) | 0 (0,0) | 11 (0,9) | 11 (1,4) | 21 (5,5) | 11 (3,6)
  encouragement to support the RF | 36 (1,3) | 2 (5,3) | 15 (1,2) | 13 (1,7) | 6 (1,6) | 0 (0,0)
  support of the RF | 23 (0,8) | 0 (0,0) | 2 (0,2) | 7 (0,9) | 8 (2,1) | 6 (2,0)
  nationalism/nazism | 15 (0,6) | 0 (0,0) | 7 (0,6) | 3 (0,4) | 4 (1,0) | 1 (0,3)
  Russophobia (of the West) | 14 (0,5) | 0 (0,0) | 6 (0,5) | 3 (0,4) | 5 (1,3) | 0 (0,0)
Total | 2720 (100) | 38 (100) | 1234 (100) | 759 (100) | 382 (100) | 307 (100)

* Considering the assessments of Ukrainian and Western experts, the studied period is divided into five stages which reflect different phases of military operations (Pecherskyi & Buket 2023; Walker 2023; Westfall 2023): 1) 17.02.2022 – 23.02.2022 – the RF’s prelude to the invasion; 2) 24.02.2022 – 31.05.2022 – Russia’s invasion of Ukraine and international condemnation of its actions; 3) 01.06.2022 – 31.08.2022 – Ukraine’s accumulation of forces to liberate the territories occupied by Russia; 4) 01.09.2022 – 30.11.2022 – Ukraine’s offensive operations and the liberation of some territories; 5) 01.12.2022 – 28.02.2023 – Ukraine’s defensive actions and the depletion of the Russian Armed Forces.
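
The shares in the table follow directly from the counts: each cell’s percentage is its n divided by the total of its column (period). A minimal sketch, assuming nothing beyond the published counts (the function and variable names are ours), reproduces the object-level shares for Period 1:

```python
# Object-level counts for Period 1, taken from the table above
counts_period_1 = {"We (Russia)": 17, "They (Ukraine)": 16, "Third party": 5}

def shares(counts):
    """Convert raw intention counts into column percentages."""
    total = sum(counts.values())  # 38 fake news items in Period 1
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

print(shares(counts_period_1))
# {'We (Russia)': 44.7, 'They (Ukraine)': 42.1, 'Third party': 13.2}
```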


1. Rand Waltzman, “The Weaponization of Information,” RAND (2017), https://www.rand.org/pubs/testimonies/CT473.html

2. OECD, Disinformation and Russia’s War of Aggression against Ukraine: Threats and Governance Responses, 03 November (2022), https://www.oecd.org/ukraine-hub/policy-responses/disinformation-and-russia-s-war-of-aggression-against-ukraine-37186bde/

3. Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, Iss. 6380 (2018): 1146–1151, https://doi.org/10.1126/science.aap9559

4. Mathew T. Clements, “Shock and Awe: The Effects of Disinformation in Military Confrontation,” Policy Studies 35, No. 3 (2014): 211–220, https://doi.org/10.1080/01442872.2014.886679

5. Mathew T. Clements, “Shock and Awe: The Effects of Disinformation in Military Confrontation,” Policy Studies 35, No. 3 (2014): 211, https://doi.org/10.1080/01442872.2014.886679

6. Harlan Ullman and James Wade, “Shock and Awe: Achieving Rapid Dominance” (2003), http://www.dodccrp.org/files/Ullman_Shock.pdf

7. Theodora Dame Adjin-Tettey, “Combating Fake News, Disinformation, and Misinformation: Experimental Evidence for Media Literacy Education,” Cogent Arts & Humanities 9, Issue 1 (2022), https://www.tandfonline.com/doi/full/10.1080/23311983.2022.2037229

8. “Disinformation and ‘Fake News’: Interim Report,” House of Commons Select Committee on Culture, Media, and Sport, July 29 (2018), https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/36304.htm#_idTextAnchor002

9. Walaa Medhat, Ahmed Hassan, and Hoda Korashy, “Sentiment Analysis Algorithms and Applications: A Survey,” Ain Shams Engineering Journal 5, Iss. 4 (2014): 1093–1113, https://www.sciencedirect.com/science/article/pii/S2090447914000550; Maite Taboada, “Sentiment Analysis: An Overview from Linguistics,” Annual Review of Linguistics 2 (2016): 325–347, pre-publication version, https://core.ac.uk/download/pdf/85004137.pdf

10. Alim Al Ayub Ahmed, Ayman Aljarbouh, Praveen Kumar Donepudi, and Myung Suh Choi, “Detecting Fake News using Machine Learning: A Systematic Literature Review,” GeeksforGeeks (2022), https://arxiv.org/pdf/2102.04458.pdf; Z. Khanam et al., “Fake News Detection Using Machine Learning Approaches,” IOP Conference Series: Materials Science and Engineering (2021), https://iopscience.iop.org/article/10.1088/1757-899X/1099/1/012040/pdf

11. Margaret Adolphus, “How to... Use Discourse Analysis,” Emerald Publishing (2023), https://www.emeraldgrouppublishing.com/how-to/research/data-analysis/use-discourse-analysis; Amy Luo, “Critical Discourse Analysis: Definition, Guide & Examples,” Scribbr (2023), https://www.scribbr.com/methodology/discourse-analysis/

12. Bernard Berelson, Content Analysis in Communication Research (New York: Free Press, 1952); Hsiu-Fang Hsieh and Sarah E. Shannon, “Three Approaches to Qualitative Content Analysis,” Qualitative Health Research 15(9) (2005): 1277–1288, https://journals.sagepub.com/doi/10.1177/1049732305276687

13. Hasa, “Difference Between Content Analysis and Discourse Analysis,” PEDIAA (2016), https://pediaa.com/difference-between-content-analysis-and-discourse-analysis/

14. Edward S. Herman, “The Propaganda Model Revisited,” Monthly Review, January 01 (2018), https://monthlyreview.org/2018/01/01/the-propaganda-model-revisited/; “Media Effects Models: Elaborated Models,” Communication (2023), https://communication.iresearchnet.com/media/media-effects-models-elaborated-models/

15. E. J. Shoben Jr., “Review of: Communication and Persuasion,” Journal of Consulting Psychology 18(2) (1954): 152–152, https://doi.org/10.1037/h0053111; Elisheva Hoffman, “Changing Minds: 4 Scientific Models of Persuasion,” LIFE (2020), https://www.lifeintelligence.io/blog/changing-minds-the-science-of-persuasion

16. Christina Korsun, “Exploring News Engagement among Young Adults through Motivational Core Drives,” DiVA (2022), https://www.diva-portal.org/smash/get/diva2:1715154/FULLTEXT01.pdf

17. N. Kameneva, “Linguistic Techniques and Methods of Mass Media Influence on the Public Consciousness,” Scientific Research and Development Socio-Humanitarian Research and Technology 7(1) (2018): 67–73, https://doi.org/10.12737/article_5ad9bac171a519.20927468

18. Mathew W. McKeon, “Argument, Inference, and Persuasion,” OSSA Conference Archive 22 (2020), https://scholar.uwindsor.ca/ossaarchive/OSSA12/Friday/22

19. Deen Freelon and Chris Wells, “Disinformation as Political Communication,” Political Communication 37, no. 2 (2020): 145–156, https://doi.org/10.1080/10584609.2020.1723755

20. Ladislav Bittman, “The Use of Disinformation by Democracies,” International Journal of Intelligence and CounterIntelligence 4(2) (1990): 243–261, https://doi.org/10.1080/08850609008435142

21. Herbert Romerstein, “Disinformation as a KGB Weapon in the Cold War,” Journal of Intelligence History 1(1) (2001): 54, https://doi.org/10.1080/16161262.2001.10555046

22. Cherilyn Ireton and Julie Posetti (Eds.), Journalism, Fake News & Disinformation: Handbook for Journalism Education and Training, Unesco Publishing (2018): 1, https://en.unesco.org/fightfakenews

23. “Final Report of the High Level Expert Group on Fake News and Online Disinformation,” European Commission, March 12 (2018), https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation

24. Andrew Gibbons and Andrea Carson, “What is Misinformation and Disinformation? Understanding Multi-stakeholders’ Perspectives in the Asia Pacific,” Australian Journal of Political Science 57, Issue 3 (2022): 231–247, https://www.tandfonline.com/doi/abs/10.1080/10361146.2022.2122776

25. Deen Freelon and Chris Wells, “Disinformation as Political Communication,” Political Communication 37, no. 2 (2020): 145–156, https://doi.org/10.1080/10584609.2020.1723755

26. Shanto Iyengar and Douglas S. Massey, “Scientific Communication in a Post-Truth Society,” PNAS 116(16), 16 April (2019): 7656–7661, https://doi.org/10.1073/pnas.1805868115

27. Jennifer Kavanagh and Michael D. Rich, “Truth Decay, an Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life,” RAND Corporation (2018), https://www.rand.org/pubs/research_reports/RR2314.html

28. Aengus Bridgman, Eric Merkley, Peter John Loewen, Taylor Owen, Derek Ruths, Lisa Teichmann, and Oleg Zhilin, “The Causes and Consequences of COVID-19 Misperceptions: Understanding the Role of News and Social Media,” HKS Misinformation Review 1 (2020), https://doi.org/10.37016/mr-2020-028

29. W. Lance Bennett and Steven Livingston, “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions,” European Journal of Communication 33(2) (2018): 124, https://doi.org/10.1177/0267323118760317

30. “Disinformation Pathways and Effects: Case Studies from Five African Countries,” CIPESA (2022), https://cipesa.org/wp-content/files/briefs/report/Disinformation-Pathways-and-Effects-Case-Studies-from-Five-African-Countries-Report-2.pdf

31. Kiran Vinod Bhatia, Mariam Elhussein, Ben Kreimer, and Trevor Snapp, “Internet Shutdown and Regime-Imposed Disinformation Campaigns,” Communicatio 49, Issue 8 (2023): 53–71, https://www.tandfonline.com/doi/abs/10.1080/02500167.2023.2230391

32. “Disinformation Pathways and Effects: Case Studies from Five African Countries,” CIPESA (2022): 4, https://cipesa.org/wp-content/files/briefs/report/Disinformation-Pathways-and-Effects-Case-Studies-from-Five-African-Countries-Report-2.pdf

33. Sonia Livingstone, “The Participation Paradigm in Audience Research,” The Communication Review 16, no. 1–2 (2013): 21–30, http://eprints.lse.ac.uk/49630/1/Livingstone_Participation-paradigm-in-audience-research_2013.pdf

34. Yochai Benkler, Robert Faris, and Hal Roberts, Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (New York: Oxford University Press, 2018).

35. W. Lance Bennett and Steven Livingston, “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions,” European Journal of Communication 33(2) (2018): 122–139, https://doi.org/10.1177/0267323118760317

36. “Disinformation is not just any piece of ‘fake news’ – It’s the deliberate dissemination of false or misleading information. Interview,” Heinrich Böll Stiftung (2022), https://il.boell.org/en/2022/03/04/disinformation-strategic-communication-its-purposeful-deliberate-strategic-dissemination

37. R. L. Ackoff, “From Data to Wisdom,” Journal of Applied Systems Analysis 16 (1989): 3–9, https://www-public.imtbs-tsp.eu/~gibson/Teaching/Teaching-ReadingMaterial/Ackoff89.pdf

38. Jennifer Rowley, “The Wisdom Hierarchy: Representations of the DIKW Hierarchy,” Journal of Information Science 33(2) (2007): 163–180, https://doi.org/10.1177/0165551506070706

39. Ayan Brahmachary, “DIKW Model: Explaining the DIKW Pyramid or DIKW Hierarchy,” Certguidance (2019), https://www.certguidance.com/explaining-dikw-hierarchy/

40. “Intentionality,” Stanford Encyclopedia of Philosophy (2023), https://plato.stanford.edu/entries/intentionality/

41. The first stage covers the period of February 20, 2014 – February 23, 2022.

42. S. Mykus et al., “Boyovyy dosvid z pytanʹ informatsiynoyi bezpeky otrymanyy pid chas rosiysʹko-ukrayinsʹkoyi viyny [Combat experience in information security gained during the Russian-Ukrainian war], Chastyna persha [Part One] (lyutyy 2014 – berezenʹ 2022 roku) [February 2014 – March 2022]: zbirnyk informatsiyno-analitychnykh materialiv [a collection of information and analytical materials],” Kyiv: Ivan Chernyakhovskyi National University of Defense of Ukraine (2022) (in Ukrainian); S. Mykus et al., “Boyovyy dosvid z pytanʹ informatsiynoyi bezpeky otrymanyy pid chas rosiysʹko-ukrayinsʹkoyi viyny [Combat experience in information security gained during the Russian-Ukrainian war], Chastyna druha [Part Two] (berezen 2022 – liutyi 2023) [March 2022 – February 2023]: zbirnyk informatsiyno-analitychnykh materialiv [a collection of information and analytical materials],” Kyiv: Ivan Chernyakhovskyi National University of Defense of Ukraine (2023) (in Ukrainian)

43. Tatiana Ushakova, N. Pavlova, V. Latynov, V. Tseptsov, and K. Alekseyev, “Slovo v deystvii. Intent-analiz politicheskogo diskursa [The word in action. Intent analysis of political discourse],” SPb.: Aleteyya (2000) (in Russian); Larysa Zasiekina and Serhii Zasiekin, “Psykholinhvistychna diahnostyka [Psycholinguistic diagnosis],” Lutsk: RVV ‘Vezha’ (2008) (in Ukrainian)

44. Bohdan Yuskiv, Nataliia Karpchuk and Serhii Khomych, “Media Reports as a Tool of Hybrid and Information Warfare (the case of RT – Russia Today),” Codrul Cosminului XXVII, No. 1 (2021): 235–258, https://codrulcosminului.usv.ro/article-12-vol-27-1-2021/

45. Julia Silge and David Robinson, Text Mining with R: A Tidy Approach (1st ed.), O’Reilly Media, Inc. (2017), https://www.tidytextmining.com/

46. Dandan Chen and Carolyn J. Anderson, “Quantitative Research and Educational Measurement,” International Encyclopedia of Education (Fourth Edition) (2023): 20, https://www.sciencedirect.com/topics/social-sciences/pearson-correlation-coefficient