Written by Eric Adamson
“The problem with phrasings in which the noun ‘war’ is qualified by an adjective such as ‘hybrid’ is that they sound like ‘war minus’ when what they really mean is ‘war plus.’”
–Timothy Snyder, The Road to Unfreedom
Russian Disinformation: History and Modern Goals
Since the 2016 US Presidential Election, Western governments have become acutely aware of Russian disinformation, its tactics, and its general aims. “Disinformation” can be traced back to the Soviet term dezinformatsiya, a tactic that deliberately spread false information in order to undermine foreign publics’ trust in official versions of events, manipulate opinion, and distort reality for strategic gain. Modern disinformation is a low-risk tool in the larger arsenal of hybrid warfare. Hybrid warfare operates in a grey zone between war and peace. It mixes conventional and covert military action with coercive economic, psychological, and cyber capabilities to achieve political, even territorial, gains without escalating to overt conventional warfare (Brands 2016; Bugajski & Assenova 2016; Pugsley & Wesslau 2016). These tactics have proven successful in regions along Russia’s borders, such as Georgia, Ukraine, and Transnistria, but have now moved beyond the testing grounds to the West at large.
With targeted disinformation campaigns, Russia seeks to increase its relative power by destabilizing not only the cohesion of individual countries but also the European project and the transatlantic community. Thus, what the West faces is conflict as “a continuation of policy by other means,” waged not with Clausewitzian military power but with a “concealed character” through “coercive communication” (Simpson 2014; Patrikarakos 2017). If disinformation campaigns in the information space are not a prelude to conflict but part of the conflict itself, then governments must respond and defend their societies accordingly.
Many modern disinformation tactics are drawn straight from the Soviet playbook and adapted for the 21st century, namely active measures and reflexive control (Cull et al. 2017; Snegovaya 2015). Active measures attempt to influence and discredit the policies of another government as well as undermine domestic support and confidence in institutions and elected officials. Disinformation may amplify and distort real domestic issues or simply spread outright falsehoods. Reflexive control attempts to manipulate opponents into voluntarily choosing actions advantageous to Russian interests. The goal of modern Russian disinformation is not to convince or persuade, as was the case with the communist ideology that offered an alternative model, but to sow doubt and keep publics passive, demoralized, even paranoid. It may disrupt decision-making processes or cause nations to descend into factional strife, eroding democracy and the nation itself (Galeotti 2017; Helmus et al. 2018; Toucas 2017).
Such tactics have long aided other aspects of Russian warfare. Just as Russia conducted a disinformation campaign in 2014 to provide cover for the “little green men” entering Ukraine unimpeded, so too did Lenin order Bolshevik forces to enter Ukraine under the banner of a “Soviet Ukrainian Liberation Movement” in 1917 (Applebaum 2017, p. 25). Fifteen years later, when Stalin was concealing his man-made famine in Ukraine, itself a particularly cruel form of hybrid warfare, The New York Times infamously illustrated the results of Soviet disinformation when reporter Walter Duranty wrote, “there is no actual starvation or deaths from starvation but there is widespread mortality from diseases due to malnutrition.” Duranty would later receive the Pulitzer Prize for his reporting in Ukraine. Today, Russia obfuscates its alleged involvement in the downing of MH17 and the Skripal poisoning by polluting the 21st-century media ecology with numerous, often contradictory, conspiracy theories and outright fake news (Diamond et al. 2016; Maréchal 2017; Richey 2017). Disinformation, then as now, confuses and impedes a coherent Western response.
To be sure, not all disinformation comes from foreign sources. Far from it. Western politicians are often responsible for the creation and distribution of disinformation as they seek public support for their policies and candidacies (Marwick & Lewis 2017; Berinsky 2017). Lying has always been part of politics, but the frequency with which politicians bend the truth or speak outright falsehoods, not just in the American context, has reached unprecedented levels, most notably in the case of President Donald Trump (Skjeseth 2017; McGranahan 2017).
While some nations like Ukraine and the Baltic states have been combatting Russian active measures in the information space for over a decade, others like Britain and the United States have only recently been exposed to them and remain at risk. In the US presidential election and the Brexit campaign, Anglo-Saxon publics were the targets of Russian information campaigns. The US intelligence community (the CIA, FBI, and NSA) found that American voters were targeted with messages to “undermine public faith in the US democratic process…and aspired to help President-elect Trump’s election chances” (Intelligence Community Assessment 2017). Social media companies additionally disclosed Russia’s role to Congress, providing evidence of sustained paid advertising by the Kremlin-linked Internet Research Agency that reached over 126 million Americans (Isaac & Wakabayashi 2017). In the Brexit campaign, a UK communications company found that Russia spent over €4 million on pro-Brexit social media influence (Harris 2018).
Learning from these lessons, both France and Germany took measures to secure their own elections in 2017. In France, hours before the final round of the presidential election, thousands of emails from Macron’s campaign were leaked, some of which were doctored. France’s strong electoral-integrity institutions, however, raised awareness of Russian tactics and responded accordingly and transparently, resulting in the clearest example of a failed Russian electoral influence campaign (Jeangène-Vilmer 2018). In Germany, political parties agreed not to use bots in their online campaigns, which have the potential to distort issues, sway public opinion, and potentially change election outcomes (Woolley 2016).
Recognizing disinformation as a non-trivial security matter has been an important first step in combatting Russian hybrid warfare and preventing future attacks in the information space, but Western governments remain vulnerable, often slow to implement and coordinate mechanisms to protect their democracies. Numerous policy recommendations and counterstrategies have been proposed to combat disinformation, and a modest number have been acted upon (Cull et al. 2017; European Commission 2018; Fried and Polyakova 2018; Lucas and Pomerantsev 2016; Paul and Matthews 2016; Toucas 2017; West 2017).
For example, the US and 15 other EU and NATO nations signed on to establish the European Centre of Excellence for Countering Hybrid Threats in April 2017; the EU established the East StratCom Task Force in 2015, tasked with debunking Russian disinformation and supporting genuine journalism in former Soviet countries; the EU also released a new strategy for tackling online disinformation and proposed legislation to combat it, including an “EU-wide Code of Practice on Disinformation,” in April 2018; Germany created a 13,500-strong Cyber and Information Space Command as the sixth branch of the German Armed Forces; Sweden launched a nationwide school program to teach students to identify Russian propaganda; and the British Foreign Office is developing a “content factory” to help EU Association and Baltic countries create new Russian-language media content that provides a counter-narrative (European Commission 2018; Stelzenmüller 2017). In the private sphere, Facebook and Twitter have identified and deleted state-backed accounts disseminating disinformation.
Though the above efforts address the need to educate Europe’s own populations in media literacy, set media standards, craft and strengthen a Western narrative, and debunk and expose Russian disinformation, they remain fragmented and often underfunded. Efforts are further impeded when countries’ media and leaders, deliberately or inadvertently, spread Kremlin talking points and cast doubt upon Washington, Brussels, and EU Member State institutions.
There is a vast literature of policy proposals to combat disinformation. Each country’s specific context—its values, history, legal precedents, and self-conceptualized image—must inform the choice of policies most apt for its domestic and foreign publics. Most importantly, counter-actions must adhere to shared Western democratic values and ideals. While Western governments may conduct information campaigns, pushing their narratives, world view, and values to foreign publics, they cannot deliberately spread false information. Shutting down information channels through online censorship, while effective in stopping the spread of disinformation, likewise violates Western nations’ commitment to free speech. To do so would only undercut Western nations’ own values and soft power. There is thus a certain asymmetry to information warfare between democracies and authoritarian-leaning regimes. But if the “West” is to mean anything at all, Western states must find policies that adhere to their own democratic and liberal ideals.
Such efforts begin by looking inward. If a Russian internet troll with poor English grammar is able to sway a Western voter, that says more about the state of the West’s own democratic health than about the effectiveness of Russian tactics. This is not to trivialize the powerful psychological effects of these tactics. Indeed, improving the resilience of the average voter and rebuilding trust in democracy will be some of the most important, and most difficult, long-term projects Western governments will undertake.
References
Applebaum, Anne. (2017). Red Famine: Stalin’s War on Ukraine. New York: Doubleday.
Berinsky, Adam J. (2017). “Rumors and Health Care Reform: Experiments in Political Misinformation.” British Journal of Political Science 47(2): 241-262.
Brands, Hal. (2016). “Paradoxes of the Gray Zone.” Foreign Policy Research Institute, February 5, 2016.
Bugajski, Janusz, and Margarita Assenova. (2016). Eurasian Disunion: Russia’s Vulnerable Flanks. The Jamestown Foundation.
Cull, Nicholas J., et al. (2017). “Soviet Subversion, Disinformation and Propaganda: How the West Fought Against It.” LSE Consulting, October 2017.
Diamond, Larry, Marc F. Plattner, and Christopher Walker. (2016). Authoritarianism Goes Global: The Challenge to Democracy. Johns Hopkins University Press.
European Commission, Directorate-General for Communication Networks, Content and Technology. (2018). “A Multi-Dimensional Approach to Disinformation.” Publications Office of the European Union.
Fried, Daniel, and Alina Polyakova. (2018). “Democratic Defense Against Disinformation.” Atlantic Council, February 2018.
Galeotti, Mark. (2017). “Controlling Chaos: How Russia Manages Its Political Warfare in Europe.” European Council on Foreign Relations, August 2017.
Helmus, Todd C., et al. (2018). “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe.” RAND Corporation.
Intelligence Community Assessment, Office of the Director of National Intelligence. (2017). “Background to ‘Assessing Russian Activities and Intentions in Recent US Elections’: The Analytic Process and Cyber Incident Attribution.” ICA 2017-01D, January 6, 2017.
Isaac, Mike, and Daisuke Wakabayashi. (2017). “Russian Influence Reached 126 Million Through Facebook Alone.” The New York Times, October 30, 2017.
Jeangène-Vilmer, Jean-Baptiste. (2018). “Successfully Countering Russian Electoral Interference: 15 Lessons Learned from the Macron Leaks.” Center for Strategic and International Studies, June 2018.
Lucas, Edward, and Peter Pomerantsev. (2016). “Winning the Information War Redux: Techniques and Counterstrategies to Russian Propaganda in Central and Eastern Europe.” Center for European Policy Analysis, April 2016.
Maréchal, Nathalie. (2017). “Networked Authoritarianism and the Geopolitics of Information: Understanding Russian Internet Policy.” Media and Communication 5(1): 29-41.
Marwick, Alice, and Rebecca Lewis. (2017). “Media Manipulation and Disinformation Online.” Data & Society Research Institute.
McGranahan, Carole. (2017). “An Anthropology of Lying: Trump and the Political Sociality of Moral Outrage.” American Ethnologist 44(2): 243-248.
Patrikarakos, David. (2017). War in 140 Characters: How Social Media Is Reshaping Conflict in the Twenty-First Century. Hachette Book Group.
Paul, Christopher, and Miriam Matthews. (2016). “The Russian ‘Firehose of Falsehood’ Propaganda Model.” RAND Corporation.
Pugsley, Sophia, and Fredrik Wesslau. (2016). “Russia in the Grey Zones.” European Council on Foreign Relations, September 1, 2016.
Richey, Mason. (2017). “Contemporary Russian Revisionism: Understanding the Kremlin’s Hybrid Warfare and the Strategic and Tactical Deployment of Disinformation.” Asia Europe Journal 15(54): 1-13.
Simpson, Ellie. (2014). “Neither War Nor Peace: Why the Information Revolution Makes ‘Forever Wars’ a New Normal.” Belfer Center for Science and International Affairs, December 16, 2014.
Skjeseth, Heidi Taksdal. (2017). “All the President’s Lies: Media Coverage of Lies in the US and France.” Reuters Institute for the Study of Journalism, University of Oxford.
Snegovaya, Maria. (2015). “Putin’s Information Warfare in Ukraine: Soviet Origins of Russia’s Hybrid Warfare.” Institute for the Study of War, September 2015.
Stelzenmüller, Constanze. (2017). “The Impact of Russian Interference on Germany’s 2017 Elections.” Brookings Institution, June 28, 2017.
Toucas, Boris. (2017). “Exploring the Information-Laundering Ecosystem: The Russian Case.” Center for Strategic and International Studies, August 31, 2017.
West, Darrell M. (2017). “How to Combat Fake News and Disinformation.” Brookings Institution, December 18, 2017.