Disinformation as an instrument of foreign intervention in electoral processes: a manifestation of lawfare

June 23, 2022

Major scandals over interference in foreign electoral processes have in most cases started with accusations of spreading disinformation. One of the best-known examples is the 2016 US election, during which Russia conducted coordinated campaigns on Facebook and Twitter, shared news in print and online media and manipulated the opinions of US citizens [1]. However, few people know that this is only one among hundreds, if not thousands, of similar intrusions into democratic processes around the globe. Other instances involve disinformation, manipulation and propaganda campaigns by the US in Iran and Afghanistan [2], Chinese interference in Taiwanese, Australian and US elections [3], Russian intrusion into Ukrainian electoral processes [4] and many other less famous, but still influential, examples.

According to the EU Commission, disinformation is «verifiably false or misleading information that is created, presented, and disseminated for economic gain or to intentionally deceive the public, and may cause public harm» [5]. In brief, it comprises materials that are both false and malicious, aimed at harming a protected value. The activities listed above thus serve as examples of subversive practices. The issue, however, arises as to their legality and qualification under international law. Can they serve as a basis for legal responsibility? If not, what is their current status – permissible actions, lawfare or nationally prohibited content-sharing? How are States combating such campaigns, and can these responses be considered proper ones?

Where violations end – misuses start

Intrusions into the electoral process take various forms – from fraudulent operations with ballots [6], blocking access to polling stations [7] or physical coercion to vote [8], to the distribution of distorted information about candidates [9], manipulations over changes of the election date [10] or disinformation campaigns on social media. Such malicious activities are conducted by both domestic and foreign actors. In the case of the latter, issues of sovereignty, the principle of non-intervention and breach of the right to a free vote reasonably arise. The main challenge here is to determine whether disinformation campaigns reach the threshold of intervention, or whether they are merely a misuse of law with no coercive effect [11], which triggers legal responsibility in very few cases, if any.

The concept of sovereignty reflects the State’s independence in its choices within a given territory, as set forth in the Island of Palmas arbitral award [12]. According to the Lotus and Nicaragua cases, it implies that a State «may not exercise [its] power in any form in the territory of another State» [13], which covers political, economic, social and cultural systems and the formulation of domestic and foreign policies [14]. The State’s choices at stake are intended to be «political and circumstantial, rather than exclusively or constantly of a legal character» [15]. Therefore, the choice of a political order, including via free and fair elections, falls within the ambit of sovereign powers. Any of the protected values may be externally influenced only on the basis of the State’s free consent [16], with knowledge of how such actions affect the well-being of its population [17]. In this regard, in Nicaragua the ICJ outlined coercion as a necessary element of intervention [18]. Accordingly, a breach arises from unconsented coercive interference of any form or attempted threats against the personality of the State [19], including direct or indirect assistance to subversive activities, interference in civil strife or in fields naturally considered to be within the State’s own discretion [20]. The notion of «fields within the State’s discretion» includes the processing of data from Internet domains [21] or computer systems [22]. The cyber domain, which according to the Tallinn Manual includes the physical, logical and social layers of cyber operations, is closely tied to the principle of State sovereignty [23]. Consequently, cyber interference can also be considered a breach of the non-intervention principle. But are all forms of cyber interference explicitly prohibited by international law?

Some scholars stress that coercive cyber operations should be distinguished from those that are merely influential or persuasive [24]. The Netherlands Ministry of Foreign Affairs has underlined that the goal of a cyber intervention must be to effect a change in the behaviour of the target State. Michael Schmitt has stressed that attempts to achieve regime change by manipulating public opinion ahead of elections constitute improper intervention [25]. In his works, however, he did not clarify which forms of manipulation are impermissible – otherwise a mere Facebook post containing a misleading statement about elections abroad might constitute a violation of international law (which is apparently not the aim of the existing legal framework). Current international regulation forbids only a limited set of activities in cyberspace, with none of the regulatory acts addressing disinformation. For instance, the Convention on Cybercrime outlaws physical data interference, content offences related to child pornography and copyright infringement, and interception of content data [26]. Similarly, the Tallinn Manual covers primarily the integrity of cyber infrastructure, such as cables, routers, servers or computers, and individuals acting within the cyber domain from a jurisdictional perspective [27]. The commentaries to the Manual evidence a lack of consensus on whether cases involving no physical damage qualify as violations of the non-intervention principle. Moreover, for content regulation the Manual refers to international human rights law, which itself has not yet developed a proper response to disinformation. As to propaganda, the International Group of Experts considered it to be in compliance with the principle of sovereignty [28]. However, Article 20 of the ICCPR requires States to ban any expression that might constitute incitement to war, discrimination or violence [29].
Therefore, disinformation might constitute unlawful intervention only if it has physical consequences, such as unrest, violent rallies or the commencement of armed activities. Otherwise, it complies with existing content limitations and cannot be qualified as a coercive practice, falling within the grey zone of regulatory mechanisms.

Likewise, there is no crystallised international regulation of non-interference in other States’ electoral processes by cyber means [30]. Although the Tallinn Manual might prescribe protection of critical electoral infrastructure from certain forms of physical interference, less coercive intrusion is not covered. The latter might involve the creation of deepfakes, support for the opposition via the republishing of its theses, and the like. In such cases the activities amount not to direct pressure, but to support of already existing opinions and narratives. By comparison, the Russian hacking of the Democratic National Committee and the subsequent leaking of sensitive documents during the 2016 US elections, as well as the hacking of e-mails during elections in Ukraine [31], suffice to constitute a violation. At the same time, coordinated inauthentic behaviour on Facebook hardly falls within the breach category. For example, a social media campaign possibly «organized in support of Iranian political interests», including Twitter accounts impersonating Republican candidates for the House of Representatives in 2018, could not be qualified as illegal in the absence of a proper regulatory framework [32]. Similar situations occurred during the 2017 French presidential election and the Brexit referendum [33]. Although falling outside any legal regulation, they nevertheless produced significant legal consequences, since the impact of these activities was considerable in both cases.

To summarise briefly, foreign disinformation campaigns are rarely qualified as violations of sovereignty, the non-intervention principle or the respective human rights. At the same time, they exert enormous influence on the political agendas of different States, on democratic processes and on the rule of law. The main reason for this is the absence of strict regulation, which enables States to misuse the law, stretch and distort legal concepts and hide behind the idea of legitimate contributions to political discussion. De facto, States resort to lawfare in order to conduct malicious activities and avoid responsibility. So which tools, effects and potential solutions do we face when analysing disinformation as a manifestation of lawfare?

Disinformation as a winning lawfare strategy

Lawfare is the misuse of existing legal norms by a State with the purpose of achieving an advantage in the political, military, social or any other dimension. Disinformation, distributed mostly via social media, directly exploits gaps in regulation, the absence of legal norms or the vagueness of legal concepts in order to deliver a «malicious» narrative to the intended public. One of today’s problems is the lack of international legal regulation of social media [34], which enables anyone to share misleading statements from any point on the world map. If a platform does not have the status of a legal entity in the State, sanction mechanisms are hardly applicable to it, while blocking is generally considered a disproportionate response [35]. As a result, anyone who desires, or is hired, to share misleading news can proceed with such activity without fear of being interrupted or prosecuted.

In this respect, it is interesting to analyse the State’s due diligence obligation to prevent illegal or malicious activities stemming from its territory [36]. For example, one might imagine a case in which the territory of State A is used by non-state actors to distribute electoral disinformation to State B via a platform registered in State C. According to Corfu Channel, it is «every State’s obligation not to allow knowingly its territory to be used for acts contrary to the rights of other States» [37]. But which State is responsible in the given example – State A or State C? Who must take steps to stop the alleged misuse of law, and is there any regulation empowering them to do so? The question becomes even more complicated once we bear in mind the possibility of remaining anonymous on the Internet, which keeps the location of the non-state actors relatively secret. Who will be responsible for locating the distributors of disinformation? Since no such obligation exists today, a clear violation will most probably be absent; nothing but pure good faith will guide the State towards detecting the «perpetrators». And if allowing such activities is favourable for the State, no liability will arise for ignoring disinformation campaigns conducted from its territory. As a result, the due diligence principle is not breached, while highly malicious activities continue to affect democratic processes in neighbouring countries. Similarly, a State might provide directions or instructions to non-state actors within or outside its territory aimed at intervening in a foreign electoral process. Attribution links will be very difficult, if not impossible, to establish both under the law of State responsibility and in practice.

Another important aspect, partly touched upon in the previous section, is that disinformation campaigns do not always reflect the intervening State’s own vision. Sometimes the shared narratives only amplify an already existing position [38], such as support for the views of the opposition, minorities or radicalised groups. Estimates show that 72% of Russian disinformation campaigns in 2017 targeted Israel, Saudi Arabia, the UK and the US, primarily supporting the electoral opposition [39]. In such cases the practices are not coercive or manipulative, but aimed at multiplying the amount of content and flooding the web with the desired narratives. Similarly, during the Spanish referendum several fake Russian social media accounts promoted support for declaring Catalan independence [40] by spreading ideas already circulating among Spanish citizens [41]. During the sadly famous Russian interference in the 2016 US elections, 7,000 pro-Trump accounts on various social media platforms were dissolved among national public pages, accounts, bots and profiles of real Trump supporters [42]. In addition, circumvention tools and anonymous messaging platforms, such as WhatsApp, Telegram and LINE, are very efficient for distributing disinformation, since users cannot be tracked or identified without interfering with their privacy (which requires a special court order and is more time-consuming for law enforcement). For instance, Russia outsourced part of its 2020 US presidential disinformation campaign to Ghanaian and Nigerian nationals, who were employed to generate and share content on social media [43]. In such circumstances, foreign disinformation is more difficult to detect, to combat and to trace to its origin [44]. Accordingly, its influence is more powerful, being perceived as a natural flow of social opinion. But which forms are used for the most effective sharing of malicious narratives?

Manipulations, alternative facts and post-truth are the most common manifestations of lawfare in the information sphere. For example, the state-owned Russian news agency Sputnik called the then French presidential candidate Emmanuel Macron an agent of «the big American banking system» [45] – not lying, but manipulating his links with US politicians. In a similar way, Chinese disinformation campaigns against Taiwan impersonated famous individuals and created fake personalities in order to deliver malicious narratives [46], distort the political agenda and even manipulate legal concepts. Other cases involve accusing the UK authorities of intruding into regional affairs [47], inflaming and subverting international relations. In addition, the range of disinformation tools includes the creation of deepfakes of candidates, making them «admit to egregious criminal behavior» [48]. If this seems to lack any legal implication, one might consider the consequences of such campaigns – namely, the candidate’s loss of the election, a rapid and radical change of the political regime, resort to violent activities in foreign policy and even the commission of international crimes.

As regards post-truth and alternative facts, the Dutch anti-disinformation campaigns during election periods are aimed at discrediting troll-manufactured videos, re-explaining the evidence-based connotations and providing the official version of the truth [49]. This is the most emblematic example of how disinformation, and lawfare in general, actually works. Since the information environment is filled with numerous visions of the same subject-matter, even the most reasonable reader might lose sight of the bottom line below which the manipulations start. This allows the exploitation of domestic discourses [50] and the substitution of «modern inventions» for existing legal concepts and their interpretations in the spheres of public international law and freedom of expression. In this respect, the Russian strategy of disinformation campaigns, according to Schmitt, involves working within the grey zones of law and stretching them even further [51]. One scholar has aptly labelled this asymmetrical lawfare, which extends far beyond a mere misperception of legal concepts by the audience.

As a consequence, social media lack appropriate guidance, since the borders between misinformation, real facts and malicious narratives are blurred and blended, causing confusion in the terminology, interpretation and legal qualification of given cases. For instance, moderators cease to understand which activities constitute election interference (and thus must be removed from the public domain) and which are merely manifestations of a person’s thought (even a misleading or mistaken one). Accordingly, platforms become a place for the distortion of law, namely lawfare and other malicious operations. Sometimes scholars only contribute to this state of confusion by producing numerous concepts – extracting information warfare from the scope of lawfare, turning fake news, false narratives, public opinion warfare and many other terms into entirely separate definitions [52]. On the one hand, such methodological division assists the legal assessment of informational lawfare and structures research. On the other hand, it fragments malicious activities and makes avoiding liability for them even easier than before. Hence, disinformation achieves its objective of decision-making paralysis [53], causing a shift in political positions. This success, in turn, leads to further use of disinformation practices. But are these the most dangerous consequences? Apparently not, since disinformation campaigns are not half as successful at changing political regimes as one might think. Yet they are successful at undermining trust in democratic institutions in general, which may lead to the collapse of the rule of law in the long term.

If the «threat of malign interference by foreign actors aimed at undermining electoral processes» [54] is perceived as real and credible, any outcome of an election might be considered to have been influenced from abroad. For instance, it has been established that approximately 2 million Chinese people distribute disinformation in an organised manner [55], aimed at changing the political agenda of neighbouring States on the eve of elections. Even if such a campaign had no impact on the election results, was conducted in another country or took place during a previous electoral period, a general distrust already exists in the State’s capacity to ensure a free and fair voting procedure. Consequently, any elected candidate or legal institution will be considered illegitimate, thus ruining the State from inside. In a similar way, any action by a newly elected parliament, for example, will be considered influenced from outside the country and reflecting a foreign, not a national, position [56]. An additional legal problem emerges when a winning democratic party is suspected of benefiting from foreign information interference. Here the best example might be the 2016 US elections, where almost half a year after the electoral campaign there was still a heated debate about the legitimacy of the US President [57]. At the same time, such Russian activities still have no coercive, but rather a supplementary and supporting, effect. Thus, no breach of international law occurs, but a misuse apparently takes place.

The result might be even worse where society is unwilling to accept the election results. This can lead to the delegitimisation of the elections held, causing a period of absence of legitimate authority. Moreover, calls to re-elect the President or parliament might take a violent and aggressive character, which makes further disinformation contributions, destabilisation and the exploitation of social distrust [58] much easier for foreign actors. The weakening of social cohesion, as a secondary effect of foreign election interference [59], can create preconditions for disturbances, discrimination against minorities and even the dissolution of the State. Nevertheless, it cannot be fully attributed to the State conducting the disinformation campaigns, since there might be no direct incitement.

Furthermore, filling the information environment with misleading information threatens the institution of journalism and the validity of the regulatory framework on freedom of expression, since it illustrates the State’s inability to control the spread of malicious content or to establish clear guidelines for its removal by social media. Another negative impact is the chilling effect on free speech [60] experienced by numerous lawfully acting speakers due to governmental attempts to suppress the endless flows of disinformation by any available method. For example, the distribution of pro-Russian propaganda and political narratives via VKontakte caused the blocking of this social network in Ukraine [61]. Leaving the necessity analysis aside, it is worth mentioning that many legitimate users lost their platform for sharing thoughts or trading. In some instances, such responses might even cause more harm to the target State, depicting it as an adherent of discriminatory practices.

As regards justifications for the given actions, the responsible States try to prove that the interference was conducted by non-state actors, that the causal links between the activities and the harm cannot be established, or that the operations do not reach the threshold of intervention [62]. Sometimes States also deny accusations by claiming that they are politically motivated [63], while the collected evidence is insufficient to qualify an act as an intervention. In this respect, China’s Foreign Ministry noted that the «Internet was full of theories that were hard to trace» [64], referring to the fact that websites and materials might be fake and the origin of the source unidentifiable. Another dangerous response is resort to authoritarian practices, which might serve as a ground for further accusations against the State. For example, to combat Chinese disinformation flows, Taiwan developed numerous laws on fake news, mis- and disinformation, false propaganda and the like [65]. In practice, those measures have proven both ineffective and contrary to international standards on freedom of expression [66]. Similarly, Internet shutdowns and other radical restrictions are considered a disproportionate response to foreign disinformation campaigns [67].

Hence, foreign interference in electoral processes is one of the best examples of lawfare, remaining either in the grey zone of legal regulation or amounting to a misuse of law. Moreover, it leads to visible legal effects and shifts in the political balance on the international arena, using all available tools to distort existing values and to diminish belief in democracy and the rule of law.


Disinformation is apparently not the most dangerous tool of lawfare in terms of the severity of its consequences. However, it can be unimaginably influential and tricky when it comes to detection and proof. Moreover, foreign intervention in an electoral process in the form of disinformation campaigns might be either a breach of international law or a manifestation of lawfare. It is therefore critically important to distinguish these cases and to draw a strict threshold for such differentiation. Otherwise, an undefined threshold will enable numerous actors to justify their interference by the absence of legal regulation and to proceed with similar malicious activities.

In addition, foreign disinformation campaigns have malicious consequences for the international legal order in general, being able to distort legal concepts, undermine the perception and importance of democratic institutions, and destabilise situations even in the most stable and influential States. However, irrespective of the kind and degree of negative effect, the response to such lawfare must not become resort to the same instrument. Alternative facts should not be combated with other alternative facts, and manipulations should not cause a pulling of the rope to the other side. Resorting to such practices would serve only as additional proof of the effectiveness of lawfare in the modern legal order, resulting in ever wider application of such practices worldwide.

[1] Ohlin J D, Election Interference: International Law and the Future of Democracy (CUP, 2020) 231

[2] Nicolas A C, ‘Taming the Trolls: The Need for an International Legal Framework to Regulate State Use of Disinformation on Social Media’ (2018) Vol 107 GLJO 36, 39

[3] O’Connor S, Hanson F, Currey E and Beattie T, ‘Cyber-enabled foreign interference in elections and referendums’ (2020) <https://www.aspi.org.au/report/cyber-enabled-foreign-interference-elections-and-referendums> accessed on 7 November 2021

[4] Weissmann A, ‘How to Prevent Foreign Election Interference’ (2020) <https://www.lawfareblog.com/how-prevent-foreign-election-interference> accessed on 7 November 2021

[5] Pielemeier J, ‘Disentangling Disinformation: What Makes Regulating Disinformation So Difficult?’ (2020) 2020 ULR 917

[6] Gabriel T and Wines M, ‘Trump Is Pushing a False Argument on Vote-by-Mail Fraud. Here Are the Facts.’ (2020) <https://www.nytimes.com/article/mail-in-voting-explained.html> accessed on 7 November 2021

[7] Corasaniti N and Saul S, ‘Trump Supporters Disrupt Early Voting in Virginia’ (2020) <https://www.nytimes.com/2020/09/19/us/politics/trump-supporters-early-voting-virginia.html> accessed on 7 November 2021

[8] Coynash H, ‘Crimean Tatars boycott Russia’s illegitimate elections in occupied Crimea’ (2016) <https://khpg.org/en/1474246406> accessed on 7 November 2021

[9] Salov v Ukraine App no 65518/01 (ECtHR, 6 September, 2005), paras 111-113

[10] Vandewalker I, ‘Digital Disinformation and Vote Suppression’ (2020) <https://www.brennancenter.org/our-work/research-reports/digital-disinformation-and-vote-suppression> accessed on 7 November 2021

[11] Schmitt M N, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (CUP, 2017) <https://cutt.ly/oTewb1m> accessed on 7 November 2021

[12] Island of Palmas (Netherlands, USA) (PCA, 4 April 1928) 829, 838

[13] Lotus case (France v Turkey) (1927) PCIJ Ser. A No. 10, para 45

[14] Military and Paramilitary Activities in and against Nicaragua (Nicaragua v United States of America) (1986) ICJ Rep 14, para 263

[15] Rajan M S, United Nations and World Politics: essays from a nonaligned perspective (New Delhi: Har-Anand Publ, 1995) 147

[16] Currie R J and Rikhof J, International and Transnational Criminal Law (Toronto: Irwin, 2013) 2nd ed, 390

[17] Memorial of Australia, Nuclear Tests (Australia v France) 1974 ICJ Plead 249, para 454

[18] Military and Paramilitary Activities in and against Nicaragua (Nicaragua v United States of America) (1986) ICJ Rep 14, para 205

[19] Declaration on Principles of International Law concerning Friendly Relations and Cooperation Among States, UNGA Res 2625, UN Doc A/1883 (1970)

[20] Conference on Security and Cooperation in Europe: Final Act (Helsinki, 1975), para IV; UNGA Res 2131(xx) of 21 December 1965

[21] Trudel P, ‘Jurisdiction over the Internet: A Canadian Perspective’ (1998) Vol 32 Int’l L. 1047

[22] US DoD, Office of GC, An Assessment of International Legal Issues in Information Operations (1999) 2nd ed CNAIL 459, 485

[23] Schmitt M N, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (CUP, 2017) <https://cutt.ly/oTewb1m> accessed on 7 November 2021

[24] Schmitt M N, ‘Foreign Cyber Interference in Elections’ (2021) Vol 92 Int’l L. Stud. 739, 786

[25] Ibid 793

[26] Convention on Cybercrime (signed 23 November 2001, effective 1 July 2004) ETS 185, Articles 4, 9, 10, 21

[27] Schmitt M N, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (CUP, 2017) <https://cutt.ly/oTewb1m> accessed on 7 November 2021

[28] Principles Governing the Use by States of Artificial Earth Satellites for International Direct Television Broadcasting, GA Res. 37/92, UN Doc. A/RES/37/92 (10 December 1982), paras 13–14

[29] ICCPR (adopted 16 December 1966, entered into force 23 March 1976) 999 UNTS 171, Article 20

[30] Torossian B, Fagliano L and Görder T, ‘Hybrid Conflict: Neither war, nor peace’ (2019-2020) <https://www.clingendael.org/pub/2019/strategic-monitor-2019-2020/hybrid-conflict/> accessed on 7 November 2021

[31] Ibid

[32] Kornbluh K, Goodman E P and Weiner E, ‘Safeguarding Democracy Against Disinformation’ (2020) <https://www.gmfus.org/news/safeguarding-democracy-against-disinformation> accessed on 7 November 2021

[33] Waldemarsson Ch, ‘Safeguarding Democracy Against Disinformation’ (ADF, 2020)

[34] Schmitt M N, ‘Foreign Cyber Interference in Elections: An International Law Primer’ (2021) <https://cutt.ly/7Tettrk> accessed on 7 November 2021

[35] Dvorovyi M, ‘Black Tuesday for website blocking, or how Strasbourg saves the Russian Internet (and ours along with it)’ (2020) <https://cutt.ly/iTerIWt> accessed on 7 November 2021

[36] Schmitt M N, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (CUP, 2017) <https://cutt.ly/oTewb1m> accessed on 7 November 2021

[37] Corfu Channel case (UK v Albania) (1949) ICJ Rep 4, 22

[38] Hanson F, O’Connor S, Walker M and Courtois L, Hacking democracies: Cataloguing cyber-enabled attacks on elections (2019) Policy Brief Rep No 16/2019, 13

[39] Goel A, Martin D A and Shapiro J N, ‘Managing and Mitigating Foreign Election Interference’ (2019) <https://www.lawfareblog.com/managing-and-mitigating-foreign-election-interference> accessed on 7 November 2021

[40] Parfitt T, ‘Catalonia crisis: Russia MEDDLED in referendum on breaking up Spain, spy chief claims’ (2018) <https://cutt.ly/ZTeaHKT> accessed on 7 November 2021

[41] O’Connor S, Hanson F, Currey E and Beattie T, ‘Cyber-enabled foreign interference in elections and referendums’ (2020) <https://www.aspi.org.au/report/cyber-enabled-foreign-interference-elections-and-referendums> accessed on 7 November 2021

[42] Weisburd A, Watts C and Berger J M, ‘Trolling for Trump: How Russia is Trying to Destroy Our Democracy’ (2016) < https://cutt.ly/lTeaMXO> accessed on 7 November 2021

[43] Ward C and Others, ‘Russian election meddling is back – via Ghana and Nigeria – and in your feeds’ (2020) <https://edition.cnn.com/2020/03/12/world/russia-ghana-troll-farms-2020-ward/index.html> accessed on 7 November 2021

[44] Polyakova A, Fried D, Democratic Offence Against Disinformation (CEPA, 2020) 8

[45] Minelle B, ‘French presidential candidate Macron target of Russian “fake news”, his party chief claims’ (2017) <https://cutt.ly/kTesdc9> accessed on 7 November 2021

[46] Reinl J, ‘“Fake news” rattles Taiwan ahead of elections’ (2018) <https://cutt.ly/wTefnRX> accessed on 7 November 2021

[47] Hanson F, O’Connor S, Walker M and Courtois L, Hacking democracies: Cataloguing cyber-enabled attacks on elections (2019) Policy Brief Rep No 16/2019, 13

[48] Schmitt M N, ‘Foreign Cyber Interference in Elections’ (2021) Vol 92 Int’l L. Stud. 739, 750

[49] Brattberg E and Mauer T, ‘Russian Election Interference: Europe’s Counter to Fake News and Cyber Attacks’ (2018) <https://cutt.ly/VTesQIK> accessed on 7 November 2021

[50]  Henschke A, Sussex M and O’Connor C, ‘Countering foreign interference: election integrity lessons for liberal democracies’ (2020) <https://www.tandfonline.com/doi/full/10.1080/23738871.2020.1797136> accessed on 7 November 2021

[51] Barela S J, ‘Zero Shades of Grey: Russian-Ops Violate International Law’ (2018) <https://www.justsecurity.org/54340/shades-grey-russian-ops-violate-international-law/> accessed on 7 November 2021

[52] Gershaneck K, ‘To Win without Fighting. Defining China’s Political Warfare’ (2020) <https://cutt.ly/TTesAE2> accessed on 7 November 2021

[53] Wheatley S, ‘Foreign interference in elections under the non-intervention principle: we need to talk about “coercion”’ (2019) Vol 31 Duke Journal of Comparative & International Law 161, 193-194

[54] Broeders D K, ‘The (im)possibilities of addressing election interference and the public core of the internet in the UN GGE and OEWG: a mid-process assessment’ (2021) <https://cutt.ly/3TesNn3> accessed on 7 November 2021

[55] Gershaneck K, ‘To Win without Fighting. Defining China’s Political Warfare’ (2020) <https://cutt.ly/TTesAE2> accessed on 7 November 2021

[56] Cheng D, ‘Winning Without Fighting: Chinese Legal Warfare’ (2012) <https://cutt.ly/fTefbDl> accessed on 7 November 2021

[57] Yourish K, Griggs T, ‘8 US intelligence groups blame Russia for meddling, but Trump keeps clouding the picture’ (2018) <https://cutt.ly/3Tes8Z2> accessed on 7 November 2021

[58] Hanson F, O’Connor S, Walker M and Courtois L, Hacking democracies: Cataloguing cyber-enabled attacks on elections (2019) Policy Brief Rep No 16/2019, 17

[59] Goldstein J A, DiResta R, ‘Foreign Influence Operations and the 2020 Election: Framing the Debate’ (2020) <https://www.lawfareblog.com/foreign-influence-operations-and-2020-election-framing-debate> accessed on 7 November 2021

[60] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan (13 April 2021) UN Doc A/HRC/47/25, para 24

[61] Article 19, ‘Ukraine: Blocking of Russian social media sites is a severe form of censorship’ (2017) <https://cutt.ly/qTeda19> accessed on 7 November 2021

[62] Schmitt M N, Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (CUP, 2017) <https://cutt.ly/oTewb1m> accessed on 7 November 2021

[63] O’Connor S, Hanson F, Currey E and Beattie T, ‘Cyber-enabled foreign interference in elections and referendums’ (2020) <https://www.aspi.org.au/report/cyber-enabled-foreign-interference-elections-and-referendums> accessed on 7 November 2021

[64] Packham C, ‘Exclusive: Australia concluded China was behind hack on parliament, political parties – sources’ (2019) <https://www.reuters.com/article/us-australia-china-cyber-exclusive-idUSKBN1W00VF> accessed on 7 November 2021

[65] Quirk S, Lawfare in the Disinformation Age: Chinese Interference in Taiwan’s 2020 Elections (2021) Vol 62, No 2 Harvard International Law Journal 525, 545-546

[66] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan (13 April 2021) UN Doc A/HRC/47/25, paras 50, 68, 78

[67] UN High Commissioner for Human Rights, Press briefing notes on Myanmar (27 October 2020) <https://cutt.ly/XTednn6> accessed on 7 November 2021