Compatibility of Social Media Blocking with European Standards on Freedom of Expression

November 29, 2021

In his concurring opinion in Ahmet Yıldırım v Turkey, Judge Pinto de Albuquerque formulated a fundamental rule: “blocking access to the Internet, or parts of the Internet, for whole populations or segments of the public can never be justified”, regardless of the justification offered [1]. Over the years this statement has become a guideline for most international bodies dealing with Internet shutdowns. Although blocking access to the Internet as such is considered impermissible in democratic societies, worldwide practice has changed drastically with respect to targeted blocking policies, such as the removal of applications from the App Store and Google Play or the delisting of webpages from search engines’ results. The same trend covers account blocking, addressed in the recent Kablis v Russia judgment, where the ECtHR insisted on the prohibition of collateral damage and on the necessity to tailor restrictions to the offence committed [2].

However, all those examples are hardly comparable to website blocking, given their localised effect: fewer people suffer from the blocking of an account than from the inaccessibility of a whole platform. Back in 2003, this was the main reason for developing the Amsterdam Recommendations [3], whose authors insisted on prohibiting any blocking measure except targeted disabling of content. Nevertheless, rapid technological change has produced new ways of misusing the Internet, prompting numerous States to resort to blocking entire websites. Most examples have been recorded in non-democratic or quasi-democratic regimes: the Indian government ordered the blocking of 27 websites [4], Belarus disabled access to more than 50 online media outlets [5], Iran restricted nearly 35% of the world’s most visited websites [6], while China runs a well-developed, large-scale system of website shutdowns [7]. Yet more democratic States, including Australia [8], Japan and the UK [9], have also considered interim or permanent blocking an effective remedy against manifestly illegal speech.

This research reviews episodes of social media blocking by States, leaving aside voluntary content restrictions implemented by ISPs, by hosting services or at the self-regulatory level. According to Association Ekin v France [10] and Ahmet Yıldırım v Turkey [11], regardless of whether the blocking measure is interim or permanent, the classic three-fold test developed in the ECtHR’s jurisprudence must be met: the limitation must be based on an explicit legal framework with appropriate safeguards, pursue a legitimate purpose and be necessary in a democratic society. Each of these requirements, however, carries legal pitfalls to be addressed when assessing the compatibility of social media blocking measures with the ECHR.

Legal basis for the restriction

The legality standard implies the existence of a basis for the restriction in domestic law [12], one which does not confer unfettered discretion on those charged with its execution [13] and which is sufficiently predictable as to the prohibited activities. Absolute certainty of the law is undesirable [14], as it leads to excessive rigidity [15] and to an inability “to keep pace with changing circumstances” [16]. As the Court noted in Müller and Others v Switzerland, many laws are couched in terms which are “to some extent vague and whose interpretation and application are questions of practice” [17]. Thus, general laws can in theory serve as a proper basis for limitations, while the conditions and procedures governing them can be detailed in other acts, including enactments of lower rank [18] or international standards incorporated into the domestic legal framework [19]. This does not, however, relieve States of the responsibility to ensure that legislation and by-laws are sufficiently foreseeable. In this respect, overly vague, unclear or too general laws fail the “provided by law” standard, being unpredictable [20].

These standards are important for social media blocking, since most laws in this area are rather general. This is explained either by the State’s unwillingness to introduce more precise laws or by peculiarities of its legal tradition, as in the UK [21]. Whatever the reason, it results in websites being blocked on the basis of content-restricting rather than procedural or sanctions laws. The best-known grounds for such restrictions include combating child pornography, malware, investment fraud, online gambling, obscene materials, prostitution, terrorism and Nazi propaganda [22]. The problem arises, however, where the owners of social media cannot foresee what kind of penalty applies: the sanctions provided involve fines, imprisonment or removal of unlawful materials, while blocking itself is rarely mentioned. As a result, the ECtHR has already built up quite extensive case law addressing breaches of the “prescribed by law” standard.

In Yıldırım v Turkey, the ECtHR applied a traditional categorical balancing approach, stating that any limitation requires a strict legal basis. In that case Turkey had decided to block all Google Sites, which made the applicant’s own resource inaccessible. Under the legality requirement, the ECtHR analysed the provision that “a judge may order the blocking of access to Internet publications where there are sufficient grounds to suspect that their content is such as to amount to … offences” [23]. The Court concluded that although prior restraints are not incompatible with the ECHR in principle, the formulations of domestic laws must be precise and foreseeable; since this provision did not enable individuals to predict the applicability of wholesale blocking, it violated the legality standard. A similar conclusion was reached in Cengiz and Others v Turkey, concerning the wholesale blocking of YouTube over the publication of several videos insulting the memory of Atatürk. In particular, the Court underlined the absence of a legal basis for the blocking, adding that even if such a restriction had existed, it would have violated the requirement of “quality of law”, because national authorities are not allowed to write the possibility of collateral blocking into domestic law [24].

In parallel with blocking for domestically prohibited content, the ECtHR has reviewed two cases on restricting access to websites over breaches of IP law – Akdeniz v Turkey and Neij and Sunde Kolmisoppi v Sweden. The first is an admissibility decision on the blocking of the music websites myspace.com and last.fm for alleged breaches of IP law by publishers. The ECtHR concluded that the listeners, unlike the publishers of the music, did not possess victim status [25], although it still addressed the improperness of collateral blocking [26]. In the second case, the ECtHR declared the application concerning the blocking of The Pirate Bay manifestly ill-founded [27]: individuals cannot complain of a violation while themselves violating the rights of others, and a general provision prohibiting IP law violations suffices for a blocking measure. A similar position is maintained by the CJEU, which in UPC Telekabel Wien [28] stressed the necessity of a sufficient legal basis for the restriction, yet was satisfied with a relatively general norm of the InfoSoc Directive.

In 2020 the ECtHR brought some clarity to the rules governing website blocking: in a line of four judgments against Russia it developed a general guideline on safeguards under the “provided by law” standard. In Kharitonov v Russia, for instance, the applicant’s Electronic Publishing News website was blocked as an automatic consequence of a blocking order against another website sharing the same IP address. The legal basis was the Law on Information, which prescribed the possibility of wholesale blocking of websites regardless of the quantity of illegal material on them. The ECtHR found a breach of the quality-of-law requirement, since the law left too much discretion to the domestic authorities [29]. Moreover, Roskomnadzor updated the list of websites to be blocked without checking which other resources shared the same IP address, leaving the scheme without any safeguard against abuse [30]. Finally, the Court held that the absence of any judicial review posed an additional threat to the rights of website owners [31].
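The collateral mechanism at issue in Kharitonov is easy to demonstrate. Below is a minimal, purely illustrative Python sketch – all domain names and the address are hypothetical – showing why a blocklist keyed to IP addresses inevitably sweeps in unrelated resources hosted at the same address, whereas a URL-level order touches only the offending resource:

    # Several unrelated sites served from one shared-hosting address
    # (hypothetical names; real shared hosting and CDNs work the same way).
    dns_records = {
        "offending-site.example": "203.0.113.7",
        "publishing-news.example": "203.0.113.7",   # lawful, same IP
        "unrelated-blog.example": "203.0.113.7",    # lawful, same IP
    }

    ip_blocklist = set()
    url_blocklist = set()

    def block_by_ip(domain: str) -> None:
        # Registry-style entry: the whole IP address goes on the blocklist.
        ip_blocklist.add(dns_records[domain])

    def block_by_url(domain: str) -> None:
        # Targeted entry: only the offending resource itself is listed.
        url_blocklist.add(domain)

    block_by_ip("offending-site.example")
    collateral = [d for d, ip in dns_records.items() if ip in ip_blocklist]
    print(collateral)   # all three domains, two of them entirely lawful

    block_by_url("offending-site.example")
    print(sorted(url_blocklist))   # ['offending-site.example'] only

Unless the blocking authority re-checks the DNS records before each entry – which, as the Court noted, Roskomnadzor did not – every lawful site behind the shared address goes dark together with the target.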

Similar reasons were advanced in Bulgakov v Russia, where the participation of the Internet provider in the court proceedings did not suffice for proper consideration of the restrictive measures, since providers are not aware of the substance of the websites they carry [32]. The Court also condemned the refusal of the domestic courts to lift the blocking order after the unlawful information had been removed [33]. In the other cases, Engels v Russia and OOO Flavus v Russia, the ECtHR underlined the failure of the domestic courts to balance the competing interests or to consider any safeguard against abusive practices by the executive authorities. In Engels v Russia it stated that “as a printing press can be used to print anything from a school textbook to an extremist pamphlet, the Internet preserves and makes available a wealth of information”, some arrays of which might be illegal [34]. By proclaiming this, the Court reinforced its position on the prohibition of collateral blocking and of its crystallisation in domestic laws.

In brief, the ECtHR’s requirements under the “prescribed by law” standard in website blocking cases comprise five criteria: safeguards against prior restraints, safeguards against collateral effect, procedural guarantees, balancing of interests, and transparency (foreseeability) of blocking mechanisms. It is impossible to disclose the technical mechanism of blocking in full, since that would open space for circumventing the limitations [35]. It is important, however, to describe the legal circumstances in which a resource may be blocked, reserving such a measure for absolutely exceptional circumstances (as in The Pirate Bay case). Consequently, the whole body of case law on website blocking requires consideration not merely of “any legislation” prescribing a limitation, but also of its quality. Numerous domestic laws accordingly need reconsideration: for example, Polish intelligence agencies are granted the right to block websites for five days without court authorisation, while French law legitimises website blocking to combat terrorist propaganda and child pornography [36].

Similar problems exist in Ukraine, where in recent years numerous websites have been blocked on various grounds. The only baseline in domestic law providing for the possibility of restricting access to resources is the Law on Telecommunications, which, on the basis of a court decision, obliges telecommunications operators to block resources sharing child pornography [37]. Meanwhile, the heavily criticised [38] blocking of websites under the Law on Sanctions rests on no ground for blocking Internet resources at all [39]: in applying such sanctions the State relies on the notion of “other measures”. It is worth recalling the findings in Kovach v Ukraine, where the ECtHR held broad interpretations of legal provisions to be unforeseeable [40]. There, the Parliamentary Elections Act contained the term “other circumstances”, with no practice explaining its meaning [41] – much the same wording as in the Law on Sanctions. Consequently, there are strong grounds to believe that the first case reaching the ECtHR on the quality of this legislative act might succeed for the applicants. Among the blocked platforms, the social media Vkontakte and Odnoklassniki can be mentioned alongside other less famous resources. The restriction was imposed via Decrees of the President of Ukraine, without the matter ever reaching any judicial authority, which also breaches the ECtHR’s requirements. Research by Digital Security Lab Ukraine provides an extensive analysis of the bouquet of mistakes made by the domestic courts [42]. The most striking include the arrest of the “intellectual property rights that arise from Internet users when using a web resource” as a reasoning for blocking (one such case has already been communicated to the ECtHR) [43], and sanctions imposed on the deceased leader of the so-called DPR, Oleksandr Zakharchenko [44]. Thus, the current practice of social media blocking in Ukraine contravenes the “provided by law” standard.

Legitimate aim: presence of threat and its character

The legitimate aim of a restriction embodies the purpose of limiting rights. It is present where a threat to protected values is grave enough to significantly undermine those rights. One will rarely find a case with no legitimate aim for the restriction: safeguarding the rights of others, including religious, social or ethnic groups, voters or private individuals, and protecting public order and national security are the most common grounds for limitations. As follows from Féret v Belgium, aggressive statements carry an “inevitable risk of arousing feelings of hatred” towards particular communities [45]. According to Salov v Ukraine, States are empowered to protect voters from untrue information [46]. Risks of social unrest accompanied by criminal activities likewise destabilise public order [47]. In those cases the ECtHR assessed whether the threat was grave enough to endanger the values at stake, demonstrating that limitations often pursue legitimate purposes.

However, as OOO Flavus v Russia shows, often does not mean always. In that case the owners of the websites grani.ru, ej.ru and kasparov.ru shared information about unauthorised protests and the scandalous events preceding those demonstrations. The Prosecutor requested that Roskomnadzor block the websites, citing the allegedly extremist nature of the materials, and the request was granted. The ECtHR stressed the absence of a legal basis for the blocking, since the law was overbroad and imprecise [48], and underlined that incitement to protest was not among the types of incitement prohibited by domestic law [49]. The most interesting part, however, concerned the legitimate aim, of which the ECtHR found none: “even if there were exceptional circumstances justifying the blocking of illegal content, a measure blocking access to an entire website has to be justified on its own, separately and distinctly” [50]. The Court thereby distinguished the approach to targeted blocking from that to wholesale restriction of access to resources. The threat stemming from the platform itself must be of such gravity that all of its materials, not merely a part of them, undermine the protected values; otherwise, the legitimate aim is absent.

As regards the Ukrainian situation, it was apparently possible to justify restricting particular groups or accounts that shared propaganda of war and threatened national security and the human rights of Ukrainian citizens. Restricting access to an entire resource, however, could not be justified by the same arguments: it requires more comprehensive reasoning showing the extent and gravity of the threat, together with the inability to resort to more targeted measures. The issues of effectiveness and of the availability of alternatives, though, fall to be reviewed under the necessity criterion.

Standard of necessity – the last resort criterion

The necessity standard requires consideration of a pressing social need and of the proportionality of the limitation. To block an entire website, the State has to prove that the scope, duration and reasoning behind the restriction correspond to its gravity, and that a balance is struck between the protected and the restricted rights. The ECtHR has never addressed this criterion in blocking cases, having been satisfied with the absence of a legal basis or of a legitimate purpose. This part of the research therefore focuses primarily on domestic practices and the CJEU’s approach.

Prior restraints and permanent measures must be as narrowly targeted as possible to preserve rights, being justified only when a platform is entirely filled with illegal materials; otherwise there is an apparent collateral effect upon lawful information [51]. According to Super Cassetes Industries v Myspace, a general injunction was impertinent due to the unspecified number and kind of infringements against which protection was sought [52]. A similar approach has been maintained in Germany [53], Denmark [54] and Britain [55]. Moreover, suspending an entire platform in a country might “punish unprotected speech rather than hinder any possibly safeguarded one” [56]. In finding the application of an interim injunction unjustified in Cumhuriyet Vakfı, the ECtHR took account of the absence of a specific time-limit [57]. The Manila Principles likewise recognise periodic review as vital to ensuring the continued validity of a blocking order [58]. Since news is a perishable commodity, postponing its publication impairs its value [59]. Domestic courts are thus required to establish the amount of unlawful material and set a proportionate duration for the restriction: limitations cannot run for an unpredictable period, nor can they be based on the presence of a few illegal publications – just as there is no need to ban a whole newspaper over a couple of unlawful articles.

As regards the reasoning, any State bears the obligation “to demonstrate in specific fashion the precise nature of the threat” to protected values [60]. The balance between rights implies consideration of the potential collateral effect. In Akdeniz v Turkey the blocking of music resources extended its effects to the users of the websites, and not only the publishers, yet the impact was negated by the availability of alternative venues. Other social networks, by contrast, are often unable to cover the same scope of material [61] or to be similarly content-oriented [62]. Such restrictions on social media have been heavily criticised by Special Rapporteurs [63] and human rights organisations [64]. Given the platforms’ nature as vehicles for debating politics and engaging with elected representatives [65], it is essential to keep networks accessible during election periods.

On the other hand, in Myanmar Facebook became a “useful instrument for those seeking to spread hate” [66]: having failed to properly regulate the platform’s environment, the government faced massive manifestations of violence, culminating in an alleged genocide. Similarly, in Germany Facebook was found to have fuelled anti-refugee attacks through its slow and ineffective response [67], while a large array of misleading materials unlawfully tampered with the 2016 US elections [68]. In the Ukrainian situation, Vkontakte and Odnoklassniki contributed significantly to the stirring of ethnic hatred, incitement to violence and disinformation during election periods. Since people are interested neither in being misled nor in being subjected to hatred, the restriction might theoretically be considered to satisfy the pressing social need requirement. The Decrees of the President of Ukraine set a time-frame for this particular limitation (although some other measures were applied for an unlimited time, which is incompatible with the European standards). The main problem, however, lies with the array of illegal materials and its (in)sufficiency to justify a blocking measure. A more detailed analysis is needed here, and the State must also prove the impossibility of restricting such amounts of unlawful information in any other way – for instance, through targeted blocking of groups or accounts, or blocking of individuals based on their IP addresses. These issues fall to be addressed when assessing the proportionality of the restriction.

The consideration of proportionality requires dwelling upon the criteria outlined by the CJEU [69], namely the availability of alternative measures, the efficiency of the imposed ones and their impact. As regards the first criterion, given that monitoring “a specific site during a given period of time to prevent specific tortious activity” is compatible with international practice [70], social media can be directed by court order to block access to particular types of illegal material [71]. Moreover, social media usually provide for the suspension or termination of accounts filled with infringing content. As Facebook did ahead of the 2018 US midterm elections, platforms could be directed to block particular accounts engaged in “coordinated inauthentic behaviour” [72]. Likewise, the removal of more than 25,000 publications on Facebook, YouTube, Instagram and Twitter following the respective order in PepsiCo India v Facebook and Others [73] confirmed intermediaries’ ability to take control over massive volumes of content. At the same time, according to J19 and J20 v Facebook Ireland, it is impossible to control the entire volume of material [74] through the established model of functioning. Since automatic word-based filtering systems usually cause a spillover effect [75], being unable to distinguish between lawful and illegal content [76] (a point illustrated in the sketch below), this procedure was likewise inapplicable. Therefore, even massive targeted removal of publications requires two conditions to be satisfied: defining the precise malicious publications and the readiness of social media to cooperate.
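Why word-based filtering over-blocks is easy to see in miniature. The following Python sketch – a deliberately naive filter with a hypothetical blocklist, not a description of any platform’s actual system – flags every post containing a banned word, and thereby sweeps up reporting and scholarship together with genuine incitement:

    import re

    BANNED_WORDS = {"attack", "riot"}   # hypothetical blocklist

    def naive_filter(post: str) -> bool:
        # Flag a post if it contains any banned word, ignoring all context.
        words = set(re.findall(r"[a-z]+", post.lower()))
        return bool(words & BANNED_WORDS)

    posts = [
        "Join the riot tonight and attack the station!",   # unlawful incitement
        "Police dispersed the riot; no one was hurt.",     # lawful news report
        "Historians still debate the 1905 attack.",        # lawful scholarship
    ]

    for post in posts:
        print(naive_filter(post), "-", post)
    # Prints True for all three posts: two lawful ones are swept up
    # alongside the single unlawful one.

Context-blind matching of this kind is the spillover effect in its simplest form; real filters are more sophisticated, but the underlying inability to read context persists.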

It is well accepted that measures amounting to over-blocking are to be considered ineffective [77]. For instance, merely 6% of YouTube content is potentially illegal [78], which makes wholesale blocking inappropriate. Moreover, blocked content remains available from alternative outlets [79] or can easily be accessed through various circumvention tools, which further proves the ineffectiveness of such measures. On the other hand, targeted blocking policies are also ineffective, given the speedy reappearance and easy circumvention of filtered content [80], especially when the quantity of illegal materials amounts to a substantial tort [81]. There is thus an outstanding need for a balancing exercise analysing the circumstances of the intrusion: if the speed of removing unlawful speech prevails, as in the Myanmar case, temporary blocking might be a proper solution; by contrast, where it is possible to block a single individual, as in Trump’s situation, more intrusive measures are disproportionate.

Finally, the impact of the measure on the owners of social media must likewise be considered. Relying on Delfi in assessing the consequences for the intermediary, States must establish whether the interference might have an obviously detrimental effect on the intermediary’s business model [82]. In A&M Records v Napster, for example, the continued unavailability of the network drove its users to other peer-to-peer websites [83], which was destructive for the platform’s business. Moreover, constant risks of suspension push the intermediary towards self-regulation [84] and thus into overbroad privatised censorship that “goes beyond the ethos of established free speech regimes” [85].

In the Ukrainian situation, the blocking of Russian social media platforms was relatively effective, since the usage of those platforms decreased significantly throughout the blocking period [86]. Although the argument about circumvention tools applies here too, in such a situation no measure restricting rights is perfect. The main mistake, however, was the absence of any attempt to officially require the platforms to cooperate and voluntarily block prohibited content in Ukraine. Even if such efforts would most probably have been fruitless, they would still have demonstrated the good faith of the Ukrainian authorities and the exhaustion of all reasonable less intrusive alternatives before resorting to blocking. There is also no data on financial losses suffered by Vkontakte; it still operates today, showing that its business model did not suffer.

Conclusions

States resorting to blocking measures must bear in mind their excessive intrusiveness into human rights, which is especially evident in cases of social media blocking. Even where there is a compelling public need to intervene in the operation of intermediaries, the State remains under the obligation to follow the three-fold test: the restriction must be prescribed by law, pursue a legitimate purpose and be necessary in a democratic society.

Thus, domestic law must explicitly provide for the possibility of blocking websites for a certain period of time and ensure that such a measure is applied solely on the basis of a court order. Beyond the mere existence of this baseline in domestic law, States must provide essential procedural and substantive safeguards against abuse, in line with the ECtHR’s recent practice. Likewise, when imposing restrictions, States should bear in mind that the reasoning for targeted and for wholesale blocking of social media cannot be the same for the purposes of the legitimate aim assessment. Regrettably, the ECtHR has not yet proceeded to assess the necessity criterion, so no complex test has been developed, although the requirement has been partly addressed in domestic jurisdictions and in the CJEU’s practice. In particular, necessity demands more complex and comprehensive balancing exercises covering the scope and duration of the restriction, the readiness of the platform to cooperate, the effectiveness of the available alternatives and the impact of the applied measure on the service’s capacity to operate.

Moreover, despite the relatively comprehensive guideline developed in the cases against Russia, the ECtHR has received several new applications on the same matter from Russian citizens. The Court will also have an opportunity to dwell upon the necessity issues in the pending cases of Akdeniz and Altıparmak v Turkey (no. 1) and Akdeniz and Altıparmak v Turkey (no. 2), concerning the blocking of 600 websites and of 111 items of website content respectively [87]. There is thus hope that the new applications will finally force Russia to implement the ECtHR’s requirements in its domestic framework, while the Turkish cases will give other Council of Europe States a review of the necessity of blocking alongside the assessment of the legislative framework.

[1] Ahmet Yıldırım v Turkey App no 3111/10 (ECtHR, 18 March 2013), Concurring Opinion of Judge Pinto de Albuquerque

[2] Kablis v Russia App nos 48310/16 and 59663/17 (ECtHR, 30 April 2019) para 94

[3] OSCE, Amsterdam Recommendations on Freedom of the Media and the Internet (17 June 2003)

[4] Mihindukulasuriya R, ‘These are the apps and websites Modi govt blocked in 2020’ (2021) <https://cutt.ly/STDskBa> accessed 22 November 2021

[5] Karmanau Y, ‘Belarus blocks over 50 news websites but protests continue’ (2020) <https://cutt.ly/rTDsfgy> accessed 22 November 2021

[6] ’35 Percent of World’s Most Visited Websites Are Blocked in Iran’ (2019) <https://cutt.ly/oTDsFiF> accessed 22 November 2021

[7] Faris R, Roberts H and Wang S, ‘China’s Green Dam: The Implications of Government Control Encroaching on the Home PC’ (2009) <https://cutt.ly/CTDdqzx> accessed 22 November 2021

[8] Jacobs C, ‘Independent’s Day and the Censorwall’ (2010) <https://cutt.ly/vTDdCf9> accessed 22 November 2021

[9] Schlesinger M, Site Blocking Global Best Practices (2018) <https://cutt.ly/CTDXGW3> accessed 22 November 2021

[10] Association Ekin v France App no 39288/98 (ECtHR, 17 July 2001)

[11] Supra 1

[12] Margareta and Roger Andersson v Sweden App no 12963/87 (ECtHR, 25 February 1992), para 75

[13] The Sunday Times v United Kingdom App no 6538/74 (ECtHR, 26 April 1979), para 47

[14] Silver and Others v United Kingdom App no 7136/75 (ECtHR, 25 March 1983), para 88; Perinçek v Switzerland App no 27510/08 (ECtHR, 15 October 2015), para 133

[15] Magyar Kétfarkú Kutya Párt v Hungary App no 201/17 (ECtHR, 20 January 2020), para 94

[16] The Sunday Times v United Kingdom App no 6538/74 (ECtHR, 26 April 1979), para 49

[17] Müller and Others v Switzerland App no 10737/84 (ECtHR, 24 May 1988), para 29

[18] Selahattin Demirtaş v Turkey App no 14305/17 (ECtHR, 22 December 2020), para 250

[19] Groppera Radio AG and Others v Switzerland App no 10890/84 (ECtHR, 28 March 1990), para 68

[20] Unifaun Theatre Productions Limited and Others v Malta App no 37326/13 (ECtHR, 15 May 2018), para 84

[21] Council of Europe, Study on filtering, blocking and takedown of illegal content on the Internet (2016) <https://cutt.ly/yTDX67n> accessed 22 November 2021

[22] Supra 9, 2

[23] Supra 1, para 61

[24] Cengiz and Others v Turkey App nos 48226/10 and 14027/11 (ECtHR, 1 December 2015), para 64

[25] Yaman Akdeniz v Turkey App no 20877/10 (ECtHR, 11 March 2014), paras 26-27, 29

[26] Ibid, para 28

[27] Neij and Sunde Kolmisoppi v Sweden App no 40397/12 (ECtHR, 19 February 2013)

[28] C-314/12 UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH and Others [2014] OJ C151/2

[29] Vladimir Kharitonov v Russia App no 10795/14 (ECtHR, 23 June 2020), para 38

[30] Ibid, paras 40-41

[31] Ibid, para 43

[32] Bulgakov v Russia App no 20159/15 (ECtHR, 23 June 2020), para 36

[33] Ibid, para 38

[34] Engels v Russia App no 61919/16 (ECtHR, 23 June 2020), para 30

[35] “Regardless of Frontiers:” The International Right to Freedom of Expression in the Digital Age (Center for Democracy and Technology, April 2011), 49

[36] Muižnieks N, Arbitrary Internet blocking jeopardises freedom of expression (26 September 2017) <https://cutt.ly/cTD0UyS> accessed 22 November 2021

[37] Law of Ukraine on Telecommunications №1280-IV (18 November 2003), Article 39

[38] Public statement: NGOs call on the President of Ukraine and the National Security and Defense Council to ensure legality and transparency in the application of sanctions to Internet resources (14 April 2020) <https://cutt.ly/qTD0fuX> accessed 22 November 2021

[39] Law of Ukraine on Sanctions №1644-VII (14 August 2014), Article 4

[40] Kovach v Ukraine App no 39424/02 (ECtHR, 7 February 2008), paras 48-62

[41] Ibid, paras 57-58

[42] Dvorovyi M, Sanctions and Blocking of Websites in Ukraine: how to open the Pandora box unnoticed (DSLU, 2021)

[43] Statement of Platform of Human Rights on Facebook <https://cutt.ly/fTDMh45> accessed 22 November 2021

[44] Decree of the President of Ukraine №203/2021 (21 May 2021) “On application, cancellation and changes to personalized economic and other restrictive measures”

[45] Féret v Belgium App no 15615/07 (ECtHR, 16 July 2009), para 69

[46] Salov v Ukraine App no 65518/01 (ECtHR, 6 September 2005), para 110

[47] Perinçek v Switzerland App no 27510/08 (ECtHR, 15 October 2015), para 141

[48] OOO Flavus and Others v Russia App nos 12468/15, 23489/15 and 19074/16 (ECtHR, 23 June 2020), paras 33-34

[49] Ibid, para 35

[50] Ibid, para 38

[51] Supra 1, para 67

[52] Super Cassetes Industries Ltd v Myspace Inc. & Another AIR 2011 Del 2682

[53] BGH I ZR 174/14 (2015), para 89

[54] Case no A38-14 (Copenhagen Handelsretten, 11 December 2014)

[55] Twentieth Century Fox Film Corp & Others v British Telecommunications Plc [2011] EWHC 1981, paras 155, 186

[56] Kinney v Barnes 443 S.W.3d 87 (Tex. 2014)

[57] Cumhuriyet Vakfı and Others v Turkey App no 28255/07 (ECtHR, 8 October 2013), para 64

[58] A Global Civil Society Initiative, Manila Principles on Intermediary Liability <https://www.manilaprinciples.org/principles> accessed 22 November 2021, Principle IV

[59] Supra 60, para 65

[60] General comment no 34, Article 19 (12 September 2011) CCPR/C/GC/34, para 36

[61] Supra 26, paras 51-52

[62] Khurshid Mustafa and Tarzibachi v Sweden App no 23883/06 (ECtHR, 16 December 2008), para 44

[63] India must restore internet and social media networks in Jammu and Kashmir, say UN rights experts (2019) <https://cutt.ly/sTFwZDk> accessed 22 November 2021

[64] Article 19, ‘Malaysia: Blocking websites to prevent protest violates international law’ (2015) <https://cutt.ly/zTD7wLt> accessed 22 November 2021; Amnesty International, ‘HR Defenders under Threat – A Shrinking Space for Civil Society’ (2017) <https://cutt.ly/rTD77PY> accessed 22 November 2021

[65] Grutzmacher v Howard County 851 F.3d 332 (4th Cir. 2017)

[66] Report of the Independent International Fact-finding Mission on Myanmar, A/HRC/39/64 (27 August 2018), para 74

[67] The New York Times, ‘Facebook Fueled Anti-Refugee Attacks in Germany’ (2018) <https://cutt.ly/nTD6Mzc> accessed 22 November 2021

[68] BBC News, ‘Facebook bans pages aimed at US election interference’ (2018) <https://cutt.ly/MTFqTdL> accessed 22 November 2021

[69] C-324/09 L’Oréal SA and Others v eBay International AG and Others [2011] OJ C269/3, para 139

[70] E-Commerce Directive [2000] OJ L178/1, para 47

[71] Supra 76, para 141

[72] Gartenberg Ch, ‘Facebook blocks dozens of accounts for ‘coordinated inauthentic behavior’ ahead of the 2018 midterm elections’ (2018) <https://cutt.ly/KTD8YhQ> accessed 22 November 2021

[73] PepsiCo India Holdings Private Ltd v Facebook, Inc. and Others AIR 2018 Del 291/2018 (2018)

[74] J19 and J20 v Facebook Ireland Ltd [2013] NIQB 113, para 21

[75] Giggs (previously known as CTB) v News Group Newspaper Ltd [2012] EWHC 431, paras 78, 85

[76] C-70/10 Scarlet Extended SA v SABAM [2011] OJ L 157, para 52

[77] C-360/10 SABAM v Netlog NV [2012] OJ C98/6, para 50

[78] Viacom International, Inc. v YouTube, Inc. 718 F Supp 2d 514 (SDNY 2010)

[79] Supra 26, paras 51-52

[80] OSCE Representative on Freedom of the Media, A study of legal provisions and practices related to FoE, the free flow of information and media pluralism on the Internet in OSCE participating States (2012) 204-205

[81] Davison v Habeeb and Others [2011] EWHC 3031

[82] Delfi AS v Estonia App no 64569/09 (ECtHR, 16 June 2015), para 161

[83] A&M Records, Inc. v Napster, Inc. 239 F 3d 1004 (9th Cir. 2001)

[84] Edwards L, The Fall and Rise of Intermediary Liability (2009) 74

[85] MySpace Inc. & Another v Super Cassetes Industries Ltd AIR 2011 Del 2682, para 71

[86] ‘Attendance of the social network “VKontakte”, blocked in Ukraine, fell’ (2019) <https://cutt.ly/OTD8Nx7> accessed 22 November 2021

[87] ECtHR, Factsheet – Access to Internet and freedom to receive and impart information and ideas (March 2021) 3