Internet Intermediary Liability: An Analysis of Facebook’s Liability in the Myanmar Genocide
Updated: Feb 8
Article by Arpit Lahoti and Sherry Shukla
On February 7, internet providers in Myanmar blocked Facebook in response to a government directive deeming the block necessary to ensure stability in the state. The directive is only the latest development concerning Facebook's presence in Myanmar, and it highlights the extent of the platform's influence throughout the state as a conduit of information to the masses. The Myanmar genocide, however, made the internet's devastating potential apparent. The internet instigated the genocide to a great extent: sites like Facebook served as forums for systematic anti-Rohingya propaganda and, ultimately, as tools for instigating hatred against the Rohingya even off the internet. As society moves towards a more cyber-centric model, the shocking violence in Myanmar raises a number of concerns about the regulations that govern cyberspace.
This article will analyze the law governing internet intermediaries in order to determine their liability for the actions of third parties. In particular, this article will consider Facebook’s role and potential liability in the Myanmar Genocide.
The Basic Role of Internet Intermediaries
In a report on The Economic and Social Role of Internet Intermediaries, the Organisation for Economic Co-operation and Development explained, “Internet intermediaries bring together or facilitate transactions between third parties on the Internet. They give access to, host, transmit and index content, products, and services originated by third parties on the internet or provide internet-based services to third parties.” Internet intermediaries include both Internet Service Providers (ISPs), which provide direct internet access for users, and hosts, which provide server space for users.
Intermediaries can perform any one or combination of the following four functions: search, commerce, payment, and social networking. This blog will focus on Facebook’s role as a social networking intermediary.
The Situation in Myanmar
Rohingya Muslims are one of the many ethnic minorities in Myanmar: nearly one million Rohingya Muslims resided in Myanmar in 2017, with most of the population living in the Rakhine state. However, the government of Myanmar still does not recognize them as citizens and considers them illegal immigrants.
On August 25, 2017, the Arakan Rohingya Salvation Army (ARSA), a militant group which claims to fight on behalf of the oppressed Rohingya, attacked police posts. The military's response devolved into a massacre. Official figures report that 192 people were killed, 265 were seriously injured, and about 8,614 houses were destroyed. The numbers are disputed, however, with other sources estimating that as many as 6,700 Rohingya were killed in the first month alone and noting that the Myanmar military raped Rohingya women amidst the attacks.
The UN Fact-Finding Mission (UNFFM) was established to investigate the events in Myanmar and to consider whether crimes against humanity, including genocide, took place. In its final report, the UNFFM concluded that the evidence indicated that such crimes did occur, warranting further investigation and prosecution of culpable officials.
Facebook’s Role in the Crisis
For much of Myanmar's population, Facebook is synonymous with the internet itself and is arguably the predominant, if not the only, mode of internet access. The UNFFM confirmed that the majority of the Myanmar population was active on Facebook, lending further credence to the argument that the website was an important platform for anti-Rohingya messages and the spread of hatred. The UNFFM highlighted that Facebook was used as a tool to disseminate hateful messages about Rohingya Muslims, and that Facebook possibly further incited the genocide. Users often referred to Rohingya Muslims as the "Bengali Problem" on the platform, and specifically called for violence to "solve" this alleged problem.
Facebook was used to spread hate speech, misinformation, and fake news to instigate hatred towards the Rohingya minority. Anti-Muslim rumours circulating on Facebook incited violence and riots in 2012 and 2014. In one prominent example from 2014, a fake story circulated on Facebook claiming that a Muslim man had raped a Buddhist woman, which led to deadly clashes in the city of Mandalay. In 2017, chain messages spread via Facebook Messenger warned of alleged impending "jihad attacks" and of anti-Muslim protests said to be organized by Buddhist monks, and violent speech on Facebook helped trigger an army crackdown on Rohingya Muslims. In 2018, Facebook banned around 20 military officials and organizations for spreading disinformation, including the commander-in-chief, Senior General Min Aung Hlaing.
Facebook removed the offending pages only after much time had elapsed. The platform claimed that it faced administrability issues in interpreting posts written in Burmese, and that it had been enforcing its community standards to curb such hate speech. Facebook added that it had since devoted more resources to monitoring content in Myanmar.
The Law Governing Internet Intermediaries
Cyberspace refers to “the tools which constitute a unique medium located in no particular geographical location but available to anyone, anywhere in the world, with access to the internet.” Facebook, as an internet intermediary, is undoubtedly part of cyberspace.
The question of jurisdiction, however, becomes an important and puzzling factor in determining disputes about Facebook's intermediary role. An internet intermediary cannot reasonably be expected to comply with the legal obligations of every country in which it is used. The Manila Principles on Intermediary Liability, developed by expert organizations including the Electronic Frontier Foundation, the Centre for Internet and Society in India, and Article 19, shield intermediaries from liability for content posted by third parties in order to protect freedom of expression. The Principles are among the very few documents that incorporate opinions from different corners of the world into a single uniform code.
The International Criminal Court and, before it, the International Criminal Tribunal for the former Yugoslavia (ICTY) and the International Criminal Tribunal for Rwanda have relied on international conventions such as the European Convention on Human Rights (ECHR), the International Covenant on Civil and Political Rights (ICCPR), and the Convention on the Rights of the Child (CRC) to determine human rights issues such as the right to freedom of expression.
In General Comment No. 34, interpreting Article 19 of the ICCPR on freedom of opinion and expression, the UN Human Rights Committee noted that these freedoms have become an indispensable part of society. The Comment lays down an obligation on states to take all necessary steps to foster internet intermediaries' function of providing a global network for the exchange of information and ideas.
Moreover, in their Joint Declaration on Freedom of Expression and the Internet, the four special mandates on the right to freedom of expression declared that internet intermediaries should not be held liable for user-generated content. However, the Declaration carved out a limited exception to this statement against platform liability: nonliability presupposes that the intermediary has not modified the user-generated content and that no court order requiring removal of the content has gone unheeded.
Rather than deriving jurisdiction from user location, laws governing cyberspace operate on the law of servers. The law of servers principle holds that territorial jurisdiction in cyber-crime proceedings lies where the server hosting the pages at issue is physically located. The High Court of Justice (Chancery Division) adopted the law of servers principle, holding that an internet intermediary should be held responsible in the country in which its servers are located rather than the place where the 'material is read or used.'
The Canadian Supreme Court has applied the three-factor 'real and substantial connection' test for determining jurisdiction in cyber matters, which considers (1) the location of the servers; (2) the geographic jurisdiction in which the person posting is situated; and (3) the location of the targeted end users.
Facebook's servers are located in the US, which means that, under the law of servers, Facebook is bound to comply only with U.S. law. Under Section 230 of the US Communications Decency Act, internet intermediaries enjoy broad immunity from liability for user-generated content for which they might otherwise be held legally responsible. Though there are exceptions for certain criminal and intellectual-property claims, the provision gives wide-ranging protection to intermediaries in furtherance of innovation and free speech online.
It is clear that, from a law of servers perspective, Facebook cannot be held liable for user conduct and content. The more interesting question is whether Facebook should, as a normative matter, be liable. Facebook's role in the Rohingya genocide was that of a means to spread hatred among the masses. Though Facebook's community standards prohibit hate speech, the platform failed miserably to restrict anti-Rohingya propaganda before substantial harm was done.
The crisis in Myanmar illustrates the massive damage an unregulated and unmonitored cyberspace can cause. Conferring absolute immunity upon internet intermediaries only helps them evade liability where redress is necessary. Facebook itself admitted that it failed to comprehend the misuse of its platform and could have done better to handle the situation; it should therefore be held accountable for its failure to curb incendiary hate speech.
One possible solution is the drafting and implementation, by an international body, of a uniform code to prevent the misuse of internet intermediaries. That specialized entity would ensure compliance, comprehensively regulate internet intermediaries, and decide what kind of liability is appropriate for noncompliance. Such regulation should, however, balance and in no way suppress freedom of expression and opinion. The code ought to prescribe how internet intermediaries should handle cases in which mischievous activity is traceable, such as the hate speech catalyzing the Rohingya genocide. Finally, the code should set standards for the infrastructure intermediaries must establish to overcome language barriers and adequately prevent harmful user-generated content.
The temporary blocking of Facebook may be seen as an effective measure to control the spread of misinformation. However, the measure is out of harmony with the human rights principle of freedom of expression and should not be considered a viable option for curbing the misuse of such intermediaries. Rather, a uniform code of the kind proposed above offers a solution that balances effectiveness with users' rights.
The current international law governing cyberspace, and internet intermediaries in particular, provides enough immunity for intermediaries to facilitate freedom of expression and opinion. But the project of governing internet intermediaries has its limits. If the law of servers theory prevails, for example, intermediaries may escape liability even more easily, since they could simply establish their servers in favorable jurisdictions that confer absolute immunity. As a result, states that face the material effects of intermediaries' conduct are left without recourse.
Arpit Lahoti is a IV Year B.A. LL.B. (Hons.) student at National Law University Nagpur. Sherry Shukla is a III Year B.A. LL.B. (Hons.) student at National Law University Nagpur.