Oversight Board overrules Facebook deletion of Myanmar “hate speech” post

FEM sent a submission to the Facebook Oversight Board, which heard an appeal against the removal of a Myanmar-language post for alleged “hate speech”. The post was published and removed during the coup. The Oversight Board has since overruled Facebook, deciding that the post was not “hate speech”.

Background

FEM makes this submission in response to the Oversight Board’s request for public comment. Case number 2021-007-FB-UA involves an appeal from a Facebook user whose post was removed between February and April 2021. The post – which has not been shared by the Oversight Board – reportedly criticised the military coup and called for the CRPH (Committee Representing Pyidaungsu Hluttaw) to place pressure on businesses that work with the military. The post referred several times to China and included profanities. It received 500,000 views and was shared 6,000 times.

Facebook’s “Administrative Action Bot” identified the phrase “$တရုတ်” and deleted the post. No Facebook users reported the post. Under its Hate Speech Community Standard, Facebook deletes content targeting a person or group of people on the basis of their race, ethnicity and/or national origin with “profane terms or phrases with the intent to insult, including but not limited to: fuck, bitch, motherfucker.”

The Oversight Board requested public comments that address: 

  • Whether Facebook’s decision to remove the post is consistent with the company’s Hate Speech Community Standard, specifically the rule against profane terms or phrases with the intent to insult.
  • Whether Facebook’s decision to remove the post is consistent with the company’s stated values and human rights responsibilities.
  • Information about the social and political context in Myanmar, including efforts to discourage companies from engaging financially with the Myanmar military regime and to financially support the CRPH as well as the relationship between the Myanmar military regime and China. This information would help the Board better understand the possible intent and impact of the post. 
  • Trends in discourse around foreign government intervention in Myanmar and use of potentially discriminatory language in that context. 
  • Information about Facebook’s potentially erroneous enforcement of Community Standards, for example on Hate Speech, to restrict political speech in Myanmar.
  • Whether Facebook users have noted changes in Facebook’s moderation and appeals in Myanmar-related posts since the 2021 coup.
  • Content moderation challenges specific to the Burmese language.

In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

FEM submission to Facebook Oversight Board

In response to the Oversight Board’s (OB) call, FEM makes the following general and case-specific comments, framed by the UN Rabat Plan of Action. The OB’s published list of requested information is answered within.

General comments

Avoid repeating Facebook’s Orwellian language

Facebook has created a new language: some new words for new concepts, and others used in an Orwellian way to replace words that carry negative connotations. One example is the anodyne “take down” in place of the unambiguous “delete”, which is what the action effectively is. Measuring censorship becomes harder when words conceal meaning. Recommendation: The OB Charter states that “the purpose of the board is to protect free expression by making principled, independent decisions”, and in practice this should include using clear, neutral, and accessible language (for those using English as a second language), avoiding Facebook’s jargon unless absolutely necessary to clarify Facebook processes.

Recognise that crude AI and sanctions result in unnecessary and disproportionate effects

Myanmar has suffered from the prevalence of “hate speech” on Facebook, much of which started as intentional incitement. Facebook’s earlier inaction led to the public normalisation of “hate speech” and now, when Facebook is taking action, its use of crude AI and sanctions to moderate content creates further risks, namely incorrect identifications, unnecessary approaches, and disproportionate sanctions. Deleting clear incitement is necessary, but blanket deletions of posts, pages, and users for “hate speech” that does not reach the incitement threshold does little to improve public knowledge, attitudes, and behaviour. Recommendation: The OB should encourage Facebook to implement a more sophisticated, proactive, and educational response with targeted sanctions incorporating international human rights standards on free expression.

Insufficient information published for public to provide valid comments

The OB has published just two words from the deleted post, plus a summary of a translation of the Myanma language post. At the same time, the OB has asked for public comment on Facebook’s application of its rules, values, and responsibilities. For good reason there are many – including FEM – urging Facebook to take more or smarter action on “hate speech” in Myanmar, and that may be reflected in public comments. But the lack of information – particularly the actual Myanma language – raises concerns about the validity of public comments. Uninformed comments are invalid comments. They could also influence and lead to an invalid decision by the OB. Recommendation: The OB should consider complementing public comments with expert consultation, expert testimony, confidential discussions with trusted partners, and offering appellants the option of waiving their privacy for open justice.

Assessment under the six-part test of the UN Rabat Plan of Action

Context: Chinese businesses and Myanmar’s longstanding history of boycott campaigns

Many businesses operating in Myanmar, including those run by the military, have links to China, and China is Myanmar’s greatest source of foreign direct investment. China has been closely associated with large infrastructure projects, including those linked to human rights violations, and there is a common public perception that China is protector and benefactor to the Myanmar military. There have been many civil society campaigns to boycott, divest, and sanction Myanmar’s military businesses, military leaders, and their “cronies”, including their Chinese partners. These business-facing campaigns started decades ago but increased after 2017 and again after the coup. There is a national discourse on the role of China’s government and China’s businesses in Myanmar’s political transition, and how the public in Myanmar and globally should respond to such a role. Civil society has led a successful campaign to disentangle public confusion between Myanmar persons of Chinese descent and China itself. The Myanma language uses the same word for China, Chinese (government), and Chinese (people).

Speaker: Unknown

The OB has not published details of the user’s position or status, or their standing in the context of the audience to whom the post is directed. There is no information on how old the user account is, or how large the user’s normal audience is.

Intent: Unknown

The OB’s summary of a translation describes the user’s intent to influence government policy and law, which is protected under international human rights standards as political speech. The summary references “China” but does not clearly establish whether this is to insult (again, protected under international human rights standards) or to incite.

Content and form: Word used is not profane

Facebook deleted the post on the basis that it included the phrase “$တရုတ်”, which combines the words “damn” ($) and “China” (တရုတ် – see above for its multiple meanings). Facebook appears to regard “$” as a profanity, defined as “terms or phrases with the intent to insult, including but not limited to fuck, bitch, motherfucker”, and to regard the phrase as “hate speech” when “$” is followed by “တရုတ်”. However, “$” is not comparable to Facebook’s list of examples. A middle school teacher in Myanmar may use “$” when admonishing a child, but for an American teacher to use “bitch” would likely be a disciplinary matter. Rightly or wrongly, “$” has achieved general social acceptance and therefore should not be regarded as having reached Facebook’s threshold of profanity. As there is no profanity, the phrase “$တရုတ်” is not “hate speech” under Facebook’s rules. This opinion may not apply to the full post, which remains undisclosed, nor should it prevent Facebook from encouraging more tolerant behaviour on its platform.

Moreover, Facebook’s decision to define “$” as profane raises concerns that either its understanding of the Myanma language is outdated or its threshold is intentionally low. Either would indicate a potentially more extensive violation of the right to freedom of expression in Myanmar, a trend that FEM has observed over the past two years.

Extent: Unknown

The OB stated that the post received 500,000 views and was shared 6,000 times. This would place it in the top 50 public posts in Myanmar in any one day. The OB has not shared information about the post’s audience, which would indicate whether they have the means to act as a result of the post.

Likelihood, including imminence: Unknown

The OB’s summary describes the user’s intent to influence government policy and law, and implies that any changes should affect China in particular. There is little evidence that Myanmar’s policy makers have been significantly affected by user demands on Facebook, and little evidence of them limiting China’s general access to Myanmar.