Oversight Board overrules Facebook deletion of Myanmar “hate speech” post

FEM has sent a submission to the Facebook Oversight Board, which heard an appeal against the removal of a Myanmar-language post for alleged “hate speech”. The post was published and removed during the coup. The Oversight Board has since overruled Facebook, deciding that the post was not “hate speech”.

FEM makes this submission in response to the Oversight Board’s request for public comment. Case number 2021-007-FB-UA involves an appeal from a Facebook user whose post was removed between February and April 2021. The post – which has not been shared by the Oversight Board – reportedly criticised the military coup and called for the CRPH (Committee Representing Pyidaungsu Hluttaw) to place pressure on businesses that work with the military. The post referred several times to China and included profanities. It received 500,000 views and was shared 6,000 times.

Facebook’s “Administrative Action Bot” identified the phrase “$တရုတ်” and deleted the post. No Facebook users reported the post. Under its Hate Speech Community Standard, Facebook deletes content targeting a person or group of people on the basis of their race, ethnicity and/or national origin with “profane terms or phrases with the intent to insult, including but not limited to: fuck, bitch, motherfucker.”

The Oversight Board requested public comments that address: 

  • Whether Facebook’s decision to remove the post is consistent with the company’s Hate Speech Community Standard, specifically the rule against profane terms or phrases with the intent to insult. 
  • Whether Facebook’s decision to remove the post is consistent with the company’s stated values and human rights responsibilities. 
  • Information about the social and political context in Myanmar, including efforts to discourage companies from engaging financially with the Myanmar military regime and to financially support the CRPH as well as the relationship between the Myanmar military regime and China. This information would help the Board better understand the possible intent and impact of the post. 
  • Trends in discourse around foreign government intervention in Myanmar and use of potentially discriminatory language in that context. 
  • Information about Facebook’s potentially erroneous enforcement of Community Standards, for example on Hate Speech, to restrict political speech in Myanmar.   
  • Whether Facebook users have noted changes in Facebook’s moderation and appeals in Myanmar-related posts since the 2021 coup. 
  • Content moderation challenges specific to the Burmese language.

In its decisions, the Board can issue policy recommendations to Facebook. While recommendations are not binding, Facebook must respond to them within 30 days. As such, the Board welcomes public comments proposing recommendations that are relevant to this case.

FEM submission to Facebook Oversight Board

FEM makes the following general comments and case-specific comments framed by the UN Rabat Plan of Action in response to the Oversight Board’s (OB) call. The OB’s published list of requested information is answered within.

General comments

Avoid repeating Facebook’s Orwellian language

Facebook has created a new language: some new words describe new concepts, while others are used in an Orwellian way to replace words that carry negative connotations. One example is the anodyne “take down” in place of the unambiguous “delete”, which is what the action effectively is. Measuring censorship becomes harder when words conceal meaning. Recommendation: The OB Charter states that “the purpose of the board is to protect free expression by making principled, independent decisions”. In practice, this should include using clear, neutral, and accessible language (including for those using English as a second language), avoiding Facebook’s jargon unless it is necessary to clarify Facebook’s processes.

Recognise that crude AI and sanctions result in unnecessary and disproportionate effects

Myanmar has suffered from the prevalence of “hate speech” on Facebook, much of which started as intentional incitement. Facebook’s earlier inaction led to the public normalisation of “hate speech”, and now that Facebook is taking action, its use of crude AI and sanctions to moderate content creates further risks: misidentifications, unnecessary interventions, and disproportionate sanctions. Deleting clear incitement is necessary, but blanket deletions of posts, pages, and users for “hate speech” that does not reach the incitement threshold does little to improve public knowledge, attitudes, and behaviour. Recommendation: The OB should encourage Facebook to implement a more sophisticated, proactive, and educational response, with targeted sanctions that incorporate international human rights standards on free expression.

Insufficient information published for public to provide valid comments

The OB has published just two words from the deleted post, plus a summary of a translation of the Myanma-language post. At the same time, the OB has asked for public comment on Facebook’s application of its rules, values, and responsibilities. For good reason, many – including FEM – are urging Facebook to take more or smarter action on “hate speech” in Myanmar, and that may be reflected in public comments. But the lack of information – particularly the actual Myanma-language text – raises concerns about the validity of public comments. Uninformed comments are invalid comments, and they could lead the OB to an invalid decision. Recommendation: The OB should consider complementing public comments with expert consultation, expert testimony, confidential discussions with trusted partners, and offering appellants the option of waiving their privacy in the interests of open justice.

Assessment under the six-part test of the UN Rabat Plan of Action

Context: Chinese businesses and Myanmar’s longstanding history of boycott campaigns

Many businesses operating in Myanmar, including those run by the military, have links to China, and China is Myanmar’s largest source of foreign direct investment. China has been closely associated with large infrastructure projects, including those linked to human rights violations, and there is a common public perception that China is a protector and benefactor of the Myanmar military. There have been many civil society campaigns to boycott, divest from, and sanction Myanmar’s military businesses, military leaders, and their “cronies”, including their Chinese partners. These business-facing campaigns started decades ago but intensified after 2017 and again after the coup. There is a national discourse on the role of China’s government and China’s businesses in Myanmar’s political transition, and on how the public in Myanmar and globally should respond to that role. Civil society has led a successful campaign to disentangle public confusion between Myanmar persons of Chinese descent and China itself. The Myanma language uses the same word for China, Chinese (government), and Chinese (people).

Speaker: Unknown

The OB has not published details of the user’s position or status, or their standing in the context of the audience to whom the post is directed. There is no information on how old the user account is, or how large the user’s normal audience is.

Intent: Unknown

The OB’s summary of a translation describes the user’s intent to influence government policy and law, which is protected under international human rights standards as political speech. The summary references “China” but does not clearly establish whether this is to insult (again, protected under international human rights standards) or to incite.

Content and form: Word used is not profane

Facebook deleted the post on the basis that it included the phrase $တရုတ်, which combines “damn” ($) and “China” (တရုတ် – see above for its multiple meanings). Facebook appears to regard “$” as a profanity under its definition of “profane terms or phrases with the intent to insult, including but not limited to: fuck, bitch, motherfucker”, and to regard “$” followed by တရုတ် as “hate speech”. However, “$” is not comparable to the examples on Facebook’s list. A middle-school teacher in Myanmar may use “$” when admonishing a child, whereas an American teacher using “bitch” would likely face disciplinary action. Rightly or wrongly, “$” has achieved general social acceptance and therefore should not be regarded as reaching Facebook’s threshold of profanity. As there is no profanity, the phrase $တရုတ် is not “hate speech” under Facebook’s rules. This opinion may not apply to the full post, which remains undisclosed, nor should it prevent Facebook from encouraging more tolerant behaviour on its platform.

Moreover, Facebook’s decision to define “$” as profane raises concerns that either its understanding of the Myanma language is outdated or its threshold is intentionally low. Either would indicate a potentially more extensive violation of the right to freedom of expression in Myanmar, consistent with a trend FEM has observed over the past two years.

Extent: Unknown

The OB stated that the post received 500,000 views and was shared 6,000 times, which would place it among the top 50 public posts in Myanmar on any given day. The OB has not shared information about the post’s audience, which would indicate whether that audience had the means to act as a result of the post.

Likelihood, including imminence: Unknown

The OB’s summary describes the user’s intent to influence government policy and law, and implies that any changes should affect China in particular. There is little evidence that Myanmar’s policy makers have been significantly influenced by user demands on Facebook, and little evidence that they have limited China’s general access to Myanmar.