The homes of Rohingya Muslims in Myanmar’s Rakhine state were still smoldering when a Burmese Facebook group posted a message celebrating the country’s brutal military campaign to its 400,000 followers.
“Today Maung Daw has been announced as a Bengali-free zone,” read the message posted to Facebook on September 15, 2017. Bengali is regularly used as a slur for Myanmar’s largely Muslim Rohingya population and Maung Daw was the site of some of the military’s most egregious crimes.
The post, touting a military campaign that the U.N. has since described as a “textbook example of ethnic cleansing,” spread alarmingly fast, piling up 13,500 shares, 22,000 reactions, and over 2,000 comments. It was just one of thousands of hate-fueled messages that have polluted Facebook in Myanmar since September, transforming the social media platform into “a beast” that facilitated ethnic violence in Myanmar, according to the U.N. In early April, Facebook CEO Mark Zuckerberg acknowledged the problem and promised to do better.
But the post was still up and being shared Tuesday morning.
Now, as Zuckerberg prepares to be grilled by U.S. lawmakers this week over his company’s widening data breach and myriad scandals at home and abroad, activists and human rights groups in Myanmar are pointing to these hate-filled posts as proof that Facebook continues to fail at enforcing its own rules.
“The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar,” read an open letter from civil society organizations in Myanmar to Zuckerberg last week. As of March, ethnic cleansing was still underway, a U.N. official warned.
Their warnings evoked a rare personal response from Zuckerberg. In an email to the activists obtained by VICE News, Zuckerberg sought to assure them that his company is paying close attention to what is happening in Myanmar and has staffed up to include Burmese language moderators. But according to people on the ground in Yangon, there’s not much evidence of that happening yet.
"I have never met a person at Facebook who is from Myanmar or speaks the Myanmar language," said Jes Kaliebe Petersen, the chief executive of Phandeeyar, Myanmar’s leading technology hub, and the group that helped Facebook translate its community standards into Burmese. “I think it is a matter of a commitment to Myanmar inside Facebook that I am not convinced has happened yet.”
In an emailed response to Zuckerberg, the groups accused Facebook of treating Myanmar as a Third World nation. “Your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe,” the group said.
More than a half-dozen activists and exiled Rohingya told VICE News that Facebook has failed at policing anti-Rohingya hate speech. But the activists go further: They claim that Facebook helped the government’s cause by suppressing dissenting voices and quashing critical news coverage about the military’s crimes in Rakhine state.
Facebook confirmed to VICE News that it has removed some posts, videos and images documenting the violence in the region, but said it has since restored this content and apologized to those affected.
The U.N.’s rare critique of Facebook last month comes as the agency investigates reports of a calculated and ongoing ethnic cleansing campaign by Myanmar security forces that began last August, forcing more than 700,000 Rohingya Muslims to flee into Bangladesh in the span of a few months.
Myanmar has repeatedly denied the reports, despite damning evidence provided by the United Nations, Human Rights Watch and other humanitarian organizations.
Now, U.N. investigators are training their sights on Facebook, which Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, said has played a “determining role” in the country’s crisis.
His colleague Yanghee Lee, who believes the events in Rakhine State bear the hallmarks of genocide, went a step further, saying that Facebook’s original intention of helping connect people instead was helping unite ultra-nationalist Buddhists to spread hate speech against the Rohingya and other ethnic minorities.
Facebook CEO Mark Zuckerberg arrives on Capitol Hill in Washington, Monday, April 9, 2018, to meet with Sen. Dianne Feinstein, D-Calif., the ranking member of the Senate Judiciary Committee. Zuckerberg will testify Tuesday before a joint hearing of the Commerce and Judiciary Committees about the use of Facebook data to target American voters in the 2016 election. (AP Photo/J. Scott Applewhite)
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” Lee said.
Lee said part of the problem stems from the fact that for most people in Myanmar, Facebook is the internet. “Everything is done through Facebook in Myanmar,” Lee said. Facebook’s zero-rated Free Basics service arrived in Myanmar in 2016 just as smartphone use was exploding in the previously closed-off country. The result was that Facebook essentially became Myanmar’s predominant online portal.
Yet despite its massive popularity and growth, observers said the company appeared to commit limited resources to the country, and that its staff were incapable of dealing with the rapid rate at which hate speech spread on the platform.
"The irresponsibility and the failure to adhere to its own community standards over the last six years has fed and created this 'beast,'" Haikal Mansor, the general secretary of the European Rohingya Council, told VICE News.
Critics point out that Facebook does not have an office in Myanmar. Commercial operations related to the country are operated out of Facebook’s Singapore office, while its policy team is based in Australia, according to Petersen.
Facebook would not say how many staff it has dedicated to Myanmar or how many of them speak the language, but Zuckerberg put the number of Burmese-speaking moderators at “dozens” in his email this week. A spokesperson for Facebook said the company had “staffed up” its community operations team and pointed to a security team that has doubled in recent years to 20,000 people globally.
“We have invested significantly in technology and local language expertise to help us swiftly remove hate content and people who repeatedly violate our hate speech policies,” a Facebook spokeswoman told VICE News.
But the reality on the ground in Myanmar isn’t as rosy.
“I don't think Facebook have even identified the pages or accounts or groups which are vectors of this misinformation and hate speech,” said Ray Serrato, a digital researcher and analyst who tracked hate speech posted to the Facebook page of the hardline nationalist MaBa Tha group.
Serrato, who doesn’t speak Burmese, was able to use Facebook’s own API to scrape the information from the network, and he wonders why the company can’t do something similar.
"It's somewhat surprising to me that Facebook hasn't identified some of these things because even if it didn't speak the language, some of the words are just so crude and easy to see, that it is patently hate speech," Serrato said.
Petersen said policing hate speech is only going to grow more difficult, as users have figured out ways to share their posts without tipping off censors.
“Now, instead of pressing share, people have started copying and pasting," he said.
Facebook insists that it takes these accusations “incredibly seriously” and pointed to its work with Phandeeyar as an example of what it is doing. Yet activists said Facebook should not have been caught off guard by the UN’s findings, and question how committed the company is to fixing its issues in Myanmar.
“Facebook should have teams working on preventing hate speech, racial slurs, and promotion of violence, not just simply idly standing by, pretending to not know or that it’s not its responsibility,” Mansor said.
Facebook has no easy answers in front of it, and asking the company to be the arbiter of what’s permissible brings with it another set of concerns, said Champa Patel, the head of the Asia-Pacific program at the U.K. think tank Chatham House.
“There is the larger question of who decides what is a 'dissenting' voice and where the lines should be drawn so that hate speech and incitement to violence is curbed but not at the expense of free expression on difficult or controversial issues," said Patel.
Yet activists say Facebook hasn’t entirely stayed out of the censorship business.
While Facebook’s failure to curb hate speech and calls for violence have grabbed headlines, activists pointed to what they see as another problem: the social network’s history of preventing them from reporting government-led atrocities against the Rohingya.
“Facebook censorship had a big negative impact on us in how we disseminated information last year,” Mohammad Anwar, a Rohingya activist living in exile in Malaysia, told VICE News. He said his Facebook account, which documented the military's campaign in Rakhine state, was suspended and his posts deleted or censored.
Facebook didn’t give Anwar any explanation, beyond an automated message that said his posts violated the company’s Community Standards, despite not featuring any graphic images or incitement to violence.
This happened to dozens of other activists VICE News spoke to; Anwar believes this is happening “at the behest of the government.”
Anwar’s censorship troubles began on Aug. 28 when a post he published, documenting Burmese military helicopters flying over Rohingya villages in the Maungdaw District of Rakhine State, went viral.
The activist said that prior to this, the government “didn’t take social media activists seriously,” but once he’d gotten the world’s attention, the government got involved. Soon after, he says, Facebook started clamping down on dissenting voices by disabling their social media accounts and censoring posts.
In relation to government requests, Facebook said: “We are transparent about the content we restrict at the requests of government around the world, pursuant with local law.” In its latest transparency report on Myanmar — which only covered the first half of 2017 — the company reports that it removed just a single piece of content at the request of the government.
Activists and observers also questioned how Facebook could stifle dissent at the same time that it failed to enforce basic guidelines against hate speech and incitement of violence. They pointed to the case of ultra-nationalist Buddhist monk Ashin Wirathu — who referred to himself as the “Burmese bin Laden” — being allowed to spread hate speech unchecked for years on Facebook. His account was finally revoked in February.
More recently, a post by Zaw Htay, the spokesperson for State Counsellor Aung San Suu Kyi, claimed that Facebook would cooperate with the Myanmar government in blocking the accounts of a Rohingya militant group. Yet no such request was made to rein in the extremist MaBa Tha group, despite it having over 500,000 followers and regularly promoting hate speech and calls for violence against the Rohingya minority.
Analysts and human rights observers said that governments like Myanmar’s often use Facebook to home in on content contrary to their interests, while ignoring issues like hate speech and incitement along ethnic lines.
Myanmar is not the only place where Facebook has been accused of toeing the government line. In Thailand it frequently deletes posts critical of the royal family under the country’s strict lèse majesté laws; in the Philippines the company is accused of helping the regime of Rodrigo Duterte; and in Cambodia opposition leader Sam Rainsy has filed a lawsuit against Facebook to force it to reveal how closely it works with the authoritarian regime of Prime Minister Hun Sen.
“It is important to be aware that prosecutors in Myanmar have tended to use national laws to target human rights defenders and journalists, as opposed to those actually responsible for dangerous speech,” Frederick Rawski, the director of the Asia and Pacific Programme at the International Commission of Jurists, told VICE News.
In this regard, the Burmese government has been instrumental, said Soe Thu Moe, a Rohingya activist living and working in Saudi Arabia. “The government used Facebook to spread hate speech against the Rohingya minority,” Moe said, echoing sentiments expressed by several activists VICE News spoke to. The activists pointed to public posts by the official Burmese government page, as well as pages operated by the armed forces and national defense.
Ro Nay San Lwin, a European-based Rohingya activist, said his colleagues inside Myanmar had “suffered a lot” as a result of posting updates to their timelines, receiving abusive comments from anti-Rohingya accounts. He told VICE News that hate speech accounts continue to urge the expulsion or killing of Rohingyas with nothing done to stop them.
“Facebook is a tool for the voice of oppressed people, and when it tries to suppress that voice, then Facebook is not dissimilar to that of oppressors,” Mansor said.
Despite Zuckerberg’s multiple apologies in recent weeks, and his promises to commit more resources to the problem in Myanmar, activists remain unconvinced, and fear that the abuse and violence will continue to spread as long as Facebook is driven primarily by profits.
“As long as international Facebook users do not speak up and show solidarity with the Rohingya, the UN's comments won’t matter for Facebook — it's a corporation,” Anwar said, adding that the recent high-profile controversies only confirm his opinions of the company.
Cover image: A Rohingya refugee child looks through the fence at a refugee camp in Cox's Bazar, Bangladesh March 22, 2018. REUTERS/Mohammad Ponir Hossain