Facebook is conducting a human rights audit in Myanmar to assess its role in enabling ethnic violence and hate speech against the country's Rohingya Muslim minority, VICE News has learned.
The audit, which kicked off in recent months, is the first of its kind for Facebook and appears to be the most aggressive step the company has taken to examine the role of Facebook and WhatsApp in Myanmar, where the platforms have been accused of supercharging the spread of virulent hate speech that the U.N. has linked to ethnic cleansing.
Two local activists told VICE News about the audit but could not speak on the record about the effort, due to security concerns. A source at Facebook confirmed the effort was underway.
Victoire Rio, one of the activists who had called for such an audit earlier this year, welcomed the move but said it was unclear just how much of the audit, if any, Facebook would make public. Either way, it's an important first step in understanding just what impact Facebook is having in Myanmar, Rio told VICE News.
"If you are going to have more than a million users in a country, and you don't know shit about the country, then this seems like the basic research you should be doing,” she said.
But concerns remain. Facebook still doesn’t have any permanent employees in Myanmar, and activists worry the company is relying too heavily on them to highlight problem posts. The social network won’t say how many Burmese speakers it has dealing with the flood of hate speech coming from the country.
The audit is being conducted by Business for Social Responsibility, a San Francisco-based nonprofit that provides guidance on how to implement human rights principles to more than 250 companies, including Google, Microsoft, Visa, and Coca-Cola.
The organization has already carried out interviews with groups on the ground in Myanmar, and the report could be finished as soon as this month.
Business for Social Responsibility did not respond to a request for comment.
Investigating abuses
As in many developing countries, Facebook is the dominant online portal in Myanmar, playing an outsize role in how citizens access and share information. The U.N.’s rare critique of Facebook earlier this year came amid the agency’s investigation into reports of a calculated and ongoing ethnic cleansing campaign by Myanmar security forces that began in August 2017, forcing more than 700,000 Rohingya Muslims to flee into Bangladesh in the span of a few months.
The audit comes two months after a delegation of five Facebook employees visited the country and met with civil society groups and government representatives. The delegation included Sara Su, a product manager on the News Feed team; Mia Garlick, a public policy director focused on the Asia-Pacific region; and David Caragliano, a content policy manager. Facebook would not confirm the identities of the other members of the delegation.
VICE News has since learned from sources in Myanmar that during the June trip, the company held a previously unreported meeting with activists to directly address specific issues like human rights abuses, digital rights, freedom of expression, and interfaith harmony. The sources could not speak on the record for fear of reprisals from nationalist groups.
The delegation visited Myanmar for a week but met with activists for just a day and a half, which one activist described as “nowhere near enough and it should have been much longer, but again it is a good start.”
The visit was part of the company’s concerted efforts to deal with a crisis that led Facebook CEO Mark Zuckerberg to pen a personal apology to activists in April, after U.N. investigators determined that Facebook had facilitated ethnic cleansing against Rohingya Muslims in the country.
"I'm afraid that Facebook has now turned into a beast, and not what it originally intended," Yanghee Lee, the U.N.’s Special Rapporteur of human rights in Myanmar, said at the time.
Removing hate groups
In a personal letter to activists in Myanmar, Zuckerberg promised to commit more resources to address the issue. Along with conducting the audit, Facebook is banning hate accounts, creating new positions on its policy team to deal specifically with Myanmar, and working with local media outlets to increase news literacy in the country.
But underlying concerns about the transparency of the platform and the company’s dedication to solving the issues remain.
Four months after Zuckerberg’s letter, abuse on the platform remains a major problem for Facebook, but Garlick, Su, and Caragliano all insist the company has committed more resources to better respond to reports of hate speech on its platform.
“We know there are instances where we have taken too long, and that is an area we are focused on getting better,” Caragliano told VICE News. He said the company hopes to deal with all reports from Myanmar within 24 hours, and for posts posing a credible threat, within a couple of hours.
But there is no indication of when and how that goal will be met. Furthermore, skeptical activists point out that Zuckerberg made a similar promise to Congress in April.
Facebook has also rolled out a more aggressive approach to taking down hate figures, banning a number of the most virulent individuals and groups beginning almost immediately after the company's meeting with activists in Yangon in early June.
“We have proactively removed several hate figures and organizations,” Caragliano said. “These are not allowed to have a presence on Facebook — they can't have pages, accounts, and we remove content that praises or supports these individuals or organizations.”
Facebook would not share specifics about the number of accounts or groups it has banned in Myanmar in recent months, which illustrates one of the most consistent complaints among human rights monitors and activists in Myanmar: For all its positive steps, Facebook continues to evade basic transparency practices.
“The immediate need from Facebook is more transparency,” Matthew Bugher, head of the Asia program at human rights organization Article 19, told VICE News. “The rationale for removing content remains quite opaque. In the past, activists and those promoting messages of tolerance have had their accounts locked, while extreme nationalist and those promoting violence have been able to operate freely.”
The lack of transparency is especially concerning for activists like Htaike Htaike of the Myanmar ICT for Development Organisation (Mido), who say many of Myanmar’s worst actors continue to have a presence on Facebook, through existing accounts or fake names, or via their loyal followers.
Facebook’s opacity on such issues extends to simpler metrics as well, such as how many employees the company has dedicated to Myanmar.
“A big problem on its hands”
Facebook again declined to give specifics when asked by VICE News, and instead offered a generic response regarding its content moderation — which is outsourced to CPL Resources in Dublin. “We’ve added dozens more Myanmar-language reviewers to our community operations team, and intend to double that number by the end of this year,” a spokesperson for the company told VICE News.
Such assurances closely echo Zuckerberg’s claim from April, when he said Facebook had “added dozens more Burmese-language reviewers to handle reports from users across all our services.”
Facebook said the company is recruiting for a number of Myanmar-specific positions on its policy team — positions that activists acknowledge have not previously existed for a country like Myanmar — and that it is keeping its partners in Myanmar updated on the progress of key hires for the country. However, those same activists worry that Facebook's recruitment process is not transparent enough.
“We care a lot about those people being deeply biased, or being too close to the government,” Rio said.
Myanmar is just one country where Facebook is facing heat for its lack of action in response to hate speech, incitement to violence, and government abuse. Political dissidents and activists in Vietnam, India, Cambodia, and Sri Lanka are also urging the Silicon Valley behemoth to take a more assertive role in addressing urgent human rights issues on its platform.
In India, where viral WhatsApp messages have led to dozens of mob lynchings, the government has threatened to sue Facebook because “rampant circulation of irresponsible messages in large volumes on [WhatsApp’s] platform has not been addressed adequately.”
Facebook’s actions in Myanmar are a positive indication that the company understands it needs to do more, local activists said, but they remain skeptical about what that means in real terms.
“It seems that Facebook has recognized, in Myanmar as elsewhere in the world, that it has a big problem on its hands,” Bugher said. “It is now making the right sounds about improving standards and policies. However, civil society has yet to see follow-through on key commitments made by Facebook. The coming months will be very important.”
Correction: A previous version of this story incorrectly stated the duration of Facebook's delegation to Myanmar in June. The story has been corrected.
Cover: A Rohingya ethnic minority man looking at Facebook on his cell phone at a temporary makeshift camp after crossing over from Myanmar into the Bangladesh side of the border, near Cox's Bazar's Palangkhali, Friday, Sept. 8, 2017. (Photo by Ahmed Salahuddin/NurPhoto via Getty Images)