As Facebook Addresses Role in Myanmar Violence, Look Back at Early Warnings

November 6, 2018

On the eve of the U.S. midterm elections, Facebook released an outside report it commissioned on its impact on human rights in Myanmar, where the country’s Rohingya Muslim minority has been the subject of brutal violence that the United Nations has since called a genocide.

The U.N. has said social media — and Facebook in particular — was a significant factor, as the platform allowed hate speech and calls for violence against the Rohingya to spread across Myanmar. Facebook has admitted it was slow to respond to concerns. But this report offers a clearer picture of the company’s impact on the ground.

The report, by the firm Business for Social Responsibility, found that Facebook was “directly linked” to harm in Myanmar when people used the platform in ways that violate its community standards — for example, to incite violence, spread disinformation or promote hate speech. While it said that the company hadn’t caused or contributed to human rights violations “via its own actions,” the assessment found that Facebook’s platform had been “useful” for those seeking to bring about real-world harm in Myanmar, and it outlined recommendations for the company to address the problem. The report also warned that the run-up to the country’s 2020 elections could pose new risks.

In the past year, Facebook says it’s taken down problematic accounts in Myanmar, hired more language experts, and improved its policies. “We agree that we can and should do more,” said Alex Warofka, a Facebook product policy manager, in a post on Facebook’s blog. Warofka outlined Facebook’s efforts to address five areas for “continued improvement” identified by BSR. He also noted the report’s finding that “Facebook alone cannot bring about the broad changes needed to address the human rights situation in Myanmar.”

But Facebook had plenty of early warnings from Myanmar and other countries about how it was being used to shape events on the ground. Last week, in The Facebook Dilemma, FRONTLINE explored how the company responded to warnings about the platform’s role in spreading disinformation and hate speech, and sparking real-world violence in Myanmar in particular. In the excerpt from the documentary below, David Madden, a tech entrepreneur living in Myanmar, recounts making a presentation at Facebook headquarters back in May of 2015, warning that Myanmar’s Muslim minority was being targeted with hate speech.

“I drew the analogy with what had happened in Rwanda, where radios had played a really key role in the execution of this genocide,” Madden told FRONTLINE. “And so I said, ‘Facebook runs the risk of being in Myanmar what radios were in Rwanda’ – that this platform could be used to foment hate and to incite violence.”

Madden says he received an email saying that the concerns he raised had been shared internally and taken “very seriously.” But the violence intensified. As the film reports, Madden and other local activists held another meeting with Facebook in early 2017, warning that the platform’s processes for addressing content that demonized the country’s Muslims weren’t working.

“I think, I think the, the main response from Facebook was, ‘We’ll need to go away and dig into this and come back with something substantive,’” Madden tells FRONTLINE. “The thing was, it never came.”

Watch FRONTLINE’s full two-part investigation into Facebook here.

Patrice Taddonio, Senior Digital Writer, FRONTLINE
