In a Thursday letter, they slammed the Facebook CEO for a Vox Media interview in which they believe Zuckerberg glossed over Facebook’s problems in the country. Zuckerberg claimed his company stopped attempts to incite violence between Myanmar’s Muslim and Buddhist groups over Facebook Messenger.
“Our systems detect that that’s going on. We stop those messages from going through,” Zuckerberg said.
Groups in Myanmar, however, claim Facebook failed to quickly notice the problem last September, when the incident occurred. At the time, bad actors encouraged violence over Facebook Messenger with content that circulated for more than four days, possibly reaching hundreds of thousands of people.
Only when the civil society groups alerted Facebook about the abuse did the social media giant intervene, their letter says. “In your interview, you refer to your detection ‘systems.’ We believe your system, in this case, was us, and we were far from systematic,” the letter adds.
Although Facebook eventually stopped the abuse, by then the offending messages had caused widespread fear and at least three violent incidents, according to the groups. Making matters worse is that Facebook is ill-equipped to stop future attempts to incite violence over the platform, they add.
“As far as we know, there are no Burmese-speaking Facebook staff to whom Myanmar monitors can directly raise such cases,” their letter reads. “We were lucky to have a confident English speaker who was connected enough to escalate the issue.”
About a million Rohingya Muslims have fled Myanmar over the violence, which intensified last August in a military crackdown that the US condemned as “ethnic cleansing.”
In response to Thursday’s letter, Facebook apologized. “We are sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages,” the company said in an email. “We took their reports very seriously and immediately investigated ways to help prevent the spread of this content.”
The spotlight on Facebook’s troubles in Myanmar comes as United Nations officials also claim the platform is fanning violence in the country. “Hate speech and incitement to violence on social media is rampant, particularly on Facebook,” said UN human rights investigator Marzuki Darusman last month. “To a large extent, it goes unchecked.”
One Facebook executive also claims his teams are losing sleep over the issue, and the company says it is working to address the problems. “We should have been faster and are working hard to improve our technology and tools to detect and prevent abusive, hateful or false content,” the company added in its email.
The civil society groups in Myanmar say they want to work with Facebook to crack down on the abuse. But so far, they’ve only engaged with Facebook’s policy team, and not the product or engineering divisions. Attempts at greater collaboration have also gone unanswered, they claim.
“The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar,” their letter adds.
The letter comes from six groups: the IT innovation lab Phandeeyar, the Myanmar ICT for Development Organization, Burma Monitor, the Center for Social Integrity, Equality Myanmar, and the Myanmar Human Rights Educator Network.
Facebook said it plans to work with the groups. The company is also rolling out a new function to report abuse on Messenger, and has added more Burmese language reviewers to its content moderation efforts.