
Zuckerberg: Facebook has systems to stop hate speech. Myanmar groups: No, it doesn’t.

Rohingya Muslims in Bangladesh's refugee camps.

The social network has fueled ethnic cleansing of the Rohingya.

Facebook CEO Mark Zuckerberg said Facebook’s systems detected and stopped violent messages from being sent on the social network in Myanmar. Civil society groups working in the country are pushing back and saying that isn’t the case.

Close to 700,000 Rohingya, a minority Muslim group, have fled Myanmar in the wake of a coordinated campaign of ethnic cleansing. Facebook has helped fuel the violence, becoming a platform for hate and violent speech against the minority group. The popularity and accessibility of the social network have exploded in recent years, and it has become a vital source of information — something bad actors are trying to exploit.

Zuckerberg, in an interview with Vox’s Ezra Klein this week, addressed this dilemma. He said Facebook is taking these issues “really seriously.”

He pointed to one specific case in which he said Facebook intervened to block hateful messages sent through its Messenger tool last September:

I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side.

So that’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that’s going on. We stop those messages from going through. But this is certainly something that we’re paying a lot of attention to.

But in an open letter published Thursday, civil society groups criticized Facebook’s response to the crisis. “The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar,” the letter read. The groups wrote that the “we” who detected the sensational messages wasn’t Facebook but the activists and organizations themselves.

“In your interview, you refer to your detection ‘systems’. We believe your system, in this case, was us — and we were far from systematic,” the letter read. “We identified the messages and escalated them to your team via email on Saturday the 9th September, Myanmar time. By then, the messages had already been circulating widely for three days.”

The letter explains that this particular Facebook Messenger incident wasn’t an isolated one, and “epitomizes the kind of issues that have been rife on Facebook in Myanmar for more than four years now and the inadequate response of the Facebook team.”

The groups also homed in on a few key problems that they say prevent Facebook from effectively tackling violent speech and propaganda in Myanmar. They slammed Facebook’s reliance on groups like theirs to alert its employees to hateful messages and pointed to the company’s lack of an emergency escalation system. In other words, if organizations and activists have to be the watchdogs, they need a streamlined and speedy process to get hateful or violent speech removed within hours or minutes — not days.

The groups also criticized Facebook’s lack of transparency and what they perceive as the company’s reluctance to engage with stakeholders in Myanmar. They suggest that the more opportunities local groups have to talk to Facebook engineers and data teams, the more likely Facebook is to find a systematic defense against hate speech, rather than playing whack-a-mole every time a violent or fake news post pops up.

“If you are serious about making Facebook better ... we urge you to invest more into moderation — particularly in countries, such as Myanmar, where Facebook has rapidly come to play a dominant role in how information is accessed and communicated,” the letter said.

A Facebook spokesperson in a statement to Vox thanked the civil society groups for their efforts in tracking down the hateful messages, and said the company is “sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages. We took their reports very seriously and immediately investigated ways to help prevent the spread of this content.”

The statement also said Facebook “is working hard to improve our technology and tools” to detect abuse, including adding a reporting function in Facebook Messenger and more Burmese-language reviewers to handle reports. “There is more we need to do and we will continue to work with civil society groups in Myanmar and around the world to do better.”

Why Facebook has come under fire in Myanmar

Close to 700,000 Rohingya refugees have fled to neighboring Bangladesh since last August in the wake of a government crackdown against the minority group. Then-Secretary of State Rex Tillerson officially labeled the crisis an “ethnic cleansing” in November, but the international community has been slow to act on the escalating humanitarian disaster.

The violence in Myanmar highlights the duality of social media in general, and Facebook in particular, as a force for both good and ill.

Activists and reporters use phones and social media to document atrocities, helping evade government censors. But anti-Muslim and anti-Rohingya memes and propaganda spread virulently through Facebook too, inciting violence and eroding support for the Rohingya’s plight. Public-facing accounts of verified government and military leaders — as well as the extremely influential accounts of nationalistic Buddhist monks — included false and inflammatory posts about the Rohingya, the New York Times reported in October.

In a recent report on the Rohingya crisis, Marzuki Darusman, head of the United Nations Fact-Finding Mission on Myanmar, said Facebook “substantively contributed to the level of acrimony and dissension and conflict” in Myanmar. “Hate speech is certainly of course a part of that,” Darusman said. “As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”

Facebook has taken down some posts, including temporarily shutting down the account of an ultranationalist Buddhist monk who posted incendiary content. But the company has also been accused of removing posts that document violence against the Rohingya, which underscores how challenging and complicated it is to even attempt to police the social network.

Zuckerberg told Klein in his interview these problems are “certainly something that we’re paying a lot of attention to.”

“It’s a real issue, and we want to make sure that all of the tools that we’re bringing to bear on eliminating hate speech, inciting violence, and basically protecting the integrity of civil discussions that we’re doing in places like Myanmar, as well as places like the US that do get a disproportionate amount of the attention,” Zuckerberg said.

Klein also pointed out that while Facebook is a vital tool for information in Myanmar, the country might not get the attention of other markets given its size. Indeed, Facebook’s popularity as a news source in the region has exacerbated Myanmar’s fake news problem. According to CBS News, the number of Facebook users in Myanmar increased from 2 million in 2014 to more than 30 million currently — partly a result of the military junta easing censorship, and partly because of the increasing affordability of smartphones.

“Facebook has become sort of the de facto internet for Myanmar,” Jes Kaliebe Petersen, chief executive of Phandeeyar — a technology company that signed on to the open letter and helped Facebook create its Burmese-language community standards page — told the Times in October. “When people buy their first smartphone, it just comes preinstalled.”

Zuckerberg conceded that Facebook needs to improve as its global reach expands. “Just based on the fact that our headquarters is here in California,” he told Klein, “and the vast majority of our community is not even in the US, I think does make this just a constant challenge for us to make sure that we’re putting due attention on all of the people in different parts of the community around the world.”

Listen to Zuckerberg’s full interview with Klein here, and read the full letter from Myanmar civil society organizations below:




