Facebook to remove misinformation that leads to violence

22 July, 2018

SAN FRANCISCO July 18: Facebook, facing growing criticism for posts that have incited violence in some countries, said Wednesday that it would begin removing misinformation that could lead to people being physically harmed.

The policy expands Facebook’s rules about what type of false information it will remove, and is largely a response to episodes in Sri Lanka, Myanmar and India in which rumors that spread on Facebook led to real-world attacks on ethnic minorities.

“We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline,” said Tessa Lyons, a Facebook product manager. “We have a broader responsibility to not just reduce that type of content but remove it.”

Facebook has been roundly criticized over the way its platform has been used to spread hate speech and false information that prompted violence. The company has struggled to balance its belief in free speech with those concerns, particularly in countries where access to the internet is relatively new and there are limited mainstream news sources to counter social media rumors.

In Myanmar, Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims, a minority ethnic group, by allowing anti-Muslim hate speech and false news.

In Sri Lanka, riots broke out after false news pitted the country’s majority Buddhist community against Muslims. Near-identical social media rumors have also led to attacks in India and Mexico. In many cases, the rumors included no call for violence, but amplified underlying tensions.

The new rules apply to one of Facebook’s other big social media properties, Instagram, but not to WhatsApp, where false news has also circulated. In India, for example, false rumors spread through WhatsApp about child kidnappers have led to mob violence.

In an interview published by the technology news site Recode, Mark Zuckerberg, Facebook’s chief executive, tried to explain how the company differentiates between offensive speech - the example he used was people who deny the Holocaust - and posts that promote false information that could lead to physical harm.

“I think that there’s a terrible situation where there’s underlying sectarian violence and intention,” Mr. Zuckerberg told Recode’s Kara Swisher. “It is clearly the responsibility of all of the players who were involved there.”

The social media company already has rules under which direct threats of violence and hate speech are removed, but it has been hesitant to remove rumors that do not directly violate its content policies.

Under the new rules, Facebook said it would create partnerships with local civil society groups to identify misinformation for removal. The new rules are already in effect in Sri Lanka, and Ms. Lyons said the company hoped to introduce them in Myanmar soon and then expand elsewhere.

Mr. Zuckerberg’s example of Holocaust denial quickly created an online furor, and on Wednesday afternoon he clarified his comments in an email to Ms. Swisher. “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that,” he said.

He went on to outline Facebook’s current policies around misinformation. Posts that violate the company’s community standards, which ban hate speech, nudity and direct threats of violence, among other things, are immediately removed.

The company has started identifying posts that are categorized as false by independent fact checkers. Facebook will “downrank” those posts, effectively moving them down in each user’s News Feed so that they are not highly promoted across the platform.

The company has also started adding information boxes under demonstrably false news stories, suggesting other sources of information for people to read.

- nytimes.com
