Facebook will take down posts that could cause “real physical harm,” but Holocaust denials (and Pizzagate?) remain okay


July 20, 2018, 1:45 p.m.

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Oh look, Facebook is actually taking something down. Facebook would rather downrank fake news and conspiracy theories than remove them from the platform altogether. The company has gotten slammed, especially over the past week, for this try-to-have-it-both-ways policy. This week, Facebook announced that “there are certain forms of misinformation that have contributed to physical harm” that it actually will be taking down — or, well, here’s the slightly more wishy-washy statement, to CNBC: “Reducing the distribution of misinformation — rather than removing it outright — strikes the right balance between free expression and a safe and authentic community. There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.” The change seems linked in particular to activity in countries like Myanmar, India, and Sri Lanka: “Although the policy change is upcoming, the company used these principles to remove posts in Sri Lanka alleging Muslims were poisoning food given or sold to Buddhists.”

The Guardian’s Olivia Solon attended the hearing where Facebook announced the change, and she has some good questions — one of which concerns the statute of limitations for this kind of thing: the Pizzagate hoax, for instance, led to an actual shooting months after the hoax began.

Separately, here are some things that Facebook CEO Mark Zuckerberg told Recode’s Kara Swisher on Wednesday, as part of a lengthy podcast interview for Recode Decode:

Kara Swisher: InfoWars. I want you to make a case for taking InfoWars off. If you were on the other side of it.

Mark Zuckerberg: I think if you were trying to argue on the side of basically the core principle of keeping the community safe, I think you would try to argue that the content is somehow attacking people or is creating an unsafe environment. Now, let me give you —

Swisher: Is false.

Zuckerberg: Let me give you an example of where we would take it down. In Myanmar or Sri Lanka, where there’s a history of sectarian violence, similar to the tradition in the U.S. where you can’t go into a movie theater and yell “Fire!” because that creates an imminent harm. There are definitely examples of people sharing images that are taken out of context, that are false, that are specifically used to induce people to violence in those areas where there’s —

Swisher: And violence has resulted.

Zuckerberg: Yes. We are moving towards the policy of misinformation that is aimed at or going to induce violence, we are going to take down because that’s basically…The principles that we have on what we remove from the service are, if it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform. There’s a lot of categories of that that we can get into, but then there’s broad debate.

Swisher: Okay. “Sandy Hook didn’t happen” is not a debate. It is false. You can’t just take that down?

Zuckerberg: I agree that it is false.

Swisher: Okay.

Zuckerberg: I also think that going to someone who is a victim of Sandy Hook and telling them, “Hey, no, you’re a liar” — that is harassment, and we actually will take that down. But overall, let’s take this whole closer to home…

Swisher: Okay.

Zuckerberg: I’m Jewish, and there’s a set of people who deny that the Holocaust happened.

Swisher: Yes, there’s a lot.

Zuckerberg: I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think —

Swisher: In the case of the Holocaust deniers, they might be, but go ahead.

Zuckerberg: It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.” [Update: Mark has clarified these remarks here: “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”]

What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed. I think we, actually, to the contrary —

Swisher: So you move them down? Versus, in Myanmar, where you remove it?

Zuckerberg: Yes.

Also:

❤️ is eclipsed by 😡. Facebook users are increasingly using the “angry” reaction in response to legislators’ Facebook posts, Pew finds.

Legislators’ Facebook audiences became much more likely to react to posts with Facebook’s “angry” button in the wake of the 2016 election. Prior to the election (but after the “angry” feature was released), just 1 percent of all reactions to posts by Democrats were angry. After the election, that share increased to 5 percent, on average. Among Republicans, the share of angry reactions increased from 2 percent before the election to 6 percent after. While “likes” remain the most common reaction, “angry” was the most frequently used of the six alternatives (such as “haha,” “wow,” and “love”). This has not always been the case. Prior to Trump’s inauguration, the “love” reaction was the most commonly used alternative to “likes,” but it has since been largely eclipsed by “angry.” The use of angry reactions to congressional Facebook posts rose throughout 2017, reaching its highest observed rates at the end of the year, comprising 9 percent of all reactions to the average Democrat’s posts in December 2017, and 13 percent of the average Republican’s.

Angry reactions were especially likely to ensue when posts expressed political opposition. Posts that expressed opposition to Trump received an estimated five times as many angry reactions as posts that did not express support or opposition toward any figure or group. When Democrats expressed opposition to Republicans, they earned six times as many angry reactions, on average. Because the emotional reactions were not available across the entire timeframe, this analysis is based upon posts created between Feb. 23, 2016 (the day before the reactions were released) and Dec. 31, 2017.

NewsWhip previously looked at reactions to hyper-partisan Facebook pages and found that “angry” was the most common reaction.

Apolitical Macedonian teens? Not so much. Sometimes it’s just “disinfobros” seeking AdSense cash, sometimes it’s more. A BuzzFeed joint investigation revealed that the political news industry of Veles, Macedonia, “was not started spontaneously by apolitical teens. Rather, it was launched by a well-known Macedonian media attorney, Trajche Arsov — who worked closely with two high-profile American partners for at least six months during a period that overlapped with Election Day.”

Gab. A group of researchers from Brazil’s Universidade Federal de Minas Gerais took a look at the “free speech” social network Gab, which was founded by a Silicon Valley Trump supporter in August 2016 and has almost no content moderation. It has turned into a haven for the alt-right and conspiracy theorists, and Apple and Google have both banned its app from their app stores. In addition to analyzing users’ race and gender (it’s mostly white men) and how far-right they are (61.1 percent of people listed on the Anti-Defamation League’s extremist list have Gab accounts), the researchers looked at how news is shared on the platform and which sources it comes from.

And here’s another research paper on Gab from earlier this year, if you’re interested.

http://www.niemanlab.org/2018/07/facebook-will-take-down-posts-that-could-cause-real-physical-harm-but-holocaust-denials-and-pizzagate-remain-okay/