Facebook will take down posts that could cause “real physical harm,” but Holocaust denials (and Pizzagate?) remain okay
Oh look, Facebook is actually taking something down. Facebook would rather downrank fake news and conspiracy theories than remove them from the platform altogether, and the company has gotten slammed, especially over the past week, for this try-to-have-it-both-ways policy. This week, Facebook announced that “there are certain forms of misinformation that have contributed to physical harm” that it actually will be taking down — or, well, here’s the slightly more wishy-washy statement it gave to CNBC:

“Reducing the distribution of misinformation — rather than removing it outright — strikes the right balance between free expression and a safe and authentic community. There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.”

The change seems linked in particular to activity in countries like Myanmar, India, and Sri Lanka: “Although the policy change is upcoming, the company used these principles to remove posts in Sri Lanka alleging Muslims were poisoning food given or sold to Buddhists.”
The Guardian’s Olivia Solon attended the hearing where Facebook announced the change, and has some good questions — one of which is about the statute of limitations for this kind of thing. The Pizzagate hoax, for instance, led to an actual shooting months after that hoax began.
The focus is on Sri Lanka at first, but I’m v curious to know the threshold for "real-world harm" and whether something like Pizzagate would qualify
— Olivia Solon (@oliviasolon) July 18, 2018
Separately, here are some things that Facebook CEO Mark Zuckerberg told Recode’s Kara Swisher on Wednesday, as part of a lengthy podcast interview for Recode Decode:
Kara Swisher: InfoWars. I want you to make a case for taking InfoWars off. If you were on the other side of it.
Mark Zuckerberg: I think if you were trying to argue on the side of basically the core principle of keeping the community safe, I think you would try to argue that the content is somehow attacking people or is creating an unsafe environment. Now, let me give you —
Swisher: Is false.
Zuckerberg: Let me give you an example of where we would take it down. In Myanmar or Sri Lanka, where there’s a history of sectarian violence, similar to the tradition in the U.S. where you can’t go into a movie theater and yell “Fire!” because that creates an imminent harm. There are definitely examples of people sharing images that are taken out of context that are false, that are specifically used to induce people to violence in those areas where there’s —
Swisher: And violence has resulted.
Zuckerberg: Yes. We are moving towards the policy of misinformation that is aimed at or going to induce violence, we are going to take down because that’s basically…The principles that we have on what we remove from the service are, if it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform. There’s a lot of categories of that that we can get into, but then there’s broad debate.
Swisher: Okay. “Sandy Hook didn’t happen” is not a debate. It is false. You can’t just take that down?
Zuckerberg: I agree that it is false.
Swisher: Okay.
Zuckerberg: I also think that going to someone who is a victim of Sandy Hook and telling them, “Hey, no, you’re a liar” — that is harassment, and we actually will take that down. But overall, let’s take this whole closer to home…
Swisher: Okay.
Zuckerberg: I’m Jewish, and there’s a set of people who deny that the Holocaust happened.
Swisher: Yes, there’s a lot.
Zuckerberg: I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think —
Swisher: In the case of the Holocaust deniers, they might be, but go ahead.
Zuckerberg: It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, We’re going to take someone off the platform if they get things wrong, even multiple times. [Update: Mark has clarified these remarks: “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”]
What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed. I think we, actually, to the contrary —
Swisher: So you move them down? Versus, in Myanmar, where you remove it?
Zuckerberg: Yes.
This @karaswisher interview with Zuckerberg is great, a zillion times better than the Congressional hearings. It reveals his tortured efforts to argue that people who *intentionally* post clear falsehoods are somehow just mistaken & should remain. https://t.co/QXg04SfxnP
— Walt Mossberg (@waltmossberg) July 18, 2018
Among the revealing pieces of @karaswisher‘s interview with Zuckerberg: he seems to think there’s a meaningful difference between domestic misinformation (wrongness protected by free speech) and misinformation abroad (which causes "real harm"). https://t.co/KZXxWTIDb2 pic.twitter.com/bYQQRTp3V0
— Kevin Roose (@kevinroose) July 18, 2018
A few words about holocaust denialism.
I’m a moderator on one of the non-toxic parts of Reddit (r/AskHistorians). We’ve dealt with holocaust deniers for a while and find that the best policy is to shut that shit down.
Here’s a short thread on what I’ve seen/dealt with. https://t.co/UR6wUPqmvy
— Andrés Pertierra (@ASPertierra) July 18, 2018
Also:
When asked about Trump’s comment that there were "fine people on both sides" in Charlottesville, Sec. Nielsen seemingly doubled down. "it’s not that one side was right and one side was wrong" #AspenSecurity
— Tess Owen (@misstessowen) July 19, 2018
"..anybody that is advocating violence, we need to work to mitigate"
— Tess Owen (@misstessowen) July 19, 2018
❤️ is eclipsed by 😠 Facebook users are increasingly using the “angry” reaction in response to legislators’ Facebook posts, Pew finds.
Legislators’ Facebook audiences became much more likely to react to posts with Facebook’s “angry” button in the wake of the 2016 election. Prior to the election (but after the “angry” feature was released), just 1 percent of all reactions to posts by Democrats were angry. After the election, that share increased to 5 percent, on average. Among Republicans, the share of angry reactions increased from 2 percent before the election to 6 percent after.

While “likes” remain the most common reaction, “angry” was the most frequently used of the six alternatives (such as “haha,” “wow,” and “love”). This has not always been the case. Prior to Trump’s inauguration, the “love” reaction was the most commonly used alternative to “likes,” but it has since been largely eclipsed by “angry.” The use of angry reactions to congressional Facebook posts rose throughout 2017, reaching its highest observed rates at the end of the year, comprising 9 percent of all reactions to the average Democrat’s posts in December 2017, and 13 percent of the average Republican’s.
Angry reactions were especially likely to ensue when posts expressed political opposition. Posts that expressed opposition to Trump received an estimated five times as many angry reactions as posts that did not express support or opposition toward any figure or group. When Democrats expressed opposition to Republicans, they earned six times as many angry reactions, on average. Because the emotional reactions were not available across the entire timeframe, this analysis is based upon posts created between Feb. 23, 2016 (the day before the reactions were released) and Dec. 31, 2017.
NewsWhip previously looked at reactions to hyper-partisan Facebook pages and found that “angry” was the most common reaction.
Apolitical Macedonian teens? Not so much. Sometimes it’s just “disinfobros” seeking AdSense cash, sometimes it’s more. A BuzzFeed joint investigation revealed that the political news industry of Veles, Macedonia “was not started spontaneously by apolitical teens. Rather, it was launched by a well-known Macedonian media attorney, Trajche Arsov — who worked closely with two high-profile American partners for at least six months during a period that overlapped with Election Day.”
Our report about fake news from Macedonia, from Nov 3: https://t.co/Sdl6m6o0Oz
Wash. Post profile of two US guys — Wade and Goldman — running a hyperpartisan conservative site, Nov. 20: https://t.co/zXRhnLBjN0
Turns out these two hugely viral stories were completely connected.
— Craig Silverman (@CraigSilverman) July 18, 2018
Gab. A group of researchers from Brazil’s Universidade Federal de Minas Gerais took a look at the “free speech” social network Gab, which was founded by a Silicon Valley Trump supporter in June 2016 and has almost no content moderation. It’s turned into a haven for the alt-right and conspiracy theorists, and Apple and Google have both banned its app from their app stores. In addition to analyzing users’ race and gender (it’s mostly white men) and how far-right they are (61.1 percent of people listed on the Anti-Defamation League’s extremist list have Gab accounts), the researchers looked at how news is shared on the platform and which sources it comes from.
And here’s another research paper on Gab from earlier this year, if you’re interested.