NPR: Facebook Updates Community Standards, Expands Appeals Process


STEVE INSKEEP, HOST:

Facebook says it will be more open about the posts it takes down. The company tells NPR that today it is publishing internal details of its community standards. That's the term for what's allowed on Facebook and what is not. Monika Bickert is a Facebook vice president.

MONIKA BICKERT: So we've always had a set of community standards that the public can see that explain, for instance, no harassment, no bullying, no terror propaganda. But now we're actually explaining how we define those terms for our review teams and how we enforce those policies.

INSKEEP: She says users want more openness, which is an understatement. The company is under unprecedented pressure. It's been roiled by two years of questions - which news did it promote during the last election? How widely did it share users' data? - and more. Now it is revealing definitions used by internal monitors who check up on complaints about posts around the world, like, what exactly constitutes a genuine death threat? If it names a person, location or weapon, that should come down. Or what exactly amounts to hate speech?

BICKERT: Where we have drawn the line is that we will allow attacks or negative commentary about institutions or countries or religions, but we don't allow attacks against people. So if somebody is criticizing or attacking all members of a religion, that's where we would draw the line.

INSKEEP: I wonder if one of the gray areas there might be someone who criticizes Islam but in an extreme way that somebody might argue is inciting people against Muslims.

BICKERT: We do try to allow as much speech as possible about institutions, religions, countries, and we know sometimes that might make people uncomfortable. That's one of the reasons we give people a lot of choice and control over what they see on Facebook. You can unfollow pages, you can unfollow people, and you can block people that you don't want to communicate with.

INSKEEP: How are you thinking about the environment as the 2018 election approaches and, of course, there will once again be lots of political speech on Facebook?

BICKERT: Well, we know there are a lot of very serious issues, and it's important to get them right. We're focused on combating fake news. We're also focused on providing increased transparency into political advertisements and pages that have political content. And we're also investing a lot in our technical tools that help keep inauthentic accounts off the site.

INSKEEP: Are you already going after fake accounts in that larger, more specific way in the United States here in 2018?

BICKERT: Yes. The tools that we have developed to more effectively catch fake accounts - they've improved a lot, and we are using them globally. We now are able to stop more than a million fake accounts at the time of creation every day.

INSKEEP: The publication of its internal standards is another signal that Facebook is having to acknowledge that it is effectively a publisher. It wants to define itself as a technology company, just a platform for other people's speech, but the founder, Mark Zuckerberg, now accepts some responsibility for what is posted. Facebook was embarrassed when a famous old Vietnam War photo was mistakenly censored and then put back up. It's also had to tussle with authoritarian governments like Russia and Turkey that demand some posts be taken down. Just last weekend, Sri Lankan officials complained to The New York Times that Facebook was not responsive enough to complaints of hate speech. Monika Bickert says that when pressured by governments, the company at least tries to keep up speech that meets its standards.

What does this announcement suggest about the power your company has?

BICKERT: I think what it suggests is that we really want to respond to what the community wants. What we're hearing is that they want more clarity, and they want to know how we enforce these rules. That's why we're doing this. And we're actually hopeful that this is going to spark a conversation.

INSKEEP: But this is also a reminder, you've got this enormous fire hose of speech, maybe the world's largest fire hose of speech, and you can turn that fire hose on or off. It's your choice.

BICKERT: I want to be very clear that when we make these policies, we don't do it in a vacuum. This is not my team sitting in a room in California saying, these will be the policies. Every time we adjust a policy, we have external input from experts around the world.

INSKEEP: The company that claims some 2 billion users around the world insists it is straining to work within the laws of every country while still allowing as much speech as it can.
