
PBS NewsHour: Will Facebook Be More Cautious About Social Issues?

Date: 2017-05-05 09:18  Source: Internet  Contributed by: mapleleaf

   JUDY WOODRUFF: Now questions about the ever-growing scope of Facebook's empire and social network, and whether the company is embracing enough responsibility for its reach.

  Today, Facebook CEO Mark Zuckerberg announced that they will add 3,000 more people to monitor live video, after problems with violence and hate speech.
  Hari Sreenivasan takes it from there.
  HARI SREENIVASAN: The decision comes after a series of cases in which people shared live video of murder and suicide. Recent examples include a murder in Cleveland last month that was posted live on Facebook, and a man in Thailand who posted video of himself murdering his 11-month-old daughter. It wasn't removed for 24 hours.
  Once Facebook makes these announced hires, there will be 7,500 employees to monitor thousands of hours of videos uploaded constantly.
  Farhad Manjoo is a tech columnist for The New York Times who has been closely covering Facebook. He joins me now to talk about this issue and other questions facing the company.
  Farhad, so let's first — today's news, how significant is this?
  FARHAD MANJOO, The New York Times: I think it's significant.
  I mean, it's a significant sort of step up in their ability to monitor these videos, and it should help. The way it works is, there's lots of videos going on, on Facebook all the time. If somebody sees something that looks bad, that looks like it may be criminal or some other, you know, terrible thing, they flag it, and the flagged videos go to these reviewers.
  And just having more of these reviewers should make the whole process faster. So, it should help. I mean, I think the question is why it took them a year to do this.
  HARI SREENIVASAN: So, put the scale in perspective here. If they have 1.2 billion active users a month or whatever it is that they talk about, even at one-half of 1 percent, if they wanted to harm themselves and put this on Facebook, that's six million people.
  How do these 7,000 stop that?
  FARHAD MANJOO: Yes.
  I mean, the way that tech companies generally work is, they manage scale by, you know, leveraging computers, basically. There's a lot of kind of algorithmic stuff that goes into making sure — they try to, you know, cut down the pool that the human reviewers have to look at.
  And there is some experience in this in the Valley. I mean, YouTube has had to deal with this sort of thing for years. And the way they have really come around to doing it is a similar process. Like, they have thousands and thousands of hours of videos uploaded essentially every minute, and they count on kind of the viewers to flag anything that's terrible, and then it goes to these human reviewers.
  So, it's a process that can work. The difficulty in Facebook's case is, it's live video, so they have to get it down much more quickly. And so, you know, it's possible that they may need more people or some other, you know, algorithmic solution, but I think this is a — you know, it should be an improvement over what they have now.
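The flag-and-review pipeline Manjoo describes — viewers flag videos, an algorithmic layer narrows the pool, and human reviewers work through what remains, with live video needing the fastest turnaround — can be sketched as a simple priority queue. The scoring heuristic, class names, and weights below are purely illustrative assumptions for this sketch, not Facebook's actual system.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedVideo:
    priority: float
    video_id: str = field(compare=False)
    is_live: bool = field(compare=False)
    flag_count: int = field(compare=False)

def triage_priority(is_live: bool, flag_count: int) -> float:
    # Illustrative heuristic: heavily flagged videos rank higher,
    # and live video jumps the queue because it must come down
    # much more quickly (lower score = reviewed sooner).
    score = -float(flag_count)
    if is_live:
        score -= 100.0  # assumed live-video boost, not a real constant
    return score

class ReviewQueue:
    """Min-heap of flagged videos awaiting human review."""

    def __init__(self):
        self._heap = []

    def flag(self, video_id: str, is_live: bool, flag_count: int) -> None:
        item = FlaggedVideo(triage_priority(is_live, flag_count),
                            video_id, is_live, flag_count)
        heapq.heappush(self._heap, item)

    def next_for_review(self) -> str:
        # Pop the most urgent video for the next free human reviewer.
        return heapq.heappop(self._heap).video_id

q = ReviewQueue()
q.flag("v1", is_live=False, flag_count=3)
q.flag("v2", is_live=True, flag_count=1)
q.flag("v3", is_live=False, flag_count=50)
print(q.next_for_review())  # v2 — live content is reviewed first
```

The design point the sketch captures is the one in the interview: human reviewers are the scarce resource, so the algorithmic layer's job is ordering the queue, and adding more reviewers simply drains it faster.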
  HARI SREENIVASAN: You mentioned it took them so long to get to this point. Why?
  FARHAD MANJOO: I think this is a real sort of cultural blind spot for Facebook in general.
  Oftentimes, they go into these projects — you know, Facebook Live is an example, but many of the other things they have done — with, you know, tremendous optimism.
  As a company, and Mark Zuckerberg as a technologist, he has tremendous optimism in technology. And they often fail to see or appreciate the possible kind of downsides of their technology and the ways that it could be misused.
  I mean, I think that what we have seen with live — with the live video is a small example. The way that Facebook has sort of affected elections, the way that — you know, the fake news problem we saw in the U.S. election, the way it's been used as a tool for propaganda in various other parts of the world, you know, those are huge examples of, you know, what looked like a fairly simple solution technologically, like we're going to get everyone connected and have them share the news.
  You know, it brings some real deep, like, social questions that they are only lately beginning to confront in a serious way.
  HARI SREENIVASAN: So, this combination of, I guess, an optimism in the technology and design and a faith that users are ultimately good and will make the right choice, I mean, is that the sort of core cultural concern or problem that keeps the company making these sorts of decisions?
  FARHAD MANJOO: That's part of it. And the other thing to remember is, you know, they're a technology company, and speed is of the utmost concern for them.
  One of the things that was happening in the tech industry last year is that a whole bunch of other companies were rolling out live video systems, and Facebook didn't want to be left behind. And so they created their live video system.
  And it became, you know, the biggest, because they're the biggest social network. But with that sort of size comes, you know, an increased opportunity for misuse and more power, right. Like, a video on Facebook that can be seen by, you know, potentially many more people has a lot more potential for being misused.
  And I think they — it's not right to say that they don't consider those things, but it seems like it's on a back burner for them. And I think what's happening at Facebook is a shift toward thinking about these issues at an earlier stage.
  And we have really seen this more recently in their work with the news industry. I mean, after the — after what happened — after what happened in the election and the kind of controversy about fake news, they have started to — they have rolled out a bunch of initiatives to do stuff to improve how news is seen on Facebook. They have added fact-checkers and other things.
  So, I think their attitude is changing, but it may be changing too slowly, compared to how quick the technology they're rolling out is changing.
  HARI SREENIVASAN: All right, Farhad Manjoo of The New York Times, thanks so much.
  FARHAD MANJOO: All right, great. Thanks so much.
