
[English Language Learning] Driverless Vehicles of the Future

Date: 2016-10-18 04:57 | Source: Internet | Contributor: yajing

 This is Weekend Edition from NPR News. I am Scott Simon.

Imagine yourself in the future for a moment riding in a driverless car. You see 10 pedestrians1 stroll into the street just a few yards ahead of you. The car's going too fast to brake and miss them, so would you steer2 your car to try to miss them and possibly injure yourself? But if it's a driverless car, would you even get to make that choice? We're going to talk now to somebody who studies some of the ethical3 questions that are raised by autonomous4 vehicles. Patrick Lin, associate professor of philosophy at Cal Poly in San Luis Obispo, Calif., thanks very much for being with us.
PATRICK LIN: You're welcome. Glad to be here, Scott.
SIMON: There was a survey put up by MIT that asked questions along these lines, right?
LIN: Right, right.
SIMON: What did you notice in the survey when you looked at it?
LIN: Well, you know, so that's not the first survey done on this topic. There have been other surveys. And they had similar results, which is that people are split on the idea of how a driverless or autonomous car should behave. I think the one thing that stood out to me is that there's going to be a lot more work needed in this field here. One problem with surveys is that what people say in surveys isn't necessarily how they would actually choose in real life.
SIMON: Yeah.
LIN: They might not always know what it is they want.
SIMON: Yeah, I mean, 'cause it does seem to me just anecdotally that probably not a month goes by we don't read about some traffic accident where somebody said, you know, I just couldn't stop. They pulled into the lane, they walked across the street. And I must say, as a rule, society doesn't blame them for making an unethical choice to save their own life, even if the crash results in killing5 others.
LIN: Right. If it's a human-driven car, what you have there is just an accident. It's a reflex. Maybe you have bad reflexes. But we understand that that's just a reflex, it's not premeditated. But when you're talking about how we ought to program a robot car, now you're talking about pre-scripting the accident, right? So this is a difference between an accidental accident and a deliberate accident. So there's a big difference there legally and ethically6.
SIMON: Would somebody get into a driverless car if they thought the algorithms of that car would essentially7 say, I'm not going to let you run into that school bus and kill people, you're going to die instead?
LIN: I think they would. So, for instance, anytime you get in a driven car by someone else, you're at risk. Studies have shown that if you're a human driver and you're about to be in a crash, you're going to reflexively turn away from the crash. This usually means that you expose your passengers to that accident. But that doesn't paralyze us when we step into a car.
SIMON: At the same time, though, Professor, I mean, I think it's going to be hard for people to think of an algorithm making that decision for us.
LIN: That's right. I mean, it's a weird8 thing to think about. But that's exactly what we're doing when we're creating robots and artificial intelligence. They're taking over human roles, from being our chauffeur9 to our stock market trader to our airline pilot to whatever. We've got to do some soul-searching. And then we have to ask, well, should robots and AI mimic10 humans - do what we do - or should they even do something differently? So robot ethics11 and human ethics could be two different things. But when we talk about programming cars or making any kind of robots, it's a good exercise in how humans behave and how we ought to behave.
SIMON: Patrick Lin is an associate professor of philosophy at Cal Poly in San Luis Obispo, Calif. Thanks so much for being with us.
LIN: You're welcome. Thanks for having me.
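A note on the "pre-scripting" Lin describes: the sketch below is a purely hypothetical illustration, in Python, of what a hard-coded crash-response rule could look like. The scenario, the function name, and the weights are invented for this page and do not describe any real vehicle's software.

```python
# Hypothetical illustration only: a "pre-scripted" crash response, decided at
# programming time rather than by a driver's reflex in the moment.

def choose_maneuver(occupant_risk: float, pedestrian_risk: float) -> str:
    """Return 'swerve' or 'stay' given estimated injury risks in [0.0, 1.0].

    The relative weighting of occupants versus pedestrians is exactly the
    ethical choice discussed in the interview; the 1.0 / 1.0 weights here are
    arbitrary placeholders, not a recommendation.
    """
    OCCUPANT_WEIGHT = 1.0    # how much the rule "values" the car's passengers
    PEDESTRIAN_WEIGHT = 1.0  # how much it "values" people outside the car

    # Swerving trades pedestrian risk for occupant risk; staying does the opposite.
    harm_if_swerve = OCCUPANT_WEIGHT * occupant_risk
    harm_if_stay = PEDESTRIAN_WEIGHT * pedestrian_risk
    return "swerve" if harm_if_swerve < harm_if_stay else "stay"


if __name__ == "__main__":
    # Example: swerving is estimated to injure the passenger (risk 0.4), while
    # staying on course is estimated to hit pedestrians (risk 0.9).
    print(choose_maneuver(occupant_risk=0.4, pedestrian_risk=0.9))  # -> "swerve"
```

The arithmetic is deliberately trivial; the point is that the trade-off between occupants and pedestrians is fixed in advance by whoever chose the weights, which is what makes the resulting crash "deliberate" rather than reflexive.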

Vocabulary

1 pedestrians
n. people travelling on foot (plural of pedestrian)
Example sentences:
  • Several pedestrians had come to grief on the icy pavement.
  • Pedestrians keep to the sidewalk [footpath]!
2 steer
vt. to steer, to guide the course of; to direct; vi. to steer
Example sentences:
  • If you push the car, I'll steer it.
  • It's no use trying to steer the boy into a course of action that suits you.
3 ethical
adj. ethical, moral; morally acceptable
Example sentences:
  • It is necessary to get the youth to have a high ethical concept.
  • It was a debate which aroused fervent ethical arguments.
4 autonomous
adj. self-governing; independent
Example sentences:
  • They proudly declared themselves part of a new autonomous province.
  • This is a matter that comes within the jurisdiction of the autonomous region.
5 killing
n. a large profit; a sudden financial windfall
Example sentences:
  • Investors are set to make a killing from the sell-off.
  • Last week my brother made a killing on Wall Street.
6 ethically
adv. in an ethical or moral sense
Example sentences:
  • Ethically, we have nothing to be ashamed about.
  • Describe the appropriate action to take in an ethically ambiguous situation.
7 essentially
adv. in essence, fundamentally, basically
Example sentences:
  • Really great men are essentially modest.
  • She is an essentially selfish person.
8 weird
adj. strange, odd; uncanny, mysterious and frightening
Example sentences:
  • From his weird behaviour, he seems a bit of an oddity.
  • His weird clothes really gas me.
9 chauffeur
n. a driver employed by a private individual or company; v. to drive (someone) around
Example sentences:
  • The chauffeur handed the old lady from the car.
  • She went out herself and spoke to the chauffeur.
10 mimic
v. to imitate, to mock; n. a person who imitates others' speech or behaviour
Example sentences:
  • A parrot can mimic a person's voice.
  • He used to mimic speech peculiarities of another.
11 ethics
n. ethics (the field of study); moral principles, moral standards
Example sentences:
  • The ethics of his profession don't permit him to do that.
  • Personal ethics and professional ethics sometimes conflict.