
NPR (National Public Radio): Tristan Harris: Do Our Devices Control More Than We Think?


GUY RAZ, HOST:

It's the TED Radio Hour from NPR. I'm Guy Raz. So these days, our attention is pretty much up for grabs.

(SOUNDBITE OF TYPING ON SMARTPHONE, SNAPCHAT ALERTS)

RAZ: And the stuff that's trying the hardest to get it...

(SOUNDBITE OF TYPING ON SMARTPHONE, SNAPCHAT ALERTS)

RAZ: ...Is usually right at our fingertips. And while we think we're the ones who get to decide how to spend our time, is that really the truth?

TRISTAN HARRIS: You know, what - what does it mean to manipulate people, to persuade or pull on - on their instincts?

RAZ: This is Tristan Harris, and Tristan used to be part of the tech industry. He was a design ethicist at Google.

HARRIS: And I was there for about a year working first on, you know, sort of assistant-like features in Inbox, which became sort of the next version of Gmail.

RAZ: But then after basically spending his whole career in Silicon Valley, something in Tristan changed.

HARRIS: As much as we wanted to make life better for people and explain things to people, at the end of the day, it was about capturing attention. You know, how would we hook people into spending more time on the screen or driving more page views or getting people to click on ads? And I didn't like that. And so you started to see technology not as a vehicle for improving people's lives, but really as a means to persuade people to do things.

And so persuasion became the dominant way that I came to see everything in the world. You know, you get disenchanted when you see how much of all this time that we spend in front of screens and clicking on these buttons and inviting people to connect and accepting invitations and, you know, liking back articles and getting back to people, and you just say, what is this adding up to? You know, what - what - how is this actually better for life?

RAZ: On the show today, ideas about manipulation, about the things that can make us believe and act in certain ways, sometimes without us even knowing, from our addiction to smartphones to the rise of fake news, even the ways we remember and experience our lives. And for Tristan Harris, manipulation is about how we use technology and how some tech companies use us.

HARRIS: You know, it's not like a smartphone is a device that you access. You don't access your smartphone. You live by your smartphone. People wake up in the morning and it's, like, 80 percent of people, the first thing they do, they turn out of bed and they - they pull their smartphone out, right? And they see what's on the screen. And the second, you know, that that happens, it's like they're programming their whole mind and their day to think about things in a certain way.

RAZ: If you are - if you use a smartphone or you are - or you use the internet today, are you being manipulated?

HARRIS: I think by definition to look at the screen, to use the internet, thoughts and choices enter the flow of our experience that were not authored by us. And more and more of our thoughts during the day come from the things that we have been thinking about and seeing from the screens that we're constantly checking. You know, you don't have to be using the screen every moment for it to be guiding your thoughts, right? And so invisibly, the entire - you know, it's like the David Foster Wallace "This Is Water" speech, you know? We're swimming in this water. We're the fish. And you ask the other fish, you know, what do you - how do you like this water we're swimming in? And he says, what's water?

You know, we're swimming in this digital environment that is created, you know, and handcrafted by a handful of companies with deliberate goals to capture human attention. And now you have two billion people, which is like, what is it, 25 percent of the world's population and 90 percent of the world's GDP, whose thoughts are shaped by this handful of 20 to 40, 35-year-old mostly engineers and designers in California. Neil Postman, you know, one of my favorite deep thinkers about how technology affects society, said, what is the problem for which this technology is the solution?

RAZ: Tristan Harris laid out his case on the TED stage.

(SOUNDBITE OF TED TALK)

HARRIS: The internet is not evolving at random. The reason it feels like it's sucking us in the way it is is because of this race for attention. We know where this is going. Technology is not neutral. And it becomes this race to the bottom of the brainstem of who can go lower to get it. Let me give you an example of Snapchat. If you didn't know, Snapchat is the No. 1 way that teenagers in the United States communicate. So if you're like me and you use text messages to communicate, Snapchat is that for teenagers. And there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks which shows the number of days in a row that two people have communicated with each other.

In other words, what they just did is they gave two people something they don't want to lose. Because if you're a teenager and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds. This isn't theoretical. When kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going even when they can't do it. And they have, like, 30 of these things, and so they have to get through photo - taking photos of just pictures or walls or feelings just to get through their day. So it's not even like they're having real conversations.

We have a temptation to think about this as, oh, they're just using, you know, Snapchat the way we used to gossip on the telephone. It's probably OK. Well, what this misses is that in the 1970s when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

(SOUNDBITE OF MUSIC)

RAZ: I mean, it's like it's - (laughter). It's a - it's a little bit like - and I don't want to take this analogy too far - of how tobacco companies have known for a long time how to grow, you know, tobacco with either no or low nicotine. But - but they didn't. They actually increased the amount of nicotine.

HARRIS: That's right. And this is even worse than that. And I'm not trying to be alarmist, but the reason why it's different than just tobacco is that it's actually social. You know, there's always this narrative - you know, we always worry about new technologies. People worried about newspapers on the subway, we're not talking to each other. People worried about TV, we're just going to amuse ourselves to death, which is, honestly, partially true. You know, human beings are resilient. But what this misses is there's three new elements that get missed in this conversation.

The first is that we've never had a medium that was so totalizing. Two billion people checking 150 times a day from the moment that they get up in the morning to every bathroom break, to every coffee line, to every, you know, going to sleep at night. So it's a totalizing kind of environment. You know, we go from using a product to being jacked into the matrix. Except the matrix is this sort of soft, invisible matrix that's kind of created by a few different companies, Apple, Google and Facebook.

The second one is that it's social, right? TV, radio didn't - didn't say these are where your friends are hanging out and where you've been left out. We never made it easier to show what you should be jealous of in other people's lives. And the last thing is that it's intelligent and personalized, and it gets better every day. So, you know, TV or radio didn't watch what you looked at and then try to dynamically change the course of the television show whereas Facebook is a monolithic AI that basically says, every single time you click, you're teaching us what will keep you here.

RAZ: What - I mean, what you're suggesting is that there are a handful of companies, maybe three, four or five, that have more manipulative power over - over the human species than anything or anyone has had at any time in our history.

HARRIS: Absolutely. And this is the thing that people miss. It's, like, why are we so obsessed with governments? I mean, two billion people use Facebook every day. That's more than the number of followers of Christianity. One-point-five billion people use YouTube every day. Every day. And that's more than the number of followers of Islam. So these things have an enormous amount of influence, more than any other government, over people's daily thoughts and beliefs. So, you know, for everything we want governments to be held accountable to, why in the world would we not have something that holds technology companies accountable to human values as opposed to just capturing attention, which is the only thing that they answer to, is this stock market in capturing attention?

RAZ: Well, I mean, that's a thing. I mean, you're talking about this incredible ability to manipulate our thoughts and our choices. But I think in response these companies would say, well, these are choices people make. We live in a - in a capitalistic society. I mean, we are businesses, and of course we're vying for people's attention because that translates into - into money. This is an old story. This isn't new. Businesses have been doing this since time immemorial.

HARRIS: Yeah. Well, this is the classic defense that the companies make, is, if you don't like the product just use a different product. But this is so dishonest. (Laughter). Because if you're a teenager and all of your friends, all the conversations they have, all the parties that they go to, if it's through Snapchat, are you going to choose not to use Snapchat? You know, the - I've studied cults in my research on persuasion, and I've actually gone into cults. And one of the things that cults do is they pull on you by pulling on all your friends. If they can - if they can make it so that all of your friends are just the people who are in the cult, you can't leave the cult. And the thing about social media is that all of your friends are in the cult.

(SOUNDBITE OF TED TALK)

HARRIS: So I'm here today because the costs are so obvious. I don't know a more urgent problem than this because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want. It's changing the way that we have our conversations, it's changing our democracy and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone because a billion people have one of these in their pocket.

RAZ: I mean, every time we think that we've - we've sort of cracked the code as humans, like, you know, we've created this thing where you can always be connected to everyone and everything. But every time we seem to introduce something like that, someone figures out how to game it.

HARRIS: Yeah. The history of the tech industry is filled with positive intentions and good ideas that are incomplete and game-able, right? And they're incomplete because they don't capture all the other externalities. The more these attention companies profit, they put - they profit while pushing all of the social and inner externalities downstream because it's polluting our inner world and it's polluting our social cohesion and our ability to actually understand each other because we have to agree on reality.

If we can't agree on reality then we can't solve some of the most existential threats that face us. And so this is why we have to change the system. This is not, you know, a fun, philosophical conversation. I'm here because this is literally an existential threat to our future and - and to our present. And elections around the world are still being shaped, an entire polarization of societies is being shaped today by this algorithm. And the need for accountability is enormous.

RAZ: Tristan Harris. He's a former design ethicist at Google, and he now works at the nonprofit Time Well Spent, which is trying to reform the tech industry. You can see Tristan's full talk at ted.com. On the show today, ideas about manipulation. I'm Guy Raz, and you're listening to the TED Radio Hour from NPR.

(SOUNDBITE OF MUSIC)

