NPR (National Public Radio) - Tristan Harris: Do Our Devices Control More Than We Think?


GUY RAZ, HOST:

It's the TED Radio Hour from NPR. I'm Guy Raz. So these days, our attention is pretty much up for grabs.

(SOUNDBITE OF TYPING ON SMARTPHONE, SNAPCHAT ALERTS)

RAZ: And the stuff that's trying the hardest to get it...

(SOUNDBITE OF TYPING ON SMARTPHONE, SNAPCHAT ALERTS)

RAZ: ...Is usually right at our fingertips. And while we think we're the ones who get to decide how to spend our time, is that really the truth?

TRISTAN HARRIS: You know, what - what does it mean to manipulate people, to persuade or pull on - on their instincts?

RAZ: This is Tristan Harris, and Tristan used to be part of the tech industry. He was a design ethicist at Google.

HARRIS: And I was there for about a year working first on, you know, sort of assistant-like features in Inbox, which became sort of the next version of Gmail.

RAZ: But then after basically spending his whole career in Silicon Valley, something in Tristan changed.

HARRIS: As much as we wanted to make life better for people and explain things to people, at the end of the day, it was about capturing attention. You know, how would we hook people into spending more time on the screen or driving more page views or getting people to click on ads? And I didn't like that. And so you started to see technology not as a vehicle for improving people's lives, but really as a means to persuade people to do things.

And so persuasion became the dominant way that I came to see everything in the world. You know, you get disenchanted when you see how much of all this time that we spend in front of screens and clicking on these buttons and inviting people to connect and accepting invitations and, you know, liking back articles and getting back to people, and you just say, what is this adding up to? You know, what - what - how is this actually better for life?

RAZ: On the show today, ideas about manipulation, about the things that can make us believe and act in certain ways, sometimes without us even knowing, from our addiction to smartphones to the rise of fake news, even the ways we remember and experience our lives. And for Tristan Harris, manipulation is about how we use technology and how some tech companies use us.

HARRIS: You know, it's not like a smartphone is a device that you access. You don't access your smartphone. You live by your smartphone. People wake up in the morning and it's, like, 80 percent of people, the first thing they do, they turn out of bed and they - they pull their smartphone out, right? And they see what's on the screen. And the second, you know, that that happens, it's like they're programming their whole mind and their day to think about things in a certain way.

RAZ: If you are - if you use a smartphone or you are - or you use the internet today, are you being manipulated?

HARRIS: I think by definition to look at the screen, to use the internet, thoughts and choices enter the flow of our experience that were not authored by us. And more and more of our thoughts during the day come from the things that we have been thinking about and seeing from the screens that we're constantly checking. You know, you don't have to be using the screen every moment for it to be guiding your thoughts, right? And so invisibly, the entire - you know, it's like the David Foster Wallace "This Is Water" speech, you know? We're swimming in this water. We're the fish. And you ask the other fish, you know, what do you - how do you like this water we're swimming in? And he says, what's water?

You know, we're swimming in this digital environment that is created, you know, and handcrafted by a handful of companies with deliberate goals to capture human attention. And now you have two billion people, which is like, what is it, 25 percent of the world's population and 90 percent of the world's GDP, whose thoughts are shaped by this handful of mostly 20- to 35-year-old engineers and designers in California. Neil Postman, you know, one of my favorite deep thinkers about how technology affects society, said, what is the problem for which this technology is the solution?

RAZ: Tristan Harris laid out his case on the TED stage.

(SOUNDBITE OF TED TALK)

HARRIS: The internet is not evolving at random. The reason it feels like it's sucking us in the way it is is because of this race for attention. We know where this is going. Technology is not neutral. And it becomes this race to the bottom of the brainstem of who can go lower to get it. Let me give you an example of Snapchat. If you didn't know, Snapchat is the No. 1 way that teenagers in the United States communicate. So if you're like me and you use text messages to communicate, Snapchat is that for teenagers. And there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks which shows the number of days in a row that two people have communicated with each other.

In other words, what they just did is they gave two people something they don't want to lose. Because if you're a teenager and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds. This isn't theoretical. When kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going even when they can't do it. And they have, like, 30 of these things, and so they have to get through photo - taking photos of just pictures of walls or ceilings just to get through their day. So it's not even like they're having real conversations.

We have a temptation to think about this as, oh, they're just using, you know, Snapchat the way we used to gossip on the telephone. It's probably OK. Well, what this misses is that in the 1970s when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

(SOUNDBITE OF MUSIC)

RAZ: I mean, it's like it's - (laughter). It's a - it's a little bit like - and I don't want to take this analogy too far - of how tobacco companies have known for a long time how to grow, you know, tobacco with either no or low nicotine. But - but they didn't. They actually increased the amount of nicotine.

HARRIS: That's right. And this is even worse than that. And I'm not trying to be alarmist, but the reason why it's different than just tobacco is that it's actually social. You know, there's always this narrative - you know, we always worry about new technologies. People worried about newspapers on the subway, we're not talking to each other. People worried about TV, we're just going to amuse ourselves to death, which is, honestly, partially true. You know, human beings are resilient. But what this misses is there's three new elements that get missed in this conversation.

The first is that we've never had a medium that was so totalizing. Two billion people checking 150 times a day from the moment that they get up in the morning to every bathroom break, to every coffee line, to every, you know, going to sleep at night. So it's a totalizing kind of environment. You know, we go from using a product to being jacked into the matrix. Except the matrix is this sort of soft, invisible matrix that's kind of created by a few different companies, Apple, Google and Facebook.

The second one is that it's social, right? TV, radio didn't - didn't say these are where your friends are hanging out and where you've been left out. We never made it easier to show what you should be jealous of in other people's lives. And the last thing is that it's intelligent and personalized, and it gets better every day. So, you know, TV or radio didn't watch what you looked at and then try to dynamically change the course of the television show whereas Facebook is a monolithic AI that basically says, every single time you click, you're teaching us what will keep you here.

RAZ: What - I mean, what you're suggesting is that there are a handful of companies, maybe three, four or five, that have more manipulative power over - over the human species than anything or anyone has had at any time in our history.

HARRIS: Absolutely. And this is the thing that people miss. It's, like, why are we so obsessed with governments? I mean, two billion people use Facebook every day. That's more than the number of followers of Christianity. One-point-five billion people use YouTube every day. Every day. And that's more than the number of followers of Islam. So these things have an enormous amount of influence, more than any other government, over people's daily thoughts and beliefs. So, you know, for everything we want governments to be held accountable to, why in the world would we not have something that holds technology companies accountable to human values as opposed to just capturing attention, which is the only thing that they answer to, is this stock market in capturing attention?

RAZ: Well, I mean, that's a thing. I mean, you're talking about this incredible ability to manipulate our thoughts and our choices. But I think in response these companies would say, well, these are choices people make. We live in a - in a capitalistic society. I mean, we are businesses, and of course we're vying for people's attention because that translates into - into money. This is an old story. This isn't new. Businesses have been doing this since time immemorial.

HARRIS: Yeah. Well, this is the classic defense that the companies make, is, if you don't like the product just use a different product. But this is so dishonest. (Laughter). Because if you're a teenager and all of your friends, all the conversations they have, all the parties that they go to, if it's through Snapchat, are you going to choose not to use Snapchat? You know, the - I've studied cults in my research on persuasion, and I've actually gone into cults. And one of the things that cults do is they pull on you by pulling on all your friends. If they can - if they can make it so that all of your friends are just the people who are in the cult, you can't leave the cult. And the thing about social media is that all of your friends are in the cult.

(SOUNDBITE OF TED TALK)

HARRIS: So I'm here today because the costs are so obvious. I don't know a more urgent problem than this because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want. It's changing the way that we have our conversations, it's changing our democracy and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone because a billion people have one of these in their pocket.

RAZ: I mean, every time we think that we've - we've sort of cracked the code as humans, like, you know, we've created this thing where you can always be connected to everyone and everything. But every time we seem to introduce something like that, someone figures out how to game it.

HARRIS: Yeah. The history of the tech industry is filled with positive intentions and good ideas that are incomplete and game-able, right? And they're incomplete because they don't capture all the other externalities. The more these attention companies profit, they put - they profit while pushing all of the social and inner externalities downstream because it's polluting our inner world and it's polluting our social cohesion and our ability to actually understand each other because we have to agree on reality.

If we can't agree on reality then we can't solve some of the most existential threats that face us. And so this is why we have to change the system. This is not, you know, a fun, philosophical conversation. I'm here because this is literally an existential threat to our future and - and to our present. And elections around the world are still being shaped, an entire polarization of societies is being shaped today by this algorithm. And the need for accountability is enormous.

RAZ: Tristan Harris. He's a former design ethicist at Google, and he now works at the nonprofit Time Well Spent, which is trying to reform the tech industry. You can see Tristan's full talk at ted.com. On the show today, ideas about manipulation. I'm Guy Raz, and you're listening to the TED Radio Hour from NPR.

(SOUNDBITE OF MUSIC)
