NPR: 'Automating Inequality': Algorithms In Public Services Often Fail The Most Vulnerable

 

ARI SHAPIRO, HOST:

In the fall of 2008, an Indiana woman named Omega Young got a letter saying she needed to recertify for the state's public benefits program.

VIRGINIA EUBANKS: But she was unable to make the appointment because she was suffering from ovarian cancer.

SHAPIRO: She called the local office to say she wouldn't make the appointment because she was hospitalized getting cancer treatments and she lost her benefits anyway. The reason - failure to cooperate.

EUBANKS: So because she lost her benefits, she couldn't afford her medications, she lost her food stamps, she couldn't pay her rent. She lost access to free transportation to her medical appointments. And Omega Young died on March 1, 2009. And the day after her death, she won an appeal for wrongful termination and all of her benefits were restored.

SHAPIRO: This is one of the stories the author Virginia Eubanks tells in her latest book "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor." That book is the subject of this week's All Tech Considered.

(SOUNDBITE OF MUSIC)

SHAPIRO: Virginia Eubanks argues that many of the automated systems that deliver public services today are rigged against the people these programs are supposed to serve. She dives deep into three examples of automated public services - welfare benefits in Indiana, housing for the homeless in Los Angeles and children's services in Allegheny County, Pa., which includes Pittsburgh.

The Indiana case was so bad that the state eventually gave up on the automated system. Virginia Eubanks started by telling me what state lawmakers were trying to accomplish through automation.

EUBANKS: Indiana was attempting to save money and to make the system more efficient. But the way the system rolled out, it seems like one of the intentions was actually to break the relationship between caseworkers and the families they served. The governor sort of did a press tour around this contract. And one of the things he kept bringing up was that there was one case where two caseworkers had colluded with some recipients to defraud the government for about - I think it was about $8,000.

And the governor used this case over and over and over again to suggest that when caseworkers and families have personal relationships, that it's an invitation to fraud. So the system was actually designed to break that relationship. So what happened is the state replaced about 1,500 local caseworkers with online forms and regional call centers.

And that resulted in a million benefits denials in the first three years of the experiment, which was a 54 percent increase from the three years before.

SHAPIRO: Is an automated system of public services inherently going to be less helpful, less effective than something like Uber or Lyft or Amazon or all the automated things that people who are not in poverty rely on every day?

EUBANKS: No. There's nothing intrinsic in automation that makes it bad for the poor. One of my greatest fears in this work is that we're actually using these systems to avoid some of the most pressing moral and political challenges of our time, specifically poverty and racism. So we're kind of using these systems as a kind of empathy override. You know, let's talk about Los Angeles.

So there's 58,000 unhoused folks in Los Angeles. It's the second-largest homeless population in the United States, and 75 percent of them are completely unsheltered, which means they're just living in the street. I do not want to be the caseworker who is making that decision, who is saying there's 50,000 people with no resources. I have, you know, a handful of resources available. Now I have to pick.

But the problem is that we are using these tools to basically outsource that incredibly hard decision to machines.

SHAPIRO: So the underlying problem is not that the housing system is automated, but it sure doesn't help that automating that system allows people to ignore, more or less, the fact that there are not enough houses.

EUBANKS: Yeah. So one of the folks I talked to in the book, this great, brilliant man Gary Blasi has one of the best quotes in the book and he says, homelessness is not a systems engineering problem. It's a carpentry problem, right?

SHAPIRO: If you've got 10 houses for 20 people, it doesn't matter how good the system for housing those people is, it's not going to work.

EUBANKS: Exactly.

SHAPIRO: As you point out in the book, caseworkers have biases. There are caseworkers who are racist, who discriminate, who favor some clients over others for inappropriate reasons. Doesn't automation have the potential to solve those problems?

EUBANKS: Yeah, let's be absolutely direct about this - human bias in public assistance systems has created deep inequalities for decades. And it's specifically around the treatment of black and brown folks, who have often been either overrepresented in the more punitive systems or diverted from the more helpful systems because of frontline caseworker bias.

SHAPIRO: So they get thrown in prison more often, or their children get taken away more often, they get public housing less often - that sort of thing.

EUBANKS: Exactly. But the thing that's really important to understand about the systems I profile in "Automating Inequality" is that these systems don't actually remove that bias - they simply move it. So in Allegheny County, where I look at the predictive model that's supposed to be able to forecast which children will be victims of abuse or neglect in the future, one of the hidden biases is that it uses proxies instead of actual measures of maltreatment.

And one of the proxies it uses is called call re-referral, which just means that a call comes in about a child and then a second call comes in within two years. And the problem with this is that both anonymous reporters and mandated reporters report black and biracial families for abuse and neglect 3.5 times more often than they report white families.
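
To make that proxy concrete, here is a minimal, hypothetical Python sketch of how a "re-referral within two years" label could be derived from call records. It is not the Allegheny County tool itself - the child IDs, the dates and the 730-day window are invented for illustration. The point is that the label counts hotline calls rather than confirmed maltreatment, so any group that is reported more often will also carry the "at risk" label more often.

from datetime import date, timedelta

# Hypothetical call records: (child_id, date of a hotline call about that child).
calls = [
    ("child_a", date(2015, 3, 1)),
    ("child_a", date(2016, 7, 15)),   # second call within two years -> label 1
    ("child_b", date(2015, 5, 10)),   # no further calls -> label 0
]

TWO_YEARS = timedelta(days=730)  # rough stand-in for "within two years"

def re_referral_label(call_dates):
    # Label 1 if any later call falls within two years of an earlier one.
    call_dates = sorted(call_dates)
    for i, first in enumerate(call_dates):
        for later in call_dates[i + 1:]:
            if later - first <= TWO_YEARS:
                return 1
    return 0

# Group the calls by child and compute the proxy label for each child.
by_child = {}
for child_id, call_date in calls:
    by_child.setdefault(child_id, []).append(call_date)

labels = {child_id: re_referral_label(dates) for child_id, dates in by_child.items()}
print(labels)  # {'child_a': 1, 'child_b': 0}

Because the training target is built entirely from who gets called about, the 3.5-to-1 disparity in reporting rates that Eubanks cites flows straight into the label before any model is ever fit.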

SHAPIRO: You draw these three detailed pictures of automated systems falling short in Indiana, California, Pennsylvania. Do you think a different author could have found three different automated systems somewhere in the country that were working really well in providing services effectively?

EUBANKS: Absolutely. One of the things that's different about the way that I wrote the book is that I started from the point of view of the targets of these systems. It doesn't mean I only spoke to those folks. But I spoke to, you know, unhoused folks, both those who have had luck getting housing through coordinated entry and those who haven't. I spoke to families who have been investigated for maltreatment.

And I will say that when you start from the point of view of these very vulnerable families, these systems look really different than they look from the point of view of the data scientists or administrators who are developing them. And I wasn't hearing these voices at all in the debates that we've been having about what's sort of coming to be known as algorithmic accountability or algorithmic fairness.

I was never hearing the voices of the people who face the pointy end of the most punitive stick. And I really thought it was important to bring those stories to the table.

SHAPIRO: Virginia Eubanks, thanks so much for talking with us.

EUBANKS: Thank you so much.

SHAPIRO: Her book is called "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor."
