The larger point is that the economics of tradeoffs in the digital economy are being evaluated more seriously. JL
Lauren Katz reports in Re/code:
Google has acknowledged going up to people on the street and saying, "Hey, would you like to let Google scan your face?" and offering a five-dollar gift card in return. If that's not enough, what is our price? It's a scary question to contemplate, but the small amount is almost a smart way to make it innocuous: you know, enough for Starbucks. All right, great, I'll get a Tall. If someone were offering a hundred dollars, people might start asking more questions.
On October 15, Google announced its latest phone: the Pixel 4. There's been hype around this phone for months. Its features were the worst-kept secret in tech. So the stuff that Google said at the launch was kind of old news. But there's one specific feature that's gotten a lot of attention: using your face to unlock the phone.

Facial recognition isn't new in tech; iPhones have been able to do this since 2017. But a promotional video about the Pixel 4 suggested the company was trying to solve a problem: Facial recognition technology is notoriously bad at detecting people with dark skin. The technology often misidentifies them, or doesn't detect them at all.

The explicit promotion of this was a big deal: Google was basically saying, "We see you, and this phone was made with you in mind!"

But then, the New York Daily News reported exactly how Google was making its tech more inclusive. Ginger Adams Otis — who, along with her colleague Nancy Dillon, broke that story — joins Reset host Arielle Duhaime-Ross to share what they learned:

We found that in trying to create a biometric facial recognition feature, which would allow your face to unlock your phone, they needed to build a big database of faces so that they could — these are their words — "train the machine," so that the technology recognizes all the different varieties of people that there are.

And so to do that, they sent teams of people out to collect facial scans, and the people collecting them — the for-hire workers — were not upfront or clear about what they were gathering the data for, what the people needed to do, what they were giving consent for, and in some cases they targeted specific groups with darker skin tones.

This isn't the first time researchers have taken questionable steps with regard to people of color in the name of something noble. Ruha Benjamin studies the intersection of race, science, and technology at Princeton University. In the second part of this episode, she explains the racial bias we often see in facial recognition technologies and how the data collection practices of Google's contractors echo a long history of scientists taking advantage of vulnerable communities of color.

Racialized groups have been targeted and included in harmful experimentations throughout our country's histories. Whether slavery or Jim Crow or mass incarceration, scientists and doctors have gone after the most vulnerable populations in order to hone various technologies and techniques.

Listen to the entire discussion on this episode of Reset. Below, we've also shared a lightly edited transcript of their conversation.

Arielle Duhaime-Ross
Ginger Adams Otis, you're a reporter for the New York Daily News, and along with your colleague Nancy Dillon, you broke a story two weeks ago about the work that Google has been doing to improve its Pixel phone. What did you find?

Ginger Adams Otis
We found that in trying to create a biometric facial recognition feature, which would allow your face to unlock your phone, they needed to build a big database of faces so that they could — these are their words — "train the machine," so that the technology recognizes all the different varieties of people that there are.

And so to do that, they sent teams of people out to collect facial scans, and the people collecting them — the for-hire workers — were not upfront or clear about what they were gathering the data for, what the people needed to do, what they were giving consent for, and, in some cases, they targeted specific groups with darker skin tones.

Arielle Duhaime-Ross
How did they go about gathering this information about people?

Ginger Adams Otis
Well, Google has acknowledged that they've done sort of voluntary requests for people, basically going up to them on the street and saying, "Hey, would you like to let Google scan your face?" And they would offer a five-dollar gift card, you know, if you wanted it. But in some cases they weren't getting, I guess, the diversity that they wanted, so they had to broaden sort of their repertoire of how they were asking.

Arielle Duhaime-Ross
How long have you known that Google has been scanning people's faces on the street?

Ginger Adams Otis
They started this project, as far as we know, probably back in January. But I think they had done an earlier iteration of it, and many tech companies do. I mean, they have to get faces.

Arielle Duhaime-Ross
Just the sentence, "They have to get faces": It's one of the more dystopian things I've heard recently. So walk me through sort of what is the ideal scenario for getting this kind of data from people: Someone just walks up to you on the street, and then what?

Ginger Adams Otis
I would think best practices, and this is just going off of a compilation of talking to a bunch of different people, would be that a private company looking to buy something from you or use something of yours, commodify you in essence, should be upfront about exactly what they're doing and make sure you have fully informed consent. And that would require a full explanation and time to read the consent form before you say yes. And actually, we saw an iteration of the Google consent form, and they did a pretty good job of crunching it down into some tight points. So it is possible to let people fully know what's going on.

Arielle Duhaime-Ross
Do you know what kind of device they were using for these face scans? Can you sort of describe how that works?

Ginger Adams Otis
We saw it actually ourselves, a team out gathering data in California, and people would take the gadget, which was pretty sizable, you know, not tablet size but bigger than a small iPhone, for example. And it had a circle on it, and it would say sort of center your face and follow the dot with your nose. And, you know, they would be rotating their face from different angles and up and down. And the person has to hold it because you need a full, like, 3D scan.

Arielle Duhaime-Ross
And from your reporting, we know that the company, Google, was targeting people of color and that they wanted to get data from their faces especially. What has Google said it was doing, and why did it need that kind of a data set?

Ginger Adams Otis
Google wanted all kinds of skin tones and faces and features, so they weren't, I don't think, specifically setting out just to target a certain demographic, but what is well known about this kind of technology is that it is less reliable on darker skin because it doesn't have enough practice. And Google said to us, we really need to train the machine.

Arielle Duhaime-Ross
So, Google says that it was actually doing this to make its products more inclusive. That sounds like a noble goal to me in some ways. So, where did things go wrong?

Ginger Adams Otis
Well, they hired a firm, a hiring firm, to bring in contractors. And they were kind of given some marks to hit in terms of quantity of faces scanned. They were given instructions, you know, to just get people to sign up. Get people to say yes. Skip by the consent form, tell them it's a game. Tell them it's like a new kind of Snapchat or a minigame. You know, don't outright lie, but don't get bogged down in the details, because we need to keep this flowing.

Arielle Duhaime-Ross
Kind of sounds like people didn't know what they were signing up for.

Ginger Adams Otis
Some people probably didn't. The people that we spoke to, that we saw in California after the team left, we'd ask them: Did you know this was for Google and they were scanning your face? And they were like, "No, we didn't hear Google. We thought it was just a game."

Of course, one of the stories that we heard and reported was that a team had been sent to Atlanta specifically to look for homeless people of color, with the idea being they're less likely to worry about why they're being asked for their face.

Arielle Duhaime-Ross
Wow.

Ginger Adams Otis
The city of Atlanta is very upset. The city attorney actually wrote to Google. I spoke to the mayor of Atlanta, and their feeling is it's not cool to prey on a vulnerable population, and there are issues of consent with some people who experience homelessness.

Arielle Duhaime-Ross
You mentioned that this whole project, at least the parts that you were reporting on, was done through a third-party company. Were these workers visiting Google facilities? Were they dealing with Google managers? What's the evidence tying some of these practices directly to Google?

Ginger Adams Otis
Our understanding from multiple sources is that while these people, these for-hire workers, were contracted through the company called Randstad, most if not all of them worked out of various Google headquarters. They used the Google amenities, the Google cafeteria. They got on the Google bus. They got all of the fun Google stuff. And we saw some documentation that gave us indications of specific Google managers that were involved in key parts of this.

Arielle Duhaime-Ross
So you published this story sort of calling out Google for their contractors' behavior. What is Google's response to all of this?

Ginger Adams Otis
So they have acknowledged that they were out looking for people with darker skin because of known flaws in the technology. And, as you said, the goal is to make a better product. As to sort of the methods, what we called the allegations of dubious tactics, Google said: We're suspending this, we're investigating. We put a premium value on transparency, and we're going to check all of this out.

Arielle Duhaime-Ross
But the phone still went ahead and launched, right? So they're still benefiting, presumably, from the data that they gathered?

Ginger Adams Otis
Sure. And in the consent form that we saw, which was, again, an iteration of an evolving document, the consent is for this project. But once they build that technology and use your face, your data, to build this product, where that back-end technology goes is not defined in the consent form.

Arielle Duhaime-Ross
What stands out to you about this story? When you take a step back, what do you hope people will take away from your reporting?

Ginger Adams Otis
Well, I think there are several key things that everybody should really focus in on here, myself included, because I'm not a tech reporter by day. I'm a worker and people reporter. I think as people, we have to start thinking a little bit more critically about what's our role in Big Tech, because what Big Tech wants from us is our data. And in this case it's literally part of our bodies.

Are we for sale? And if we are going to be for sale, then are you negotiating a price, or are you just going, "Oh, you want me to play a cool game? Okay, I'll do it."

Arielle Duhaime-Ross
What do you think of the five-dollar gift card? Does that seem like enough to you?

Ginger Adams Otis
That raises the point of, if it's not enough, what's our price? Which is a scary kind of question to contemplate, but I think, to me, that's almost like a smart way to make it innocuous.

Arielle Duhaime-Ross
Right, if you're offering me a hundred dollars, I might start asking more questions.

Ginger Adams Otis
Right.

Arielle Duhaime-Ross
But five dollars seems kind of relaxed and kind of chill.

Ginger Adams Otis
Right. Yeah. You know, to Starbucks. All right, great. I'll get a Tall.

Arielle Duhaime-Ross
Ruha Benjamin, you study the intersection of race, science, and technology at Princeton. How do you explain the racial bias we often see in facial recognition technologies? How did we get here?

Ruha Benjamin
Part of what we're seeing in the last few years is so many examples revealing that human beings have to teach computers how to compute. And so the input for computational systems largely shapes the output. If, say, a company has discriminated against women in its hiring for the last 50 years, so the employee base in that company is overwhelmingly male, and that is the data that's being used to train a hiring system that's looking for new employees, then that hiring system is going to assume that this company is not interested in female applicants, based on this precedent. And so history is helping to predict the future in this case.

Arielle Duhaime-Ross
When you read those stories in the New York Daily News, what was your gut reaction?

Ruha Benjamin
"Oh, here we go again." So, racialized groups have been targeted and included in harmful experimentations throughout our country's histories. Whether slavery or Jim Crow or mass incarceration, scientists and doctors have gone after the most vulnerable populations in order to hone various technologies and techniques.

Under slavery, we had J. Marion Sims, who people call the father of gynecology, who experimented on enslaved women to hone surgical techniques that we still use today. Some of the women were operated on up to 30 times without anesthesia, some eventually dying from infections resulting from the experiments.

Under Jim Crow, we had the US Public Health Service experimenting on black farmers who had syphilis and then denying them treatment once that treatment was known, in order to continue to observe [them]. This is what many people know as the Tuskegee experiment. But it was a US Public Health Service-sponsored experiment.

And then, up until the present, there's a great book called Acres of Skin that looks at experiments that happened in Philadelphia, in which a dermatologist saw all these prisoners and was just excited about the ability to use what he called "acres of skin" in his experiments. And so throughout each of these, it's going after a vulnerable population.

All of these were framed as something that was for the public good. Right. And so none of the people who were engaging in this thought of themselves as sort of sinister characters. They saw themselves as developing things that everyone would eventually benefit from.

And so, in that way, Google's experiment builds on a long tradition, and it might seem not as severe in comparison, but it sets a precedent in which we sort of turn a blind eye when it goes after people that are already vulnerable and who are likely to be harmed even when these technologies are developed well. I think it's good that Google pulled the research and that there's kind of widespread questioning of whether we actually want this precedent to hold.

Arielle Duhaime-Ross
I'm wondering how much responsibility Google should have in all of this. Should we be holding Google accountable for this, or should it be the third-party contractor that sort of went about gathering this data?

Ruha Benjamin
Absolutely, Google should be held responsible in terms of commissioning the research to begin with. My understanding is that some of those contract workers are the ones who actually kind of blew the whistle on this. And so, in that way, they were holding themselves responsible in a way that they didn't necessarily have to.

I do think that those who are commissioning the research, who are posing the problem to begin with that the data is supposed to answer or address, should definitely be responsible. And at the same time, I don't think we should allow companies to only hold themselves accountable. We do need a larger sort of public accountability, whether it has to do with laws or governance that steps in. And that is part of the process, because, as we see, when we just leave it to private entities to hold themselves accountable, there are all kinds of ways in which they fail to do so.

Arielle Duhaime-Ross
In your mind, what does it mean for a company like Google to be involved in something that's so reminiscent of how scientists have treated black people in this country in the past?

Ruha Benjamin
It just brings to mind that we don't learn our history. We think of ourselves as so radically different from our predecessors, you know, the people who conducted this. In hindsight we can see it so clearly: we shouldn't be experimenting on people who can't truly give consent, because the institution that they are surrounded by is itself coercive, whether it's prisons or plantations. But now we have a hard time looking at our present reality with the same critical eye.

And it also says something about who's working at Google and who has real power to speak up and to say something when they see something going in the wrong direction. And so part of it is the expertise around the table. I think the sort of institutional change has to be thinking about what forms of expertise are relevant to tech development. And historians, sociologists, political scientists, and others who would have likely called this out, or had the power to call this out, much earlier in the process — before it hit the news — should be understood as relevant and necessary to tech development, not an afterthought.

Arielle Duhaime-Ross
So, kind of wanting to close the loop here, I could see somebody reading this Google story and saying, "Hey, here is this company that is actually trying to fix this racial bias problem, right? They were trying to get more black faces included in the database." Is that a reasonable takeaway?

Ruha Benjamin
Just the desire to create an inclusive product doesn't justify engaging in a coercive process to get to that end.

We have to care about the means, partly because of that history I described, and because this particular process of making Google's product more inclusive is ultimately going to benefit Google. That benefit goes to their bottom line, and likely the populations being enrolled in the process are not going to directly benefit from their engagement in this.

We have to remember this is a company that is trying to sell goods and products, and the kind of inclusion rhetoric is often used to mask what is really at stake: who's going to benefit, and who's likely going to be harmed by making facial recognition software more effective.

The fact is that these facial recognition systems, beyond Google's particular product, are likely to be enrolled in surveillance projects, whether by police or ICE or other institutions. The irony is that the very populations who are being targeted for this, people with darker skin, homeless people, etc., are likely to be those targeted by these surveillance practices. And so even if the process had been ethical, the outcome could still be something we should question.