Google vs. black people: Scandal in Atlanta
Happy Black History Month. I wanted to keep pulling the thread on Google’s own history of mistreatment of black people. In last week’s letter we focused on The Gorilla Incident. This week we’ll talk about how an attempt to correct that bias was itself racist and troubling.
In July Google announced that its upcoming Pixel 4 would have a face unlock feature much like the iPhone’s. They said, “Our goal is to build the feature with robust security and performance. We’re also building it with inclusiveness in mind, so as many people as possible can benefit.”
Black representation in Google’s US technical staff has “improved” by less than 1% since 2014. Currently black employees make up 4.8% of Google’s total workforce and only 2.8% of its tech staff. That’s despite years of alleged effort to the contrary.
Given that they don't have anywhere near the number of PoC staff needed to build the data set in-house, they took to the streets. Google staff & contractors went to US cities offering $5 certificates for a face scan. Google says it used consent forms as well.
These scans collected depth and "task" information (e.g. picking up a phone), which makes sense. They also collected time and location data, which makes no damn sense at all. Even before they got into hot water for this, Google claimed to have already purged the location data.
Things went completely off the rails in Atlanta. Sources say a Google manager told contractors to "target people with darker skin." Randstad contractors took that directive to terrible extremes.
According to several sources who allegedly worked on the project, a contracting agency named Randstad sent teams to Atlanta explicitly to target homeless people and others with dark skin, often without saying they were working for Google, and without letting on that they were actually recording people’s faces.
The NY Daily News was the first to run an exposé. Contractors targeted the homeless because they'd be "least likely to say anything to the media.” They would also often fail to say they were working on Google's behalf or to get proper consent. In some cases they didn't even give participants the $5 voucher. They also targeted low-income black students and lied about recording video of them.
Two days later, Atlanta's city attorney Nina Hickson contacted Google:
“The possibility that members of our most vulnerable populations are being exploited to advance your company’s commercial interest is profoundly alarming for numerous reasons,” she said in a letter to Kent Walker, Google’s legal and policy chief. “If some or all of the reporting was accurate, we would welcome your response as to what corrective action has been and will be taken.”
Following the pattern we discussed last week, Google issued a bog-standard mea culpa. I assume they have machine learning writing these now because they are very predictable:
“We’re taking these claims seriously and investigating them. The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided.”
“Transparency is obviously important, and it is absolutely not okay to be misleading with participants.”
They added that they found the allegations “very disturbing.” The program is currently paused while Google conducts its own investigation. But they are still paying contractors, presumably including Randstad.
It’s easy to paint Randstad as the singular “bad guy” here. That’s by design. Silicon Valley companies farm out high-risk assignments to contractors all the time:
Facebook contracts out its content moderation to disastrous effect.
Amazon contracts out its deliveries to disastrous effect.
Our modern-day robber barons do this so they can have plausible deniability when shit goes tits up. We shouldn't fall for that. As Sidney Fussell points out in The Atlantic:
Google allegedly gave the contractors daily quotas, ordered them to prioritize subjects with dark skin, and encouraged them to approach homeless people, who it expected to be most responsive to the gift cards and least likely to object or ask questions about the terms of data collection.
Managers reportedly encouraged contractors to mischaracterize the data collection as a “selfie game,” akin to Snapchat filters such as Face Swap. College students who agreed to the scans later told the Daily News that they didn’t recall ever hearing the name Google and were simply told to play with the phone in exchange for a gift card.
One would hope that Google is taking a hard look at its own employees too in its investigation. Garbage in, garbage out. Racism in, racism out.
Stats: 762 words & 3 minutes-ish reading time
File under: #biasedalgorithms #stupidAI #surveillance #inclusion
Next week: What can Google and the rest of the tech industry do to actually get better on race?
20 Minutes into the Future is a critical look at how technology is shaping our lives today. And what actions we can take for a better tomorrow. If you found this newsletter worth your while then please sign up for a free subscription.
Daniel Harvey writes 20 Minutes into the Future. He is a product designer and has written for Fast Company, Huffington Post, The Drum, & more. If you're pissed about the current state of tech and want to see us do better then you’ve found a kindred spirit.
You can email him at daniel.harvey@gmail.com or follow him on Twitter @dancharvey.