Hello from 20 Minutes into the Future. In this edition we’re looking at Palantir’s Gotham software and how it perpetuates institutional racism in police departments across the United States, often behind the scenes and without any real oversight.
Illustration by Val Mina, originally published at Wired
“Welcome to the Gang database.”
Manuel Rios has been serially harassed by the LAPD thanks to Palantir. He lives in East LA and struggles with depression and a learning disability. He was once addicted to crystal meth and served time in county jail for a related burglary. He’s since kicked his habit and works at a local supermarket. Unlike some of his friends, Rios was never “jumped in” — initiated — into the 18th Street Gang.
Unbeknownst to Rios, in 2016 he became ensnared in Gotham’s data dragnets. One day he was in a parked car with a friend in the gang when cops rolled up. His friend ran. Rios, knowing he’d done nothing wrong, remained. When the police returned, they handcuffed him, photographed him, and said, “Welcome to the gang database.”
Between 2016 and 2018 he was targeted by police over a dozen times. The LAPD has told him to move if he doesn’t like it. “They say you’re in the system, you can’t lie to us,” he says. “I tell them, ‘How can I be in the hood if I haven’t got jumped in? Can’t you guys tell people who bang and who don’t?’ They go by their facts, not the real facts.”
Rios had been found guilty by association by Gotham’s algorithms. Rather than being a crystal ball, Palantir’s software is a black box where data is presumed to be destiny. Its systems find vulnerable people like Rios and try to turn them into the criminals they so desperately want them to be.
Operation LASER (Los Angeles Strategic Extraction and Restoration) is the codename for the LAPD’s predictive policing initiative, which runs on Gotham. Gotham combines data from an array of sources like license plate readers, police records, social media posts, location data, phone call records, bank statements, and more. It then combines all that disparate data into “spidergrams” that show a web of relationships like “Colleague of”, “Owner of”, and even “Lover of.” As early as 2018, law enforcement agencies using such tools could identify more than half of the adult population of the United States. More police departments have signed on since then, so the number is certainly higher today.
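Under the hood, a “spidergram” is just a labeled relationship graph. Here’s a minimal sketch of the idea — all the entity names and relationships below are invented for illustration; Gotham’s actual data model is proprietary and not public:

```python
# A toy "spidergram": entities linked by labeled relationships.
# (All names here are hypothetical, not real Gotham data.)
relations = [
    ("Person A", "Colleague of", "Person B"),
    ("Person A", "Owner of", "Vehicle 1"),
    ("Person B", "Lover of", "Person C"),
]

def neighbours(entity, relations):
    """Everything linked to an entity, with the relationship label."""
    out = []
    for src, label, dst in relations:
        if src == entity:
            out.append((label, dst))
        elif dst == entity:
            out.append((label, src))
    return out

print(neighbours("Person A", relations))
# → [('Colleague of', 'Person B'), ('Owner of', 'Vehicle 1')]
```

The point of the sketch is the guilt-by-association problem: merely appearing as an edge in someone else’s web — “Colleague of,” “Lover of” — is enough to surface in a query, whether or not you’ve done anything.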
Data demands more data. As a result, police now have a quota of “chronic offender bulletins” to produce, feeding the system yet more data; the output is a “chronic offender score.” In some cases officers are told to knock on doors to make it clear to people that they’re being actively monitored. In others, officers are encouraged to find any available opportunity to stop or arrest high-score targets as a means of updating the bulletins that count toward the quota.
It’s not clear how people’s scores are reduced, but officers have told people like Rios, “If they don’t get stopped, they stop being on the list.”
I hope you don’t need me to tell you how circular and self-fulfilling all of this is…
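To make the circularity concrete, here’s a toy simulation — purely illustrative, since Palantir’s actual scoring formula is not public and every number below is invented. Two residents commit no crimes at all; the only dynamic is that a higher score makes a stop more likely, and every stop raises the score:

```python
import random

def simulate(steps=200, seed=42):
    """Toy model of a stop-driven 'chronic offender score'.

    'logged_once' starts with a small score because they were
    logged a single time (like Rios); 'never_logged' starts at
    zero. Nobody commits a crime. The only rule: higher score
    -> more likely to be stopped, and every stop adds a point.
    """
    random.seed(seed)
    scores = {"logged_once": 5, "never_logged": 0}
    for _ in range(steps):
        for person, score in scores.items():
            # Stop probability grows with the current score (capped).
            if random.random() < min(0.9, 0.01 + score / 100):
                scores[person] += 1  # each stop feeds the system
    return scores

print(simulate())
```

With no criminal behaviour anywhere in the model, the person who was logged once ends up with a far higher score — the single initial data point compounds into “evidence.” That’s the feedback loop in miniature.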
James Carville’s plot to use New Orleans as a petri dish for Palantir
Carville has been a paid advisor to Palantir since 2011. In 2012 he decided to bring Palantir to his home state of Louisiana under the auspices of philanthropy. In a 2014 interview, Carville and his wife, libertarian political consultant Mary Matalin, made the first public mention of the initiative:
“The CEO of a company called Palantir – the CEO, a guy named Alex Karp — said that they wanted to do some charitable work, and what’d I think? I said, we have a really horrific crime rate in New Orleans,” Carville told KQED Forum’s host Michael Krasny, without mentioning his professional relationship to Palantir. “And so he came down and met with our mayor… they both had the same reaction as to the utter immorality of young people killing other young people and society not doing anything about it. And we were able to, at no cost to the city, start integrating data and predict and intervene as to where these conflicts were going to be arising. We’ve seen probably a third of a reduction in our murder rate since this project started.”
Matalin made it clear to Krasny that the prediction work the Palo Alto firm was doing with the NOPD was both a prototype and a system that could sweep up innocent people.
“We’re kind of a prototype,” said Matalin. “Unless you’re the cousin of some drug dealer that went bad, you’re going to be okay.”
New Orleans operates under a “strong mayor” model. That meant Carville, Karp, and Mayor Mitch Landrieu were able to agree contracts without any public procurement process or oversight from the City Council. Several council members weren’t even aware of the effort until The Verge interviewed them about it in 2018.
Civil and criminal attorneys and civil rights activists were also in the dark. Even though prosecutors used it in trials, none of the predictive policing data was shared with defense attorneys. During The Verge’s investigation, Jim Craig, the director of the Louisiana office of the Roderick and Solange MacArthur Justice Center, said:
“It’s especially disturbing that this level of intrusive research into the lives of ordinary residents is kept virtually a secret. It’s almost as if New Orleans were contracting its own version of the NSA to conduct 24/7 surveillance of the lives of its people. Right now, people are outraged about traffic cameras and have no idea this data-mining project is going on. The South is still a place where people very much value their privacy.”
And again, data demands more data. New Orleans was contractually obligated to provide Palantir with additional information feeds to train its algorithms. Public records, court filings, city criminal and non-criminal data were all shovelled into Gotham’s gaping maw. Field interviews also became mandatory for the NOPD. More than 70,000 such “stop and frisk” intelligence-gathering sessions occurred in the first year of the contract alone.
All without public knowledge. All without oversight from the elected officials whose job it is to manage this sort of municipal data.
Furthermore, research suggests the decline in crime that Carville attributed to Palantir is better credited to community programs that were running concurrently. Nicholas Corsaro, one of the professors behind that research, has said, “Trying to predict who is going to do what based on last year’s data is just horseshit.”
Data is not destiny
Predictive policing is the 21st century version of “broken windows” policing and it’s every bit as flawed. The Stop LAPD Spying Coalition published a report demonstrating how “the continuation of decades of discriminatory and racist policing under the apparent neutrality of objective data” creates a “racist feedback loop” in which a “disproportionate amount of police resources are allocated to historically hyper-policed communities.” Here’s a summary courtesy of The Intercept:
Survey results included in the report suggest that very few people in Los Angeles bear the brunt of most police interactions: Two percent of residents who responded to the survey reported being stopped by police between 11 and 30 times a week or more, while 76 percent of respondents reported never being stopped at all. The 300 survey respondents were distributed across geography, race, age, and gender. In focus groups, people who lived in areas heavily targeted by police described a state of constant surveillance. Asking “how often do I see police in my area is like asking me how many times do I see a bird in the day,” said one resident.
It’s a truism in business, one of many from Peter Drucker, that “what gets measured, gets managed.” In the age of Silicon Valley-empowered state surveillance, it is people of colour who are being measured. And “managed.”
You can read more about how big tech exacerbates racism with these stories from the 20 Minutes into the Future archive:
Not a subscriber yet? 20 Minutes into the Future is 100% ad free and always will be. Sign up for weekly commentary & related links to help you dig deeper into big tech behaving badly.
10 stories this week
Simone Browne aka @wewatchwatchers has long been an inspiration to me. Read this and see why.
How Surveillance Has Always Reinforced Racism (Wired)
The boycott is in protest of what the groups call Facebook’s “repeated failure to meaningfully address the vast proliferation of hate on its platforms.”
Prominent civil rights groups are calling for a Facebook ad boycott (Input)
Yes. Break them up.
Have we become too reliant on Big Tech firms? (BBC)
Bad UX kills. The stock market kills.
20-Year-Old Robinhood Customer Dies By Suicide After Seeing A $730,000 Negative Balance (Forbes)
The question is how much NHS patient data did Palantir and Faculty slurp up in the meantime?
UK ditches its coronavirus contact-tracing app and switches to Google-Apple model (The Next Web)
It’s not just Palantir that you have to worry about btw…
FBI used Instagram, an Etsy review, and LinkedIn to identify a protestor accused of arson (The Verge)
The clothes make the monster.
Outfitting police in military uniforms encourages brutality (Fast Company)
What companies do means more than what they say.
Ex-eBay Workers Sent Critics Live Roaches and a Mask of a Bloody Pig Face, U.S. Says (The New York Times)
You’re going to see more and more of this. Privacy matters to people more than the tech companies would have you believe. Especially when it comes to intimate details like health.
Norway halts coronavirus app over privacy concerns (MIT Technology Review)
P.S. Palantir is about to make a killing on the stock market.
Palantir to File IPO in Weeks For Possible Fall Debut (Bloomberg)
Sick and tired of big tech behaving badly? 20 Minutes into the Future is about holding the bastards to account. One way we can do that is by spreading the word of their misdeeds.
Alex Karp, the CEO of Palantir, professes to be a “Neo Marxist deviant” as a way to deflect heat from his right-wing partner and co-founder, Peter Thiel. In an interview with Axios, Karp pondered “if I were younger at college: ‘Would I be protesting me?'" If he has to ask himself that, after gloating about how his software kills people in the same breath, then he’s no Marxist… he’s a fucking hypocrite.
Stephanie Hassam is a digital product designer working in Seattle. In recent months she’s been co-creating a privacy-preserving approach to Covid-19 contact tracing. She’s also fiercely committed to data justice, inclusive design, and creating more equitable futures.
We need more Stephanies and fewer Alexes working in tech today if we want a better tomorrow.
Good night and good future,
20 Minutes into the Future is a critical look at how technology is shaping our lives today, and what actions we can take for a better tomorrow. If you're not already a subscriber and found this newsletter worth your while, then please sign up.
My name is Daniel Harvey and I write 20 Minutes into the Future. I’m a product designer and have written for Fast Company, Huffington Post, The Drum, & more. If you're pissed about the current state of tech and want to see us do better then you’ve found a kindred spirit.