Collective consent in the age of Covid-19
7:57-min read
Hello from 20 Minutes into the Future. In this edition we’re looking at the need for collective consent in the era of digital contact tracing. Public health and privacy are not mutually exclusive — especially in a pandemic.
Contact tracing has been instrumental in public health practice for decades. It arguably played a bigger role than immunisation in the eradication of smallpox. Historically, contact tracing has been a very manual process (via Wikipedia; there’s a rough sketch of the workflow as code after the list):
An individual is identified as having a communicable disease (often called the index case). This case may be reported to public health or managed by the primary health care provider.
The index case is interviewed to learn about their movements, whom they have been in close contact with or who their sexual partners have been.
Depending on the disease and the context of the infection, family members, health care providers, and anyone else who may have knowledge of the case's contacts may also be interviewed.
Once contacts are identified, public health workers contact them to offer counseling, screening, prophylaxis, and/or treatment.
Contacts may be isolated (e.g. required to remain at home) or excluded (e.g. prohibited from attending a particular location, like a school) if deemed necessary for disease control.
If contacts are not individually identifiable (e.g. members of the public who attended the same location), broader communications may be issued, like media advisories.
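To make that workflow a little more concrete, here’s a minimal sketch of it as code. It’s purely illustrative: the class names, follow-up actions, and example case are assumptions of mine, not any real public health system’s data model.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class FollowUp(Enum):
    """Follow-up actions a public health worker might offer (not exhaustive)."""
    COUNSELLING = "counselling"
    SCREENING = "screening"
    PROPHYLAXIS = "prophylaxis"
    TREATMENT = "treatment"
    ISOLATION = "isolation"    # e.g. required to remain at home
    EXCLUSION = "exclusion"    # e.g. barred from a particular location, like a school


@dataclass
class Contact:
    """Someone the index case may have exposed."""
    description: str
    identifiable: bool                          # if False, fall back to broader comms
    actions: List[FollowUp] = field(default_factory=list)


@dataclass
class IndexCase:
    """The reported case that kicks off the trace."""
    case_id: str
    disease: str
    contacts: List[Contact] = field(default_factory=list)

    def unreachable(self) -> List[Contact]:
        """Contacts who can't be identified individually and need a media advisory."""
        return [c for c in self.contacts if not c.identifiable]


# A tracer interviews the index case and records who needs following up.
case = IndexCase(case_id="IC-001", disease="covid-19")
case.contacts.append(Contact("household member", identifiable=True,
                             actions=[FollowUp.SCREENING, FollowUp.ISOLATION]))
case.contacts.append(Contact("fellow bus passenger", identifiable=False))

if case.unreachable():
    print("Issue a media advisory for contacts we can't reach individually.")
```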
Cities, counties, and countries the world over are calling for “armies of contact tracers.” California wants 20,000 tracers. The UK is calling for 18,000. Manual contact tracing at this scale is unprecedented. And the life of a contact tracer is very hard.
It’s not hard then to imagine technologists big and small genuinely wanting to help. Never mind bastards like Palantir & Faculty aiming to profit from the pandemic. Hell, I was connected briefly to a proposal here in the UK for digital contact tracing. And I wasn’t the only one. Everyone and their brother seems to have been working on one.
MIT has a database that tracks contact tracing apps. And that’s only the official, government-sanctioned examples. There are tons of other unofficial apps from private companies out there too. Many of these apps are, of course, massive data-gathering programs for their own sake. They risk further entrenching state surveillance in countries already rife with it:
China’s system goes above and beyond contact tracing for public health and shares identity, location, and even online payment history with law enforcement.
Palantir has offered its Gotham platform for the UK Covid response. Gotham is part of the suite of surveillance products Palantir uses to run America’s concentration camps. Whitehall sources have said there’s already been a data breach.
NSO Group, which recently hacked WhatsApp, is leading one of Israel’s contact tracing efforts. Not unlike Palantir, their security is suspect: they’ve left data exposed on a passwordless server.
We’ve been down this road many times before. The fire last time was 9/11 and The Patriot Act. John Ackerly had a front row seat for it then:
In my time at the White House, I bore witness to a number of the early decisions that had significant, unintended negative consequences. For example, new FISA court processes allowed domestic wiretapping, and Patriot Act provisions enabled mass data capture of citizens who were not under any criminal investigation. The lessons many of us learned were that the benefits in the fight against terrorism were marginal, while the privacy and data security consequences were large and lasting. Further, we learned that because there was little transparency in the deliberations, people lost trust in public institutions.
and he’s cautioning against making those same mistakes again:
Data privacy must be a foundational and transparent component of our response to COVID-19. We must lead the world in innovative strategies that enhance both safety and trust. The solution demands giving the American public actual technical control over their data so that people don't have to rely on the promises of technology companies or the government for how they will use (and reuse) personal data.
As much as Ackerly is saying smart things, that last bit is part of the problem. For too long we’ve bought into the myth that data is personal. And that view is deeply flawed. As Sarah Gold and I say in The Manifesto on Society-Centered Design, our frameworks are broken. Particularly when it comes to privacy:
Data protection frameworks like GDPR or CCPA express our rights only as individuals. This individualistic lens has shaped how we now design for digital rights. But data rarely represents a single person - it usually describes many people.
Jake Goldenfein, Ben Green, and Salomé Viljoen of Cornell and Harvard go into more detail on why that framing is so flawed.
In the debate of “privacy vs. X”, where X is a broader social concern like health or security, an individualistic framing of privacy built around dignity and autonomy always loses. What we need is not individual control over data about us, but collective determination over the infrastructures and institutions that process data and that determine how it will be used. This requires moving beyond privacy entailing the choice to opt-in or opt-out of public-private coronavirus surveillance infrastructures and towards developing democratic mechanisms to shape the structure, applications, and agendas of technological architectures.
Anouk Ruhaak, formerly of the Open Data Institute and now at Mozilla, has argued extensively for collective consent models:
Collective consent describes those cases that sit between the realms of government regulation and individual consent. Imagine, for instance, a group of patients with a specific type of cancer. They would like to make their data available for research, but are afraid the data may fall into the wrong hands (‘wrong’ in this case ranging from a future employer to their social network). If half the group shares this data, it would become relatively easy to infer information about the other half. In other words, an individual view of consent doesn’t take account of the fact that the entire group has a stake in each person’s decision. In addition, if the cancer is genetic, sharing this data may also impact the family members of the patients. Therefore, instead of each patient making these decisions on their own, we could imagine them coming together and collectively deciding on the best course of action: who do they want to extend access to this data and under what conditions?
It’s not hard to imagine a similar scenario with Covid-19.
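As a thought experiment, here’s roughly what a collective consent rule could look like in code: a request to use the pooled data only goes ahead if the group as a whole approves it, under conditions they agreed on together. Everything here (the class names, the two-thirds threshold, the allowed purposes) is an assumption for illustration, not a real governance scheme.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AccessRequest:
    """A request to use the group's pooled data."""
    requester: str
    purpose: str


@dataclass
class DataCollective:
    """A group deciding together how data about all of them gets used.

    The two-thirds threshold and the allowed purposes are illustrative
    assumptions, not a real governance standard.
    """
    members: List[str]
    allowed_purposes: List[str]
    approval_threshold: float = 2 / 3

    def decide(self, request: AccessRequest, votes: Dict[str, bool]) -> bool:
        # Conditions the group agreed on up front apply to everyone's data...
        if request.purpose not in self.allowed_purposes:
            return False
        # ...and no single member's opt-in can override the group's decision.
        in_favour = sum(1 for member in self.members if votes.get(member, False))
        return in_favour / len(self.members) >= self.approval_threshold


collective = DataCollective(
    members=["ana", "bo", "chen", "dee", "eli", "fran"],
    allowed_purposes=["public health research"],
)
request = AccessRequest(requester="university lab", purpose="public health research")
votes = {"ana": True, "bo": True, "chen": True, "dee": True, "eli": True, "fran": False}
print(collective.decide(request, votes))  # True: five of six members approve
```

The mechanics of the vote matter far less than where the decision sits: with the group, under its own conditions, rather than with whichever individual happens to tick a consent box.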
The sorts of innovations Ackerly calls for aren’t really happening in the US. But there are signs of hope in other Western democracies. One such idea is the introduction of data trusts:
With data trusts, the independent person, group or entity stewarding the data takes on a fiduciary duty. In law, a fiduciary duty is considered the highest level of obligation that one party can owe to another – a fiduciary duty in this context involves stewarding data with impartiality, prudence, transparency and undivided loyalty.
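To picture how that differs from the usual “click to consent” arrangement, here’s a rough sketch of a trustee sitting between would-be data users and the people the data describes. It’s an illustration of the idea only: the rules, field names, and audit log are my own stand-ins, not how UK Biobank or any real data trust actually works.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Request:
    """A request to access the data the trust stewards."""
    requester: str
    purpose: str
    commercial: bool


@dataclass
class DataTrust:
    """An independent trustee stewarding data on the beneficiaries' terms.

    The rules below are illustrative stand-ins for the conditions a real
    trust would set out in its constitutive documents.
    """
    beneficiaries: List[str]
    rules: List[Callable[[Request], bool]] = field(default_factory=list)
    audit_log: List[str] = field(default_factory=list)

    def review(self, request: Request) -> bool:
        granted = all(rule(request) for rule in self.rules)
        # Transparency: every decision is recorded for beneficiaries to inspect.
        outcome = "granted" if granted else "refused"
        self.audit_log.append(f"{request.requester} / {request.purpose}: {outcome}")
        return granted


trust = DataTrust(
    beneficiaries=["participants in a health study"],
    rules=[
        lambda r: r.purpose == "approved health research",   # undivided loyalty to the stated purpose
        lambda r: not r.commercial,                          # prudence: no commercial reuse
    ],
)

print(trust.review(Request("research consortium", "approved health research", commercial=False)))  # True
print(trust.review(Request("ad-tech broker", "audience modelling", commercial=True)))              # False
print(trust.audit_log)
```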
UK Biobank is one great example of a data trust in action in the healthcare space. And we’re seeing more pilots here in the UK, thanks in large part to Parliamentary Committee support.
The Patriot Act led to a massive overreach by government that failed to deliver on its promise. As we’re seeing in Iceland, the same could very well be true this time, but with big tech even more involved. Being kind, digital contact tracing is tech do-somethingism. Being less kind, it’s data theft that will come back to haunt us for decades.
The pandemic represents a turning point for how we use OUR data (not yours, not mine. Ours.). We can continue down a path that leads to the very worst of big tech companies profiteering from even more intimate data. Or we can rally around a new standard that focuses more on public value.
Not a subscriber yet? 20 Minutes into the Future is 100% ad free and always will be. Sign up for weekly commentary & related links to help you dig deeper into big tech behaving badly.
Ten stories this week
“Faculty is working at the heart of the govt’s response to the pandemic. It has been processing large volumes of confidential UK patient information in an “unprecedented” data-mining operation alongside Palantir, founded by the billionaire Peter Thiel.”
https://www.theguardian.com/world/2020/may/04/vote-leave-ai-firm-wins-seven-government-contracts-in-18-months

NHSX had proposals for a privacy-preserving solution as far back as February. Apple and Google are freely offering one too. They chose to partner with Palantir and Faculty knowing full well it was the least private option on the table.
https://www.theguardian.com/technology/2020/may/05/uk-racing-to-improve-contact-tracing-apps-privacy-safeguards

I certainly fucking hope so.
https://www.economist.com/finance-and-economics/2020/05/09/could-the-pandemic-give-americas-labour-movement-a-boost

Design has been commoditized.
https://www.fastcompany.com/90501691/science-confirms-it-web-sites-really-do-all-look-the-same

Pixels, pandemics, and the protestant work ethic collide in Animal Crossing.
https://arstechnica.com/gaming/2020/05/how-animal-crossing-has-become-an-experimental-playground-for-irl-business/

Adam Neumann is a charlatan of the highest order.
https://techcrunch.com/2020/05/04/wework-co-founder-adam-neumann-accuses-softbank-of-abusing-its-power-in-new-lawsuit/

We can only hope.
https://www.thedrive.com/news/33346/survey-suggests-were-pretty-done-with-ridesharing-after-all-this

“One feature of teachers unions’ demands in recent years has been something known as “Bargaining for the Common Good.” Essentially, unions make demands that extend beyond “bread and butter” concerns and involve the broader community in their struggles.”
https://progressive.org/dispatches/how-workers-can-win-jaffe-200430/

Antivaxxers, The Tea Party, and White Nationalists are joining forces.
https://arstechnica.com/science/2020/05/antivaxxers-spearhead-protests-against-lockdown-orders-demand-freedom/

“Now, a co-investigation by BBC Click and the UK counter-extremism think-tank Institute of Strategic Dialogue, indicates how both extremist political and fringe medical communities have tried to exploit the pandemic online.”
https://www.bbc.co.uk/news/technology-52490430
From the archives
Dig deeper into more ethical approaches to tech with these stories from 20 Minutes into the Future:
Sick and tired of big tech behaving badly? 20 Minutes into the Future is about holding the bastards to account. One way we can do that is by spreading the word of their misdeeds.
Bastard watch
Eric Schmidt has been a busy little bastard since slowly separating from Google. He’s joined up with war criminal Henry Kissinger to foster even deeper connections between Silicon Valley and the military. He’s also trying to remake NY in Google’s own image. If you’re familiar with history then that might remind you of when the banks took over NY during the 1975 fiscal crisis and never really let go. General Thomas really should have chucked him out of that Chevy Suburban back in 2016.
Kindred spirits
Jessica Moreno has spent her career trying to make the internet a kinder place. She’s doing that now as co-founder and Chief Product Officer at Mesh. Interest-based communities on a Nazi-free platform with ads the community controls? What’s not to love? Sign me up.
Thanks for reading 20 Minutes into the Future. Have a friend or colleague who'd like the newsletter? Invite them to sign up.
Good night and good future,
Daniel
20 Minutes into the Future is a critical look at how technology is shaping our lives today. And what actions we can take for a better tomorrow. If you're not already a subscriber and found this newsletter worth your while then please sign up.
My name is Daniel Harvey and I write 20 Minutes into the Future. I’m a product designer and have written for Fast Company, Huffington Post, The Drum, & more. If you're pissed about the current state of tech and want to see us do better then you’ve found a kindred spirit.
You can email me at daniel.harvey@gmail.com or follow me on Twitter @dancharvey.