Hello from 20 Minutes into the Future. In this edition we’re looking at the many ways TikTok’s algorithms and content moderators suppress poor, transgender, and disabled people, and people of colour. Tick tock, the clock is ticking.
But it’s not all cute baby goats out there. There are a lot of exclusionary policies that limit the reach of all but the platform’s preferred audience. Which, as on many social media platforms, is privileged, white, able-bodied, and cis.
Leaks about TikTok's content moderation guidelines have become common of late. Back in December Netzpolitik shared portions of the regional guidelines for Germany. And in March, The Intercept shared a comprehensive view of these rules globally.
Across both reports a troubling view emerges:
Moderators were encouraged to flag content from people with autism or Down syndrome, or from disabled people with “some facial problems.”
Videos in which “the shooting environment is shabby and dilapidated,” including but “not limited to … slums, rural fields” and “dilapidated housing” with “cracked walls” or “old decorations” were also flagged.
Disabled, overweight, and LGBT users, especially ones seen as “confident”, were also marked down by moderators.
Content featuring people with an “abnormal body shape” such as a “beer belly” or “dwarfism”, or “ugly facial looks” such as “too many wrinkles”, was also suppressed.
Once flagged, these videos are hidden from the “For You” feed, which is TikTok's main stage. When pressed, spokespeople said these policies were in place to prevent bullying. But the documentation makes scant use of that rationale. Instead, it points to user growth and acquisition as the reasons for these decisions.
From The Intercept report:
The justification here, as with “ugly” uploaders, was again that TikTok should retain an aspirational air to attract and hold onto new users: “This kind of environment is not that suitable for new users for being less fancy and appealing.” Social startups, eager to build on their momentum rather than disappear into the app heap of history, commonly consider growth and user retention to be by far their top priority, but rarely is the public privy to the details of this kind of nakedly aggressive expansion.
What’s more, TikTok’s algorithms are designed to promote that “aspirational air.” Researcher Marc Faddoul ran a casual experiment that was immediately problematic:
Collaborative filtering like this is all too common from big tech companies; it’s why we have filter bubbles in the first place. It’s a technique that reinforces a given user’s status quo.
And as Faddoul told BuzzFeed, it’s inherently limiting:
For example, he said, if the most popular creators on a platform are white, and the app keeps recommending other white creators, it makes it hard for creators of color to gain followers and popularity — even if that's not the intention of the algorithm.
"Then it means it's easier for a white person to get recommended than someone from an underrepresented minority," he said. "So that’s something that can be happening, regardless of its facial feature or collaborative filtering."
As you may have guessed, the most popular creators on TikTok tend to be affluent, able-bodied, cis, and white.
TikTok says they "are proud to be a platform for positive and creative expression for our diverse community of users." Some Black and transgender creators argue the opposite is true. They point to frustrating levels of censorship:
“the main issue is that there is inconsistency in regard to the community guidelines…I ran into complications when two of my videos were removed because they ‘violated the community guidelines.’ One of the videos addressed the way I personally felt about non-African Americans using the n-word. The sound I used was a man saying ‘what’s wrong with you’ continuously. The other video that was taken down was a video of me reading Twitter memes. These were comedic memes that the Black community could relate to because a great amount of [Black people] have experienced the circumstances listed. It's always fun to make these videos because I’m able to relate to them, and others are able to relate to them as well. When I came across [a] girl’s video that degraded the African American community, I couldn’t comprehend why it continued to appear on the For You page because based on TikTok’s community guidelines, the video should’ve been taken down.”
Trans creators like Reice Hodges and Clarissa Jacobo spoke to the BBC about the censorship they’ve experienced:
Reice Hodges, 35, says she has had several videos removed, including one where she challenged instances of bullying. She claims TikTok deleted these posts before removing the abusive comments she received.
"It makes me mad when my content is removed. There are some videos that I spend hours making… and to have one of those videos removed really discourages me," she told the BBC.
"There are countless amounts of teenagers and adults who have reached out to me and thanked me for putting myself out there to be seen.
"To block something that can bring awareness to the trans community, when we already have so much hate and disgust coming toward us - where else are we supposed to go?"
Clarissa Jacobo, 19, says she has deleted her account in frustration after her TikTok video talking about her experience was repeatedly taken down.
"I felt defeated. Nobody was seeing my content, nobody cared and there was nothing I could do," she told the BBC. "These apps censor LGBT creators who just want to spread positivity and help people but they can't because nobody will be able to see it."
Visibility and representation are critical in empowering disenfranchised communities. TikTok systemically robs those communities of the chance to tell their stories: first by denying them an audience via algorithmic recommendations that favour the mainstream, then by artificially suppressing their reach, and finally by outright censorship.
You can read more about how big tech exacerbates classism and racism with these stories from the 20 Minutes into the Future archive:
Not a subscriber yet? 20 Minutes into the Future is 100% ad free and always will be. Sign up for weekly commentary & related links to help you dig deeper into big tech behaving badly.
10 stories this week
Influencers are garbage people.
Wannabe influencers are being trained to film a believable YouTube apology video (The Verge)
Some designers need to be pistol-whipped.
This mask has a hole so you can sip your cocktail while socializing (booze not included) (Fast Company)
Great read from @doteveryone on the changing dynamics of people, power and technology
People, Power and Technology: The 2020 Digital Attitudes Report (Doteveryone)
One thing that would go a long way to doing this? If companies like Facebook and Twitter enforced their terms of service equally. Sadly Zuckerberg and @Jack treat their most dangerous, outrageous, and lucrative members differently.
Let's Clean Up the Toxic Internet (The New York Times)
“Documents left unsecured on Google Drive.” Ask yourself if you think the data itself will be any more secure.
Secret NHS files reveal plans for coronavirus contact tracing app (Wired)
It’s only a warning if they punish Musk. Otherwise it’s a call to arms for the greedy. Remember Harvey's Law: Any sufficiently advanced greed is indistinguishable from malice.
Tesla’s reopening may be a spectacle, but it's also a warning for Silicon Valley (Protocol)
What fresh hell are the Nazis up to now? Here’s the answer:
White Supremacists Built a Website to Doxx Interracial Couples — and It's Going to Be Hard to Take Down (Vice)
“Nine years later, as reports of a fearsome new virus suddenly emerged, and with Trump now president, a series of ideas began burbling in the QAnon community: that the coronavirus might not be real; that if it was, it had been created by the ‘deep state’.”
The Prophecies of Q (The Atlantic)
The mantra for Facebook’s much-vaunted “Supreme Court”? Move slowly and fix nothing. It cannot be reiterated enough: this board has no remit to challenge Facebook’s business model and no real enforcement power, since Zuck retains majority voting control.
Facebook and the Folly of Self-Regulation (Wired)
Alex Stamos was a part of the problem at Facebook but he’s not wrong here. Broken clocks and all that. “They also have a difficult disinformation problem in that their content is mostly visual. And so that just makes technically this a lot harder than stuff that’s text-based.”
How TikTok could be a player in election disinformation (The Verge)
Sick and tired of big tech behaving badly? 20 Minutes into the Future is about holding the bastards to account. One way we can do that is by spreading the word of their misdeeds.
Joel Kaplan is VP of Global Policy at Facebook. He’s used his power at the social media behemoth to fire moderators at the behest of conservative pundits, to squelch the internal investigation into Russian election interference, and to smear protestors. Kaplan is also the bastard behind Facebook’s K Street lobbying efforts, which are disproportionately enriching Republican coffers. If you recognise him then it’s probably due to his moral support of Brett Kavanaugh. Joel Kaplan is a right bastard.
Caroline Sinders is a designer / artist whose work examines algorithms, abuse, and how we might use tech for the public good. Her work around feminist data sets, surveillance capitalism, and dark patterns is worth your time. Most recently she’s created a living wage calculator for the invisible human labor that powers “AI” systems.
We need more Carolines and fewer Joels working in tech today if we want a better tomorrow.
Good night and good future,
20 Minutes into the Future is a critical look at how technology is shaping our lives today, and what actions we can take for a better tomorrow. If you're not already a subscriber and found this newsletter worth your while, then please sign up.
My name is Daniel Harvey and I write 20 Minutes into the Future. I’m a product designer and have written for Fast Company, Huffington Post, The Drum, and more. If you're pissed about the current state of tech and want to see us do better, then you’ve found a kindred spirit.