Facebook is fanning the flames of genocide. Again.
Lives hang in the balance as Facebook executives pander to Nationalist politicians in India.
Hello from 20 Minutes into the Future. In this edition we’re looking at how Facebook is once again being used to promote the slaughter of Rohingya Muslims. This time, however, it’s willingly and knowingly complicit.
In 2016 there were more Facebook users in Myanmar than in any other Southeast Asian country. It became the default news source for a nation riddled with religious and ethnic animosity. By August and September of 2017, hate speech on the platform had reached critical levels.
Over that same span more than 6,700 Rohingya Muslims were killed by the military and Buddhist civilians. More deaths followed, as did looting, gang rapes, and other forms of sexual violence. At least 730 young children were among the people shot, burned, or beaten to death. Some 650,000 Rohingya refugees fled to Bangladesh, India, and other nearby countries to escape the genocide.
During this period, Facebook outsourced content moderation for the region, and only two of its reviewers could even speak Burmese. As a result the company relied on users to report problematic posts — but the interface for doing so was only available in English.
Those moderators also relied on Facebook’s automatic translation service, which failed spectacularly. One post in Burmese said: “Kill all the kalars that you see in Myanmar; none of them should be left alive.” Facebook’s translation into English read as: “I shouldn’t have a rainbow in Myanmar.”
The United Nations slammed Facebook for its failure in Myanmar. The Guardian’s Olivia Solon wrote that “Facebook’s failure in Myanmar is the work of a blundering toddler.” As I said then, it brings to mind Grey’s Law: “Any sufficiently advanced incompetence is indistinguishable from malice.”
It’s happening again.
In 2020 there are more Facebook users in India than in any other country in the world. Facebook — along with Facebook-owned WhatsApp — is the default news source for another nation riddled with religious and ethnic animosity. And in February hate speech on the platform helped take the lives of dozens and injure hundreds.
As was the case in Myanmar, the ruling political party — in India, the BJP — is using Facebook to broadcast hate. Current and former lawmakers from the party, including T. Raja Singh, Anantkumar Hegde, and Kapil Mishra, have called for citizens to shoot Rohingya Muslim immigrants and raze mosques to the ground. They’ve also pushed the conspiracy theory that Muslims are “waging a Corona jihad.”
These politicians have hundreds of thousands of followers amongst them. Singh alone has over 400,000. Prime Minister Modi’s page—which has more likes than any other politician in the world—often signal boosts this rhetoric.
Unlike in Myanmar, Facebook can’t feign ignorance or incompetence. The company has invested billions in the region and has dedicated permanent employees — who can speak the language this time, thank you very much — responsible for content moderation. According to that liberal pinko-commie rag, The Wall Street Journal:
Facebook Inc. employees charged with policing the platform were watching. By March of this year, they concluded Mr. Singh not only had violated the company’s hate-speech rules but qualified as dangerous, a designation that takes into account a person’s off-platform activities, according to current and former Facebook employees familiar with the matter.
Given India’s history of communal violence and recent religious tensions, they argued, his rhetoric could lead to real-world violence, and he should be permanently banned from the company’s platforms world-wide, according to the current and former employees, a punishment that in the U.S. has been doled out to radio host Alex Jones, Nation of Islam leader Louis Farrakhan and numerous white supremacist organizations.
It should not surprise you in the slightest that the employees’ recommendation fell on deaf ears. Singh is still active on Facebook, Instagram, et al. Once again this comes down to favoritism toward conservatives from top-level executives within the company.
Ankhi Das is Facebook’s highest-ranking public policy executive for India. She shielded not only Singh but also three other Hindu nationalist individuals and groups deemed “dangerous” by her team. Her stated rationale? Profit.
Ms. Das, whose job also includes lobbying India’s government on Facebook’s behalf, told staff members that punishing violations by politicians from Mr. Modi’s party would damage the company’s business prospects in the country, Facebook’s biggest global market by number of users, the current and former employees said.
A Facebook spokesman, Andy Stone, acknowledged that Ms. Das had raised concerns about the political fallout that would result from designating Mr. Singh a dangerous individual, but said her opposition wasn’t the sole factor in the company’s decision to let Mr. Singh remain on the platform.
That other factor is, of course, TikTok, which was recently banned in India by the BJP-led government. Quid pro quo. Stop me if you’ve heard this one before.
Party loyalty might be another factor. Is Das a BJP partisan?
In April of last year, days before voting began in India’s general election, Facebook announced it had taken down inauthentic pages tied to Pakistan’s military and the Congress party, the BJP’s main rival party. But it didn’t disclose it also removed pages with false news tied to the BJP, because Ms. Das intervened, according to former Facebook employees.
In 2017, Ms. Das wrote an essay, illustrated with Facebook’s thumbs-up logo, praising Mr. Modi. It was posted to his website and featured in his mobile app.
But there may be something even more insidious at work here: prejudice.
On her own Facebook page, Ms. Das shared a post from a former police official, who said he is Muslim, in which he called India’s Muslims traditionally a “degenerate community” for whom “Nothing except purity of religion and implementation of Shariah matter.”
The post “spoke to me last night,” Ms. Das wrote. “As it should to [the] rest of India.”
Whatever the full story is, it’s clear once more that Facebook’s only principle is profit. Profit generated by hate and outrage. Remember Harvey’s Law: any sufficiently advanced greed is indistinguishable from malice.
You can read more about how social media signal boosts propaganda for profit in the 20 Minutes into the Future archive:
Not a subscriber yet? 20 Minutes into the Future is 100% ad free and always will be. Sign up for weekly commentary & related links to help you dig deeper into big tech behaving badly.
10 stories this week
Police built an AI to predict violent crime. It was seriously flawed
U.S. Intelligence Says Republicans Are Working With Russia to Reelect Trump
QAnon groups have millions of members on Facebook, documents show
Why Wikipedia Decided to Stop Calling Fox a ‘Reliable’ Source
Android is becoming a worldwide earthquake detection network
Revealed: QAnon Facebook groups are growing at a rapid pace around the world
Pinterest Accused of Gender Bias in Suit by Former No. 2 Executive
Sick and tired of big tech behaving badly? 20 Minutes into the Future is about holding the bastards to account. One way we can do that is by spreading the word of their misdeeds.
You don’t need me to tell you that Steve Bannon is a giant bastard. From Breitbart to Trumpworld, few people have shaped the alt-right as much as he has. You might even be aware that he’s been recruiting for an “alt-right gladiator school” in Italy. But you might not know about his most recent media play, GTV.
GTV was a partnership between Bannon and Guo Wengui, the exiled Chinese billionaire real-estate tycoon and alleged “dissident hunter.” The two raised over $300 million to create a media platform likened to WeChat, TikTok, and Amazon — but for the right-wing Chinese diaspora.
Unsurprisingly, the company is now being investigated by the FBI, the SEC, and the New York State Attorney General for ripping off investors, among other suspected securities violations.
Dr. Kate Devlin is whip-smart and connects the dots across a wide range of influences, from archaeology to human-computer interaction. She’s currently Senior Lecturer in Social and Cultural Artificial Intelligence at King’s College London. Her recent research has explored cognition, sexuality, and intimacy, and resulted in her amazing book, Turned On: Science, Sex and Robots.
We need more Kates and fewer Steves working in tech today if we want a better tomorrow.
Thanks for reading 20 Minutes into the Future. Have a friend or colleague who'd like the newsletter? Invite them to sign up.
Good night and good future,
20 Minutes into the Future is a critical look at how technology is shaping our lives today. And what actions we can take for a better tomorrow. If you're not already a subscriber and found this newsletter worth your while then please sign up.
My name is Daniel Harvey and I write 20 Minutes into the Future. I’m a product designer and have written for Fast Company, Huffington Post, The Drum, & more. If you're pissed about the current state of tech and want to see us do better then you’ve found a kindred spirit.
You can email me at firstname.lastname@example.org or follow me on Twitter @dancharvey. Or add a comment to this post now.