Aldous Huxley once said,
“In regard to propaganda the early advocates of universal literacy and a free press envisaged only two possibilities: the propaganda might be true, or the propaganda might be false. They did not foresee what in fact has happened, above all in our Western capitalist democracies — the development of a vast mass communications industry, concerned in the main neither with the true nor the false, but with the unreal, the more or less totally irrelevant. In a word, they failed to take into account man’s almost infinite appetite for distractions.”
So what can we do as consumers, creators, and a society to fix all this shit? Because we have to fix it. We can’t throw in the towel like Facebook has done with their disingenuous “pivot to privacy.”
As individuals we can take steps to break the news cycle. Studies show that people who’ve gone cold turkey on social media are less informed but happier. We need to acknowledge there’s a difference between “right” and “right now.” How can we find slower, more responsible ways to consume the news? Step one: turn all your notifications right the fuck off.
As a society we need to educate people on how to live in a media-saturated world. There are different approaches to media literacy programs, but one thing they all share is an emphasis on critical thinking and analysis. They teach people to understand the power structures that shape media representations and the ways in which audiences and producers derive and create meaning. They ask, “who benefits?” With an event like Christchurch, sharing the video or manifesto, or even naming the terrorist, bolsters his terrible agenda.
Media literacy alone isn’t enough, though. We need to start teaching everyone, from school-age children on up, algorithmic literacy too. Why are these systems making the recommendations they make? Why are they serving us the content they do? And as designers and technologists we need to start making these systems more transparent to the people who use them.
As designers and technologists we can acknowledge that hate speech is not free speech:
Enforce existing ToS evenly and fairly. No one is above the law and no one should be above ToS either.
Apply the same approaches to erasing radical white-extremist content that the platforms already use to censor ISIS.
Better definitions of protected groups.
Better algorithms to detect hate speech.
Down-rank or shadow-ban hateful content.
Better distinguish between “fake dissent” and real dissent and protect the latter. Troll dissent has patterns and inconsistencies that are detectable.
We need to create new networks with new business models. That starts by challenging the lazy assumptions that advertising should be the model for everything and that all networks therefore have to be immense and addictive. There are other models: subscriptions, SaaS direct revenue, direct-to-consumer (DTC) sales. The most successful companies have multiple business models. It just makes good business sense to diversify your revenue streams, metrics, and business models away from propaganda.
File under: #propaganda #breakbreakingnews #medialiteracy #algorithmicliteracy #newnetworks
Next week: How big tech has made dystopia mainstream.