
OVER THE TOP
Extreme Algorithms

me the basics of the language. If I want to know how to raise llamas, there are endless ‘how to’ videos I can watch. If I have a test tomorrow on a famous novel, there are stacks of films I can comfortably get through before morning. The world of video educates us, it opens our minds, it opens our hearts.

But do the facts support this? If the Christchurch shooting proves anything, it’s that the world is as polarised as ever. Tensions between and inside countries seem to be at an all-time high. Racism and ethnic cleansing are reportedly back with a vengeance.

MOSQUE IN SNOW The Al Noor Mosque in Christchurch (below) joins a long list of religious sites targeted by digitally connected extremists

ALGORITHMS

Most media consumers still imagine that online content is like the local library, where they can browse the stacks until they find something they like – and where they are in control. They might imagine that titles, metadata, curation of content and organisational vocabulary around video are designed to make things clearer for them, to inform them and to make it easier for them to make informed decisions. They don’t know that the digital content ecosystem is often designed to reduce their ability to make informed decisions.

If I’m a content producer, I probably don’t want the viewer choosing what they want to watch next. The most important part of my business model is keeping the viewer watching the content I want them to watch, exposing them to the advertisers I need to expose them to in order to keep my business healthy. Our industry has developed a host of techniques and technologies to target viewers with content and advertising. It’s often algorithms seeking maximum engagement that decide what is put in front of the viewer – and what they are going to watch next.

BOTTOM OF THE BRAIN STEM

According to Tristan Harris, Google’s former design ethicist, the race for a stickier algorithm means that content targeting has become a “race to the bottom of the brain stem”. Ever more sophisticated technologies (some of which you’ve read about in this magazine) are used to keep viewers stuck in screen content. For some platforms – including the biggest content platforms that are the gateways to most of the world’s information – this has meant increasingly extreme recommendations of content.

It’s now a commonly studied phenomenon that content that angers and outrages is shared on social media far more quickly and far more often than that which induces serenity. (Is there any content on the internet that induces serenity?) Outrage and resentment spread globally with the rapidity of the rage in Danny Boyle’s zombie film, 28 Days Later.
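To make that engagement-first logic concrete, here is a minimal, hypothetical sketch of a recommender that ranks candidate videos purely by predicted watch time – it is not any platform’s actual system, and the titles, scores and field names are invented for illustration only.

```python
# Hypothetical sketch of engagement-first ranking.
# Not a real platform's recommender; all data below is invented.

from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # model's guess at how long the viewer stays
    outrage_score: float            # how provocative the clip is (0-1); the ranker never looks at this

def rank_for_engagement(candidates: list[Candidate], top_n: int = 3) -> list[Candidate]:
    """Order candidates purely by predicted engagement.

    Nothing here asks whether the content is accurate, healthy or extreme;
    whatever is predicted to hold attention longest floats to the top.
    """
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)[:top_n]

if __name__ == "__main__":
    inventory = [
        Candidate("Calm history documentary", 4.0, 0.1),
        Candidate("Heated 'debate' clip with shouting", 9.5, 0.8),
        Candidate("Conspiratorial 'what they won't tell you'", 12.0, 0.95),
    ]
    for video in rank_for_engagement(inventory):
        print(video.title)
    # If provocative content happens to hold attention longer, it wins every
    # time: the 'race to the bottom of the brain stem' the article describes.
```

Because the only objective is predicted engagement, any correlation between provocation and watch time is amplified rather than checked – which is the drift toward extremes the next paragraph illustrates.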

Recommendation algorithms on YouTube, for example, have made it a short jump from videos about the history of the second world war to Holocaust denial. Or from amusing make-up pranks to sadistic
