FEED Issue 15

FUTURE SHOCK Content Moderation

TAKING CONTENT TO THE CLEANERS

Content moderation jobs are awful. But is AI the solution to the internet's trash problem?

Words by Chelsea Fearnley

According to The Wall Street Journal, "the equivalent of 65 years of video are [sic] uploaded to YouTube each day" and humans are still the first line of defence, protecting its 1.9 billion users from exposure to society's darkest impulses. But the resolve to remove humans from the ugly business of content moderation grows as more and more horror stories emerge from former employees, and as the cost of employment increases. Companies are loath to devote anything more than the bare minimum to a task that does not contribute to profit.

One man, who asked to remain anonymous, told BBC reporter Jim Taylor in 2018 that he had become so desensitised to pornography after reviewing pornographic footage all day that he "could no longer see a human body as anything other than a possible terms of service violation." But his colleagues had it worse: "They regularly had to review videos involving the sexual exploitation of children."

The negative effects of human content moderation aren't felt only by those in the job role; brands are also starting to suffer. Due to public opinion and regulatory pressure, there are increased risks of penalties for monetising content in the wrong context.

ACTIONS ARE MUCH MORE DIFFICULT TO CLASSIFY THAN OBJECTS. FOR EXAMPLE, KISSING IS AN ACTION, BUT NUDITY IS AN OBJECT

