FEED issue 30 Web

SOCIAL MEDIA Fighting Disinformation

the UK government should introduce online harms legislation within a year of the report’s publication. “Unfortunately,” says Puttnam, “the evidence we took indicated that it may not be until 2022 or 2023. That is crazy, given the pace with which this industry moves.”

MOVE FAST AND FIX THINGS

The report also recommends that UK communications regulator Ofcom be given greater powers to police and penalise digital companies that aren’t acting in the best interest of citizens. “It is not just up to advertisers to ensure the technology giants deal with the pandemic of misinformation on their platforms,” says Puttnam. “The government also has an important role to play and should not duck its responsibility.”

Lord Holmes, who specialises in new technologies, as well as diversity and inclusion, demanded greater transparency from the companies that have developed such an intimate relationship with modern society. He and Puttnam also urged the government to hold platforms accountable for the amplification of messaging through fake accounts, bots and AI. He explains: “The reality is that the algorithms have to be auditable. But even before we talk about the auditing of algorithms, it’s high time that we put our foot down on sock puppets.”

“One of the most interesting things in our report,” says Puttnam, “is how information gets amplified. So it’s not just a question of two or three people with nutty views. When that news gets recommended, and the whole thing takes off in the search, it can create damage. We’re not coming down on the ability of an individual to have free speech and make their views known. But that free speech gets amplified in a very distorted and unregulated way. That’s when we all run into terrible trouble.”

“The great claim of the companies is, ‘We don’t even really know what the algorithm’s up to,’” adds Holmes. “Well, you absolutely do, to the extent of how it’s constructed and what its mission is. And its mission has been constructed in a way to drive extreme content, because that content drives dwell time, and that dwell time drives monetisable views.”

The report also makes recommendations on electoral reform, including clearly marking online political ads and requiring greater transparency about who is bankrolling them. Media companies should also provide easily accessible online databases of political advertisers. Mozilla provided the committee with guidelines and a suggested API for such an open advertising archive.

PUBLIC INTEREST

Implementing better public education was another important recommendation. The report cited the Open Society Institute Media Literacy Index, which ranked the UK 12th out of 35 countries across wider Europe at promoting societal resilience to disinformation. The committee’s vision for digital literacy goes beyond mere technological skills: it includes education in distinguishing fact from fiction, including misinformation, understanding how digital platforms work, and knowing how to influence decision makers in a digital context.

Estonia and Finland were cited as particular successes in providing citizens with digital literacy skills. It was noted that their proximity to Russia – a known wellspring of disinformation in Europe – was one incentive for keeping their societies well informed.

The report included research from Doteveryone finding that 50% of people surveyed accepted that being online meant someone would try to cheat or harm them in some way: “They described a sense of powerlessness and resignation in relation to services online, with significant minorities saying that it doesn’t matter whether they trust organisations with their data, because they have to use them.”

Holmes notes that the very business model of the platforms might not be congruent with healthy democratic discourse. “One of the core problems is that this isn’t a question of freedom of speech. It’s a question of freedom of reach. The difficulty is that if more radical content has a greater dwell time and can drive more revenues off it, then the algorithm gets trained to hook on to that and proliferate that content. That’s what makes this different to the fake news and extreme views of the past. There’s nothing new in that. The difference is the pace and proliferation of those views.”

Puttnam also points out the need for digital platforms to support the journalism that fuels them. “There is no question that the platforms feed off the traditional news organisations. There’s a real need for some form of reciprocal relationship, where journalism gets underpinned and supported by the digital platform. That seems to me axiomatic, and I think those conversations are taking place.”

“It’s unlikely that society itself can ever build ‘herd immunity’ against lies and manipulation,” he concludes.

Read the full UK House of Lords report here: bit.ly/3fF0sPN

