FEED Issue 04

GENIUS INTERVIEW: Francesca Tripodi

of how what we search on really matters and how the keywords we put into Google might unintentionally reaffirm our existing beliefs – and in some cases, take those beliefs to a more extreme place.

FEED: Google has become the world's major portal for accessing information. What did your research reveal about its use for getting accurate news?

FT: What I argue in the section of my report called 'Googling for Truth' is that Google as a platform can be really great for finding the local pizza place or a historical date, but what worries me is that people are turning to it to find information about who they should vote for and what news they should be paying the most attention to. They think that Google is a neutral purveyor of information, but users don't understand how what you put into Google dramatically alters what you get out. For example, in the Virginia election for governor there was a campaign ad that accused the Democratic candidate Ralph Northam of squandering $1.4 million on a fake Chinese company. If you Googled 'Northam fake Chinese company', you got returns from The Washington Post, an editorial in the Richmond, Virginia newspaper, and FactCheck.org. But when you added '1.4 million', the returns were dramatically different – you got a link to Americans for Prosperity, who funded the ad, and to the Republican Governors Association. What this says is that if you know your audience – and you know, for example, that conservatives are more concerned about fiscal responsibility – then, through search engine optimisation and tying that '1.4 million' to your site, you can shift those search results. Ideological beliefs can be tied to very simple changes in Google search terms. I observed the same thing with the NFL protests. If you type 'NFL ratings up' or 'NFL ratings down', the results will reaffirm your ideological position. What worries me is that people have become so fed up with news that they don't trust it, so they are blindly turning to Google as a source of trusted information, without examining how Google is a corporate media company that also has a bottom line. At a time when we're being taught to be very critical of media, we're not applying that same critical lens to sources like Google.

IT SCARES ME HOW CLOSED OFF WE SEEM TO BE FROM ANY OPINION THAT DIFFERS FROM OUR OWN

FEED: Are search algorithms leading us to more extreme content?

FT: I think there is a network of personalities in conservative media that dabble in content that is much more extreme. By having certain guests on your shows who are then guests on other – much more radical – shows, you are algorithmically connected to those more radical points of view. On YouTube's 'Up Next' you are more likely to appear beside them. That can help create these rabbit holes where viewers are being exposed to more radicalising thinking. In Safiya Noble's book, Algorithms of Oppression, there's a great conversation about (convicted mass murderer) Dylann Roof Googling 'black on white crime'. Google has now fixed it – if you Google 'black on white crime', you get links back explaining how it is not actually a real thing. But prior to Dylann Roof's murder of nine African-Americans in a place of worship, when you Googled 'black on white crime', the top link was the Council of Conservative Citizens, which was actually

