Google admits that Plato’s cave doesn’t exist: Cory Doctorow has an interesting piece here on Google’s role as a gatekeeper to the Internet.
Google is under increasing pressure to change the way it ranks search results. Earlier in June, Recording Industry Association of America chief executive Cary Sherman told the US Congress that Google should be required to place “legitimate” sites at the top of the list when its users search for musicians and music.
Presumably, Sherman would prefer that the “illegitimate” sites – whether that’s the Pirate Bay or some other site – not be returned in the rankings at all.
I find this fascinating, for the same reason that I’ve always found the idea of journalism as a gatekeeper for news fascinating. A decade or two ago, before the Net became a thing, we were essentially at the mercy of the major news organizations to tell us what to think about. If something didn’t get reported by a newspaper (local or national), or the evening news, you weren’t likely to hear about it. In this sense, the news organizations (especially bureaus like AP) held enormous power — they held the keys to the “gate” and decided what we would hear about and what we wouldn’t.
With the rise of the Internet, and the ability for citizens to report on things, this impact has lessened. But now the gatekeeping function has shifted to things like Google, and its ability to interpret what we want and show us what is most relevant…but judged by whom?
Relevance — like newsworthiness — is wholly dependent on the perspective of the receiver. Something is relevant on a sliding scale depending on who is evaluating it, and you can only make judgements about something’s “relevance” if you are absolutely sure who is reviewing it.
(This is why I love conversations about “media bias.” Biased on behalf of whom? To say media or news is “biased” often just means “it doesn’t appeal to me,” with no consideration that there are likely other people to whom it does appeal. Do they think it’s biased?
This article from the Nieman Journalism Lab at Harvard presents solid evidence that we “detect” bias when we associate ourselves with a group, then consume news in the context of that group. Bias is also more prevalent when we know the source of the news than when the news is anonymously sourced. Someone on the left is going to “detect” bias from Fox News, regardless of what they report.)
So, Cary Sherman of the RIAA wants only “legitimate” sites returned in Google results? Who decides what is legitimate? If I’m Mr. Sherman, then The Pirate Bay is certainly not. But if I’m a 14-year-old looking for Lady Gaga’s new album and not wanting to pay for it, then The Pirate Bay most certainly is legitimate. Now we’re headed down a slippery slope of judging the moral worthiness of various sources, and Google is not going to have this conversation (remember Google’s position on searches for the word “Jew”?).
Right after reading Doctorow’s piece, I ran into this article from the Nieman Lab about how hard it is to build good news filters.
What makes a “good” filtering algorithm? There’s no easy answer.
How can there be? The question of who should see what when is not like the question of how far the moon is from the Earth, or what the mayor ate for breakfast today, or whether Elvis is still alive. Those questions have simple answers that are either right or wrong. But choosing who sees what information is not like that at all: There’s no one answer, just many possible visions of the type of world we’d like to live in.
Google, for its part, has been trying to get around this problem for years by developing personalized filters. This means that if you and I search for the same thing, we might get different results. Google knows so much about us that it could conceivably profile us to the point where it can filter results to put them in the order that we would want, to the exclusion of everyone else. So, Mr. Sherman could see results from the artist and their label, and our 14-year-old Gaga fan would see The Pirate Bay as the top result.
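To make the idea concrete, here is a toy sketch of personalized re-ranking. This is purely illustrative — Google’s actual system is proprietary, and all of the names, tags, and profile weights below are invented for the example. The only point is that the same result set, scored against two different user profiles, comes back in two different orders.

```python
def personalize(results, profile):
    """Re-rank results by the sum of the user's weights for each result's tags.

    `results` is a list of dicts with a "tags" list; `profile` maps a tag
    to how much this user cares about it. Unknown tags score zero.
    """
    def score(result):
        return sum(profile.get(tag, 0.0) for tag in result["tags"])
    return sorted(results, key=score, reverse=True)


# Two hypothetical search results for "lady gaga album".
results = [
    {"url": "label-store.example", "tags": ["official", "retail"]},
    {"url": "piratebay.example", "tags": ["torrent", "free"]},
]

# Two hypothetical user profiles: a rights-holder and a teenager.
executive = {"official": 1.0, "retail": 0.8}
teenager = {"free": 1.0, "torrent": 0.9}

print(personalize(results, executive)[0]["url"])  # label-store.example
print(personalize(results, teenager)[0]["url"])   # piratebay.example
```

Same query, same index, opposite top results — which is exactly the property that makes personalized filtering both appealing and, as Pariser argues, worrying.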
Can Google do this? Time will tell, but if this appeals to you, then I invite you to read Eli Pariser’s masterpiece “The Filter Bubble,” which makes a great case that personalized filters harm us all in the long run.