Evgeny Morozov thinks search engines should "flag" sites that promote nutty conspiracy theories. It's a nutty idea.
FORTUNE — There is a lot of garbage and nonsense on the Internet. There will always be a lot of garbage and nonsense on the Internet. And, thanks to our understandable wish to clean up garbage when we see it, there will always be misguided calls to do something about it.
Not that nothing can be done. Google’s GOOG algorithms, imperfect as they sometimes are, help keep fraudulent and spammy links out of the upper reaches of search results, for example. But what about “dangerous” ideas, or those that are just wrongheaded, nutty or insulting?
Evgeny Morozov (his Twitter bio: “There are idiots. Look around.”) believes that Google and other search engines should slap warning labels on links to such ideas. The Internet has evolved, he laments at Slate, “with no or little quality control.” This has yielded some good outcomes, like Wikipedia and Twitter. “But it has also spawned thousands of sites that undermine scientific consensus, overturn well-established facts, and promote conspiracy theories.”
It’s not clear that any facts have actually been overturned, but I’ll give Morozov a bit of rhetorical leeway on that one. His problem is with the nuts: 9/11 Truthers; people who believe, despite overwhelming scientific evidence to the contrary, that vaccines cause autism; disbelievers in evolution; global-warming deniers; and the like.
Except, when are they “the like”? This is where Morozov’s proposal falls apart. The Truthers are pretty clearly misguided, as are the anti-vaccine people. And, yes, most of the global-warming deniers. Global warming (“climate disruption” is a better term) is clearly real. But does that mean we should lump all critics of the current scientific consensus together, and together with 9/11 Truthers and Holocaust deniers? Like many truths, the truth of climate disruption is complex. There is consensus among legitimate scientists that it’s happening, but that consensus isn’t total. There are plenty of valid disagreements over the severity of the problem and its potential negative consequences, and even more over what, if anything, should be done about it. Some of the critics who point out or exploit these disparities are labeled “deniers,” whether or not the appellation actually fits. Sometimes it doesn’t.
How is Google supposed to go about making such fine distinctions? And on a more basic level, since when is Google supposed to be in the business of deciding what is truth and what isn’t? Even if it were technically possible, it’s not Google’s place to tell us that we shouldn’t take seriously Jenny McCarthy and her anti-vaccine army, any more than it is Google’s place to decide which sites are and are not engaged in digital piracy (again, it’s not always clear).
Morozov writes that by flagging questionable links, Google would be advising users to “exercise caution and check a previously generated list of authoritative resources before making up their minds.”
But those sources are right there, in the Google results, along with links to all the nutty stuff. People can choose to believe either the lone blogger with the garish Web site or the experts who know what they’re talking about. It’s hard to envision a person given to believing in conspiracy theories being persuaded to conduct further study by a Google “red flag.”
Google does do filtering — of the above-mentioned fraud and spam, as well as of porn (in the latter case, the user decides whether to apply the filter). It is able to push some of the bad stuff lower in the search rankings, making more room on top for the good stuff. It does this largely through the collective wisdom of the Internet itself — what gets linked to the most, for instance. But that is far different from deciding which ideas are “dangerous” or “wrongheaded” and which are “legitimate.” Google tries as hard as it can to represent the Internet as it is, not the Internet as Evgeny Morozov thinks it should be.