Google is a significant force in the dissemination of information, and that translates into power. The most ethical use of that power is no use at all: just give us a way to find what’s on the web, and let us do the filtering, thanks. As you probably know, Google has the credo “Don’t be evil,” a three-word invitation to controversy. What does Google regard as “evil,” exactly? Its Code of Conduct Preface explains:
“Googlers generally apply those words to how we serve our users. But “Don’t be evil” is much more than that. Yes, it’s about providing our users unbiased access to information, focusing on their needs and giving them the best products and services that we can. But it’s also about doing the right thing more generally — following the law, acting honorably and treating each other with respect.”
Good. That seems exactly right: unbiased access to information. Two recent situations, however, have raised questions about how unbiased Google really is.
The Climategate Autosuggestion Cover-up
“Climategate” is the hackneyed term many in the media and the blogosphere have attached to the developing controversy swirling around e-mails and other materials hacked from the University of East Anglia, some of which suggest that prominent climate change researchers suppressed data and dissent that might weaken the perception of “consensus” on global warming. When one attempted to search for news items or other web posts about “Climategate” by typing the term into the Google search box, the engine’s autosuggestion feature acted as if it had never heard of the term. This was first noticed by blogger Terry Hurlbut yesterday, almost a week after the term became ubiquitous on the web: once you entered “Climategate,” there were millions of references. Since the autosuggestions are generated by a computer program, there is no reason for such a lag. The autosuggestion “climategate” did not appear even when the word was completely typed in—“Climate Guatemala” and “Climate Guatemala City” came up instead—until after noon E.S.T. today. Was Google trying to suppress information that might fuel skepticism about global warming? Hurlbut seems to think so. If he’s wrong, Google should explain how.
Popehat (yes, they are on a roll over there) reported that for about a week, the results of a Google image search for “Michelle Obama” would prominently display an offensive image of the First Lady looking as if she lived in the “Planet of the Apes” White House. Then Google took it off the site. It does not remove other offensive caricatures, however. For example, altered photographs portraying George W. Bush as an ape or a cretin have appeared in Google image searches for years, as have other offensive race-based images. The anti-Muslim cartoon that caused riots in Denmark appears in Google searches, even though many U.S. publications (shamefully) refuse to publish it. Why is Michelle Obama accorded special protection? If Google is going to pick and choose what images it considers too offensive (To the “non-evil” ideologies? To the “non-evil” political party? To fans of a non-evil political figure?), what standards is it using?
Both of these episodes may have innocent explanations. Still, Google bears watching. If it is not going to adhere to its own Ethics Code, if it cannot be depended upon not to skew its search results according to political preferences and biases, then it cannot be trusted.
If it cannot be trusted, then it is time to find a new search engine.