Google’s motto is “Don’t be evil.” It has been well debased by now: agreeing to help China censor the internet relied on a non-existent distinction between “don’t be evil” and “don’t assist evil.” I’m not ready to call Google’s looming truth algorithm “evil,” but it is certainly sinister and dangerous.
Google’s search engine rose to dominate the field by using the number of incoming links to a web page to determine where it appears in search results. Pages that many other sites link to are ranked higher. “The downside is that websites full of misinformation can rise up the rankings, if enough people link to them,” says New Scientist.
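The link-counting idea described above can be sketched in a few lines. This is a deliberately simplified toy, not Google’s actual algorithm (real PageRank also weights each incoming link by the rank of the page it comes from):

```python
# Toy link-based ranking: pages with more incoming links rank higher.
# A simplification of the idea described above, not Google's actual
# PageRank, which also weights links by the linking page's own rank.
from collections import Counter

def rank_by_inlinks(links):
    """links: iterable of (source_page, target_page) pairs.
    Returns pages sorted by incoming-link count, highest first."""
    inlink_counts = Counter(target for _, target in links)
    return [page for page, _ in inlink_counts.most_common()]

# Hypothetical pages, purely for illustration.
links = [
    ("blog-a", "hoax-site"),
    ("blog-b", "hoax-site"),
    ("forum",  "hoax-site"),
    ("paper",  "reference-site"),
]
# "hoax-site" outranks "reference-site" on link volume alone,
# illustrating the downside New Scientist describes.
print(rank_by_inlinks(links))  # ['hoax-site', 'reference-site']
```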
Now a Google research team is altering the system to measure the trustworthiness of a page, rather than its web popularity. Instead of counting incoming links, the proposed new system would count the number of “incorrect” facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team. Each page will get its computer-determined Knowledge-Based Trust score, which the software will derive by tapping into Google’s Knowledge Vault, a repository of what Google claims is Absolute Truth based on web consensus. Web pages that contain contradictory information will be bumped down the rankings, so fewer minds will be warped by non-conforming information.
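As described, the proposed scoring reduces to checking a page’s extracted facts against a reference store and trusting pages whose facts mostly match. A minimal sketch of that idea follows; the fact format, the vault’s contents, and the scoring formula here are illustrative assumptions, not Google’s actual Knowledge Vault implementation:

```python
# Toy "Knowledge-Based Trust" score: a page's trustworthiness is the
# fraction of its extracted facts that agree with a reference store.
# The vault contents, triple format, and formula are all assumptions
# made for illustration, not Google's implementation.

KNOWLEDGE_VAULT = {
    ("earth", "shape"): "round",
    ("water", "boiling_point_c"): "100",
}

def trust_score(page_facts):
    """page_facts: list of ((subject, predicate), value) pairs
    extracted from a page. Returns the fraction of checkable
    facts that match the vault (0.0 if none are checkable)."""
    checkable = [f for f in page_facts if f[0] in KNOWLEDGE_VAULT]
    if not checkable:
        return 0.0
    correct = sum(1 for key, value in checkable
                  if KNOWLEDGE_VAULT[key] == value)
    return correct / len(checkable)

good_page = [(("earth", "shape"), "round"),
             (("water", "boiling_point_c"), "100")]
bad_page  = [(("earth", "shape"), "flat"),
             (("water", "boiling_point_c"), "100")]
print(trust_score(good_page))  # 1.0
print(trust_score(bad_page))   # 0.5
```

The pages with lower scores would then be demoted in the rankings, which is exactly where the article’s concern lies: the “truth” being checked against is itself derived from web consensus.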
Naturally, the Left, assuming that its view of the universe is the unassailably correct and virtuous one, loves this idea. That should put those “climate change denialists” in their places: at the bottom of web searches. Says Salon, which never met a conservative argument that wasn’t a lie (NEVER met? Oh, oh. There goes Ethics Alarms down the search results!), “Even though the former program is just in the research stage, some anti-science advocates are upset about the potential development, likely because their websites will become buried under content that is, well, true.”