
Google worship has gone too far. The latest prayer to the pretender to God-like omniscience comes from Wired editor Chris Anderson (and if it drums up enough controversy, it's bound to end in a book deal). He argues that we should give up on the allegedly outmoded maxim that "correlation is not causation," because now we're in the "Petabyte Age" and we can manipulate so much data that we can solve our problems without having to understand them.

The new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world. Correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all. There's no reason to cling to our old ways. It's time to ask: What can science learn from Google?

The problem here is that if we stop asking the question "why?" then we are basically laying the foundations of faith. You can always make statistics say nearly anything you want; it simply depends on the assumptions you make when you analyze and present them. While Google's search algorithms are the best currently available, they are not infallible — if they were, then Google wouldn't have the advertising business that Anderson speaks so highly of, as people would find what they were looking for in the natural results.
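To see how easily correlation alone misleads, consider a minimal sketch (all variable names and numbers here are illustrative, not from Anderson's essay): generate a few hundred columns of pure random noise and check how strongly the "best" one correlates with an equally random target. With enough variables, an apparently impressive correlation shows up by chance, with no causation anywhere in sight.

```python
# Sketch: mine 200 columns of pure noise for the strongest correlation
# with a random target. Any "signal" found is a statistical accident.
import random

random.seed(0)
n_rows, n_cols = 50, 200

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, hand-rolled for clarity."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A random "outcome" and 200 random "predictors" -- nothing causes anything.
target = [random.gauss(0, 1) for _ in range(n_rows)]
columns = [[random.gauss(0, 1) for _ in range(n_rows)] for _ in range(n_cols)]

best = max(abs(pearson(target, col)) for col in columns)
print(f"strongest 'correlation' found in pure noise: {best:.2f}")
```

Trawl a petabyte instead of 200 columns and the accidental correlations only get stronger — which is exactly why "why?" still matters.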

It's a typical technocratic argument that privileges the rough trade of applied science over the pansy practice of theory. Applied science can be commercialized, and therefore profitable — pure, theoretical science much less so. The thinking goes that markets, those ruthlessly efficient arbiters of quantifiable value, don't need a priori hypotheses to make their judgments, so let's leave the thinking to the machinations of mathematics and simply guess at the intent of the black box.

But the implication that you can simply toss aside causation is specious sophistry. When you stop asking "why" and only ask "what" and "how much," you're bound to lose your grip on strict rationality. As Schrödinger's famous thought experiment illustrates, the very act of measurement can affect the outcome. Anderson should be careful what he wishes for — by putting faith in the invisible hand without modeling possible outcomes, we will get what the algorithm calculates we deserve, whether we like it or not.