An article about the effect blogs have on Google's rankings. The author is upset that a term, "the second superpower", is being redefined: via Google's rankings, its original meaning (world public opinion) is being displaced by a blog and an article, both written by James Moore. He equates this with the original meaning being erased and overwritten with a new one by a small list of A-list bloggers (there is no cabal). Fair enough. Google's rankings are vulnerable to manipulation through links; we've known that at least since the first Google bombs. But what does that mean for terms and their meanings?
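To make the link vulnerability concrete: PageRank scores a page by the rank flowing in from the pages that link to it, so a handful of coordinated links can lift a newcomer past an established page. A minimal power-iteration sketch (the toy graph, page names, and damping factor are my own illustrative assumptions, not anything from the article):

```python
# Minimal PageRank via power iteration on a toy link graph. The graph,
# page names, and damping factor are illustrative assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page receives a share of the rank of every page linking to it.
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }
    return rank

# "Bombing" a term: a few blogs all point their links at the same page.
graph = {
    "original": ["a"],
    "a": ["original"],
    "blog1": ["bomb"],
    "blog2": ["bomb"],
    "blog3": ["bomb"],
    "bomb": ["blog1"],
}
ranks = pagerank(graph)
print(sorted(ranks.items(), key=lambda kv: -kv[1]))
# "bomb" ends up ranked above "original" purely through link structure.
```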
I think the author overshoots the mark by treating Google as a meaning-defining authority. Google is first and foremost an automaton, built on arbitrary algorithms and operated by a company that wants to make money by embedding advertising and selling its technology.
Just because Google is currently the most widely used search engine (is it, even?) doesn't automatically make it meaning-defining. It is not a team of experts scientifically analyzing queries. It is not even a system built on democratic consensus. It is simply a set of algorithms that compute a relevance score for each result.
As Kasparov's matches against chess computers show, correctly evaluating a chess position is extremely difficult even in a highly deterministic environment like chess. And evaluating positions relative to the positions that arise from a series of moves is all a chess program does.
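That "evaluate positions reachable through a series of moves" routine is minimax. A bare-bones sketch on an invented take-away game (the game, depth, and scoring are illustrative assumptions; chess engines share the shape, but their leaf evaluation is a fallible heuristic rather than an exact rule, which is the whole difficulty):

```python
# Bare-bones minimax: score a position by looking at the positions that
# arise from a series of moves. The toy take-away game is invented for
# illustration; chess programs have the same shape, but there the static
# leaf evaluation is a fallible heuristic, not an exact rule.

def moves(pile):
    # Each player removes 1, 2, or 3 tokens; whoever takes the last one wins.
    return [t for t in (1, 2, 3) if t <= pile]

def evaluate(pile, maximizing):
    # Static leaf evaluation from the maximizer's point of view. This toy
    # game happens to have an exact rule (pile % 4 == 0 loses for the side
    # to move); in chess, this function is where all the uncertainty lives.
    score_for_side_to_move = -1 if pile % 4 == 0 else 1
    return score_for_side_to_move if maximizing else -score_for_side_to_move

def minimax(pile, depth, maximizing):
    if pile == 0:
        # The previous player took the last token, so the side to move lost.
        return -1 if maximizing else 1
    if depth == 0:
        return evaluate(pile, maximizing)
    results = [minimax(pile - t, depth - 1, not maximizing) for t in moves(pile)]
    return max(results) if maximizing else min(results)

print(minimax(10, 4, True))  # 1: the side to move can force a win from 10.
```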
The contents a search engine indexes, by contrast, are human-authored documents with all the usual problems: ambiguity, irony, typos, deliberate lies, and whatever else people come up with. How is an automaton supposed to reliably rate the relevance of a document given the user's query and the available alternative documents? It can't. There are only approximate solutions. And these, necessarily, since nothing else exists, combine technical and structural signals, weighted according to predetermined rules.
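What "weighted according to predetermined rules" can look like in practice, as a toy sketch: the signals, weights, and link counts below are entirely invented, but the shape, a fixed linear combination of crude, meaning-blind signals, is the point:

```python
# Toy relevance scorer: a fixed, meaning-blind linear combination of
# signals. The signals, weights, and link counts are invented; real
# engines use far more signals, but the principle is the same.

import math

# Weights fixed in advance by the engine's operators, not derived from
# any understanding of the query.
WEIGHTS = {"term_frequency": 1.0, "title_match": 2.0, "inbound_links": 0.5}

def score(query, doc):
    """doc: dict with 'title', 'body', and an 'inbound_links' count."""
    terms = query.lower().split()
    body = doc["body"].lower().split()
    signals = {
        # Textual signal: how often query terms occur, relative to length.
        "term_frequency": sum(body.count(t) for t in terms) / max(len(body), 1),
        # Structural signal: query terms appearing in the title.
        "title_match": sum(t in doc["title"].lower() for t in terms),
        # Popularity signal: dampened count of pages linking here.
        "inbound_links": math.log1p(doc["inbound_links"]),
    }
    # The automaton never asks what the document means; it just sums.
    return sum(WEIGHTS[name] * value for name, value in signals.items())

docs = [
    {"title": "World public opinion",
     "body": "world public opinion is the second superpower",
     "inbound_links": 30},
    {"title": "The Second Superpower Rears Its Beautiful Head",
     "body": "blogs and emergent democracy as the second superpower",
     "inbound_links": 300},
]
for doc in docs:
    print(round(score("second superpower", doc), 2), doc["title"])
```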
The automaton does not grasp meaning or content. An automaton therefore simply cannot define what an expression means.
The only thing Google-washing proves is that Google's algorithms are vulnerable. Nothing more and nothing less. And that anyone who wants a broader, more balanced mix of information would be well advised to use meta-search engines. In the end, this is probably just someone who is annoyed at having relied too heavily on a single piece of technology.
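The meta-search idea is mechanically simple: fan the query out to several engines and merge the ranked lists. A sketch using reciprocal-rank fusion, with canned stand-in fetch functions, since every real engine needs its own API client (all names here are hypothetical):

```python
# Meta-search sketch: fan one query out to several engines and merge the
# ranked result lists with reciprocal-rank fusion. The fetch functions are
# stand-ins returning canned results; real engines each need their own
# API client.

def fetch_engine_a(query):
    return ["example.com/1", "example.com/2", "example.com/3"]

def fetch_engine_b(query):
    return ["example.com/2", "example.com/4", "example.com/1"]

def metasearch(query, engines, k=60):
    """Merge ranked lists; a URL scores sum(1 / (k + rank)) over engines."""
    scores = {}
    for fetch in engines:
        for rank, url in enumerate(fetch(query), start=1):
            scores[url] = scores.get(url, 0.0) + 1.0 / (k + rank)
    # URLs found by several engines float to the top of the merged list.
    return sorted(scores, key=scores.get, reverse=True)

print(metasearch("second superpower", [fetch_engine_a, fetch_engine_b]))
```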

I'll spare myself an assessment of the article's occasionally rather apocalyptic conclusions, as well as the faint whiff of paranoia that shines through.
Here's the original article.