QUOTE(Alison @ Tue 30th March 2010, 4:51pm)
Is there any reason, Daniel, why this hasn't happened yet? I can't see any advantage to not having it in there and, like you say, adding it to the robots.txt is pretty trivial. Is there some political reason, or some Google-related reason, why this hasn't happened yet? We've all seen attack articles and other junk get 'stuck' in the cache for weeks on end, so this would seem a priority. Am I missing something here?
It doesn't work in robots.txt; it has to go in the <HEAD> of every single page:
CODE
<HTML><HEAD>
<META NAME="ROBOTS" CONTENT="NOARCHIVE">
...
</HEAD>
But that's still trivial, and all major search engines recognize and respect it.
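For what it's worth, the same directive can also be delivered as an HTTP response header (X-Robots-Tag: noarchive), which Google honors as well, so a site can apply it everywhere without editing page templates. Here's a minimal sketch in Python/Flask; the app and route are placeholders, purely for illustration:
CODE
# Illustrative only: a tiny Flask app that adds "X-Robots-Tag: noarchive"
# to every response, the HTTP-header equivalent of the NOARCHIVE meta tag.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Placeholder page; in practice this would be any page on the site.
    return "<html><head><title>Example</title></head><body>Hello</body></html>"

@app.after_request
def add_noarchive(response):
    # Runs on every response the app sends out.
    response.headers["X-Robots-Tag"] = "noarchive"
    return response

if __name__ == "__main__":
    app.run()
The effect is the same as the meta tag above: the page can still be crawled and indexed, but the "Cached" link goes away.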
I suspect that Wikipediots fear that Google will be unhappy with Wikipedia if Wikipedia starts doing this on every page. Maybe so, but normally it has zero effect on rankings. And if it did affect rankings, one could argue with more conviction that there is no viable opt-out from the cached copy, and therefore make a case that it's a copyright issue, because Google has copied and republished the entire page.
I've been doing it on all my sites for nearly ten years, and I don't see any effect on rankings.
Here's a typically uninformed discussion of a failed proposal along these lines on Wikipedia. It's not NOCACHE, for example; it's NOARCHIVE, as someone pointed out after much silly discussion.
And here's an example from 2007 of why you may not want a "Cached" link on Wikipedia articles:
Daddy, daddy, I found George Washington on Wikipedia!