http://en.wikipedia.org/wiki/Wikipedia:Sea...efault_proposal
QUOTE
Wikipedia currently (as of mid-February 2009) allows all search engines to "cache" its pages. That is, if a search engine like Google happens to crawl a page, any inappropriate or "bad" content, including WP:BLP violations, may be propagated out onto the Internet for an indeterminate amount of time. However, we have the ability to mark Wikipedia's pages "noarchive" (often informally called "NOCACHE"); note that this is set via a robots meta tag in the page HTML or an HTTP header, since robots.txt itself has no caching directive. The major benefit of this is that search engines would only report the "current" state of an article (or any page) at any given time.
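For reference, a minimal sketch of how such a directive is typically applied, assuming the standard robots meta tag honored by major search engines (this is illustrative, not Wikipedia's actual configuration):

```html
<!-- Placed in each page's <head>: asks compliant crawlers not to
     store or serve a cached copy of the page. The equivalent HTTP
     response header is: X-Robots-Tag: noarchive -->
<meta name="robots" content="noarchive">
```

Pages marked this way would still be indexed and appear in search results; search engines would simply stop offering their stored snapshot of the page.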
At least once, a fairly prominent BLP article was vandalized with racial epithets, which the world's search engines then cached.[1] A vandal replaced the entire BLP article with three epithets.[2] According to Wikipedia as mirrored on search engines, we were now referring to the BLP subject as "NIGGA".[3] The edit was reverted less than two minutes later, but the damage was done.[4]
That was one of the most-watched BLP articles we've ever had--what chance do the hundreds of thousands of lesser-known BLP articles have? The idea behind this proposal is to protect not just BLPs, but the integrity of all our articles, from being cached with bad information, even temporarily.