One problem with this is that Google despises sites that modify themselves for Google and Google alone (what search engines call cloaking). It's quite likely they make test probes with UserAgents that don't self-identify as Google to see whether the content changes. If it does, they may unilaterally and automatically ban the website from their search engine.
In order to LimitTemptation by not advertising our vulnerability through SearchEngines like Google, we can camouflage ourselves when a spider crawls the site by turning into a NakedWiki. In this way, we are a type of LayeredWikiInterface, only for robots. This essentially turns the 'wiki' aspect of the site into a HiddenCommunity without actually hiding the content, which makes it a superior choice to excluding all search engines via the RobotsExclusionStandard.
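For contrast, shutting out all search engines through the RobotsExclusionStandard would mean serving a robots.txt like the following, which removes the content itself from search results rather than just the editing interface:

 User-agent: *
 Disallow: /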
The primary advantage of this is EditMasking, i.e. hiding the "Edit the text of this page" link that spammers search for. The secondary advantage is that pages such as older revisions, the history interface, backlink searches, and category lists are automatically NotIndexed, simply by not being visible to the spider.
We can attempt to detect spiders by checking the UserAgent header of each request against a list of known robots. We can then drop into camouflaged mode for those requests.
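Here is a minimal sketch of that idea, assuming a Node.js environment; the signature list and the page-rendering helper are hypothetical illustrations, not any particular wiki engine's API. A request whose UserAgent matches a known robot gets the NakedWiki rendering with the edit link omitted.

 import * as http from "http";

 // Substrings commonly found in crawler UserAgent strings. Deliberately
 // incomplete: a real deployment would maintain a fuller list.
 const BOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "crawler", "spider"];

 function looksLikeSpider(userAgent: string | undefined): boolean {
   if (!userAgent) return true; // a missing UserAgent is itself suspicious
   const ua = userAgent.toLowerCase();
   return BOT_SIGNATURES.some((sig) => ua.includes(sig));
 }

 // Hypothetical renderer: appends the edit link only when we are not
 // in camouflaged (NakedWiki) mode.
 function renderPage(title: string, body: string, camouflaged: boolean): string {
   const editLink = camouflaged
     ? ""
     : `<p><a href="/edit/${title}">Edit the text of this page</a></p>`;
   return `<html><body><h1>${title}</h1><p>${body}</p>${editLink}</body></html>`;
 }

 http
   .createServer((req, res) => {
     const camouflaged = looksLikeSpider(req.headers["user-agent"]);
     res.writeHead(200, { "Content-Type": "text/html" });
     res.end(renderPage("FrontPage", "Welcome to the wiki.", camouflaged));
   })
   .listen(8080);

Note that UserAgent matching only catches robots that identify themselves honestly, which is exactly why the Google test probes described above are a risk.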
Another option is to use Wiki:ProgressiveEnhancement to display the edit link only when client-side script runs. However, this has the disadvantage of not being accessible to text-only browsers. -- EarleMartin
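A minimal sketch of that alternative, again in TypeScript: the served HTML contains no edit link at all, and the script inserts one after the page loads. Most spiders do not execute script, so they never see it. The element id and edit URL scheme here are assumptions for illustration.

 // Runs in the browser after the page has loaded.
 document.addEventListener("DOMContentLoaded", () => {
   const footer = document.getElementById("page-footer"); // assumed placeholder element
   if (!footer) return;
   const editLink = document.createElement("a");
   editLink.href = "/edit/" + encodeURIComponent(document.title);
   editLink.textContent = "Edit the text of this page";
   footer.appendChild(editLink);
 });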