MeatballWiki | RecentChanges | Random Page | Indices | Categories

Search engine cloaking is when a website serves a different version of a page to a search engine spider than it does to human visitors. This is done either to hide something from the search engine or to feed it different information. SEO companies often use it to make their pages look useful to users while hiding their methods from competing SEO companies.

One problem with this approach is that Google penalizes sites that modify themselves for Google and Google alone. It quite likely makes test probes with UserAgent?s that don't self-identify as Google to see whether the content changes, and it may unilaterally and automatically ban an offending website from its index.

Cloaking to create a HiddenCommunity?

In order to LimitTemptation by not advertising our vulnerability through SearchEngines like Google, we can camouflage ourselves when a spider crawls the site by turning into a NakedWiki. In this way, we are a type of LayeredWikiInterface, only for robots. This essentially turns the 'wiki' aspect of the site into a HiddenCommunity? without actually hiding the content, which makes it a better choice than excluding all search engines via the RobotsExclusionStandard.

The primary advantage of this is EditMasking; i.e. hiding the "Edit the text of this page" link that spammers search for. The secondary advantage is that pages like older revisions, the history interface, backlink searches, category lists, and so on are automatically NotIndexed, simply by never being exposed to the spider.
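EditMasking might be sketched like this. The function name, link markup, and the spider flag are all hypothetical illustrations, not any particular wiki engine's API; the point is simply that the edit and history links are only emitted for human visitors.

```python
def render_page(title, body_html, spider):
    """Render a wiki page; omit edit links and revision navigation
    when the request looks like it came from a search engine spider."""
    parts = ["<h1>%s</h1>" % title, body_html]
    if not spider:
        # Only human visitors see the edit link and revision history,
        # so spiders index a NakedWiki with nothing for spammers to find.
        parts.append('<a href="?action=edit">Edit the text of this page</a>')
        parts.append('<a href="?action=history">View other revisions</a>')
    return "\n".join(parts)
```

Because the history and backlink pages are only reachable through the links stripped here, they become NotIndexed as a side effect, with no robots.txt rules needed.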

We can attempt to detect spiders by, for example, matching the request's UserAgent? header against a list of known crawler names, or by noting which clients fetch /robots.txt.

We can then drop into camouflaged mode.
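A minimal sketch of the UserAgent check, assuming a hand-maintained signature list (the names below are illustrative; a real deployment would keep a fuller, updated list and might also verify crawler IP addresses):

```python
# Hypothetical substrings seen in common crawler UserAgent strings.
SPIDER_SIGNATURES = ["googlebot", "bingbot", "slurp", "crawler", "spider"]

def is_spider(user_agent):
    """Return True if the UserAgent header looks like a search engine
    spider, in which case the wiki drops into camouflaged mode."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in SPIDER_SIGNATURES)
```

Note the caveat above: Google is believed to probe with UserAgent?s that don't self-identify, so this check catches honest spiders only and must not be the site's sole defense.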

Another option is to use Wiki:ProgressiveEnhancement so that the edit link is only displayed client-side. However, this comes with the disadvantage of not being accessible to text-only browsers. -- EarleMartin
