MeatballWiki | RecentChanges | Random Page | Indices | Categories

One basic strategy of LinkSpam is to (robotically) search Google for editable wikis. This can be done either by directly searching for the edit LinkText (e.g. 'Edit text of this page.') or for wikis in general. Once a wiki is found and its type is detected, a robot can then spam the site with engine-specific code. Indeed, it is a very good strategy to write a robot to do exactly this, as wikis are fairly homogeneous. In nature, any homogeneous population is easily preyed upon by parasites.

One option is to follow a similar strategy to the one found in nature. By evolving a level of biodiversity in the identifying markers, it becomes difficult for parasites to find and exploit a species. For instance, humans have a variety of blood types, making it difficult for some parasites to infect all of us, and thus making it less cost-effective to evolve a strategy of infecting human blood. Similarly, we can rotate the LinkText and action URIs either by site or by session.

For instance, using UseModWiki as an example, rather than "Edit text of this page." you could display something unique to your site, like "Edit this page's text." This will make it harder to find your wiki via Google:Edit+text+of+this+page. Additionally, you could defeat robots by changing the edit action from MeatBall:action=edit to something random like MeatBall:action=tide.
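As a minimal sketch of the per-site variant, the engine could translate site-specific action aliases back to its real actions at dispatch time. The alias names ('tide', 'vase') and function names here are illustrative, not part of any real WikiEngine:

```python
# Site-specific alias table, configured once by the administrator.
# An alias like 'tide' stands in for the engine's real 'edit' action,
# so MeatBall:action=tide is what appears in the page's links.
ACTION_ALIASES = {
    "tide": "edit",
    "vase": "save",
}

def resolve_action(requested):
    """Translate a site-specific alias back to the engine's real action.

    Returns None for unrecognized actions -- including the engine's
    standard names, so a robot using action=edit gets nowhere.
    """
    return ACTION_ALIASES.get(requested)
```

A robot built against the stock engine would request action=edit and receive nothing, while links generated by this site would use the aliases and work normally.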

One disadvantage of this strategy is that good software written to take advantage of a stable MachineInterface to your WikiEngine will not work on your site. However, if the changes are stable, you might be able to adapt this software. Alternatively, you could provide a separate login for known good scripts, although this creates a PricklyHedge to using these kinds of tools.

A more generic solution along the same lines is a SearchEngineCloak.

Randomly generated EditMasks

If these changes are stable, so they appear the same for all site visitors for a long period of time, you still risk the possibility that a given spammer will adapt their robot to attack your wiki specifically, particularly if your site enjoys a high PageRank. An alternative solution frequently proposed is to randomly generate new edit and post actions and LinkText for each session. To understand this, consider if you stored the actions in a table with a timestamp, e.g.

 random  action  timestamp
 asdf    edit    110023032
 jklm    save    110023042

For any unknown action, you could check whether it appears in this table; if so, you would run the intended action in its place. Also, retire all entries older than, say, 24 hours to keep the table fresh.
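The table and lookup above can be sketched as follows. This is an illustrative implementation, not from any real WikiEngine; the function names and the in-memory dict standing in for the table are assumptions:

```python
import secrets
import time

TTL = 24 * 60 * 60   # retire entries older than 24 hours
_table = {}          # random token -> (real action, timestamp)

def issue_action(real_action, now=None):
    """Generate a fresh random alias for a real action and record it."""
    now = time.time() if now is None else now
    token = secrets.token_hex(4)      # e.g. 'a3f1b2c4'
    _table[token] = (real_action, now)
    return token

def resolve_action(token, now=None):
    """Look up an unknown action; return the intended action, or None."""
    now = time.time() if now is None else now
    # Keep the table fresh: drop anything older than TTL.
    for t, (_, stamp) in list(_table.items()):
        if now - stamp > TTL:
            del _table[t]
    entry = _table.get(token)
    return entry[0] if entry else None
```

Each page render would call issue_action('edit') and embed the returned token in the edit link, so the link changes on every session and a cached copy expires within a day.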

To do this successfully, you would have to rotate the LinkText as well, in some way that still makes it possible for the would-be editor to find and click on the edit link. Presumably it will include the word 'edit', and thus be easy to find. Even if it didn't, if only one link on the page randomly changes every time you load the page, that is obviously the link to use. You could pepper the page with 'edit' links, but this would only confuse the human editors. You could hide the link text, but a would-be robot could detect hidden links. You could use images, and 0-pixel images for the hidden text, but again a robot could detect this. You could simply bury the bogus links in deep space, say hundreds of screens down, but the robot need only look for the edit link exactly where the user would expect it. The same goes for creating many spurious edit forms on the way to saving.

Also, it is possible to search for Google:RecentChanges or Google:inurl:RecentChanges. For completeness you'd also need to rotate common page titles, which would break inbound links and bookmarks.

In short, this strategy will not work. You cannot make the UserInterface totally random gibberish for normal people just to defeat the spammer. One could use Javascript to de-gibber the interface once it's in the user's browser, but that prevents a significant category of viewers from using the site (namely: those with older or textual browsers, those who are visually impaired, and those who hate Javascript).
