DeepLinkDefense


Some competing website is DeepLinking to your site. Perhaps it's using some semi-automated method like NearLinks or InterWiki links. Perhaps it's done manually. Often, you'd welcome those deep links, but sometimes (as discussed on DeepLink) you may not.

Therefore, implement a DeepLinkDefense:

  1. read the referer information (in a Perl CGI script: $Referer = $ENV{HTTP_REFERER});
  2. match the referer URL against a filter list;
  3. redirect matching requests to the FrontPage, an explaining page, or a 404 page (see the sketch below).
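
A minimal sketch of these three steps as a Perl CGI fragment. The blocked host list, the redirect target, and the variable names are illustrative assumptions, not code from any particular wiki engine:

    #!/usr/bin/perl
    # DeepLinkDefense sketch: check HTTP_REFERER against a filter list
    # and bounce matching requests to the FrontPage.
    use strict;
    use warnings;

    my @blocked = ('supersite.example.com', 'competitor.example.org');  # filter list (made up)
    my $Referer = $ENV{HTTP_REFERER} || '';                             # step 1: read the referer

    foreach my $host (@blocked) {                                       # step 2: match against the list
        if ($Referer =~ /\Q$host\E/i) {
            print "Status: 302 Found\r\n";                              # step 3: redirect elsewhere
            print "Location: http://www.example.com/cgi-bin/wiki.pl?FrontPage\r\n\r\n";
            exit;
        }
    }

    # No match: fall through and serve the requested page as usual.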

Example: AOL blocked links from LiveJournal prior to launching their own blogging service [1].

But this will not work perfectly, because referer information is not supplied reliably: browsers can be configured not to send it, and people can bounce off a redirecting HTML page on another website, using JavaScript or an HTTP redirect, to disguise where they came from.

Untargeted defense

A simpler alternative is to make your URLs unstable by embedding a session key in them. After the session times out, the URLs become worthless, and you can redirect or 404 them as discussed above. But this is quite a broad measure: it also breaks bookmarking.
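
As a toy illustration of the idea (not any particular wiki's implementation), the "session key" below is just the time the link was generated; links older than an hour bounce to the FrontPage. A real implementation would use a random, server-stored key, and the names and one-hour timeout here are assumptions.

    #!/usr/bin/perl
    # Unstable-URL sketch: every link carries a key that expires.
    use strict;
    use warnings;
    use CGI qw(param);

    my $timeout = 3600;                    # assumed session lifetime: one hour
    my $key     = param('session') || '';  # e.g. wiki.pl?page=SomePage;session=1699999999

    if ($key !~ /^\d+$/ || time() - $key > $timeout) {
        # Expired or missing key: the deep link is worthless, redirect to the front page.
        print "Location: http://www.example.com/cgi-bin/wiki.pl?FrontPage\r\n\r\n";
        exit;
    }

    # Otherwise serve the page, stamping a fresh key into every outgoing link.
    my $fresh_link = "wiki.pl?page=SomePage;session=" . time();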

Javascript and frames

It is possible to create a site that serves its content inside frames and puts JavaScript on each content page to check that it is being displayed within the appropriate frameset; if it is not, the user can be redirected to the front page. Such defences only work in JavaScript-enabled browsers.

Referer based views

Rather than redirecting somewhere else, you could also show a different view of the same content. For example, if someone links to you from "Supersite", you could change the normal links at the top of the page to read "Welcome, supersiters", and link to a page explaining the difference between your site and Supersite. You might also display the content with fewer features: on a wiki you might disable the EditThisPage link, for example.
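
A sketch of such a referer-based view, again with made-up host, page, and variable names. Instead of redirecting, the script varies the banner and drops the edit link when the visitor arrives from "Supersite":

    #!/usr/bin/perl
    # Referer-based view sketch: same content, different dressing.
    use strict;
    use warnings;

    my $referer        = $ENV{HTTP_REFERER} || '';
    my $from_supersite = ($referer =~ /supersite\.example\.com/i);

    # Swap the banner and suppress the EditThisPage link for Supersite visitors.
    my $banner    = $from_supersite
        ? '<p>Welcome, supersiters! See <a href="wiki.pl?HowWeDiffer">HowWeDiffer</a>.</p>'
        : '';
    my $edit_link = $from_supersite
        ? ''
        : '<a href="wiki.pl?action=edit;page=SomePage">EditThisPage</a>';

    print "Content-Type: text/html\r\n\r\n";
    print "<html><body>$banner\n<!-- page content here -->\n$edit_link</body></html>\n";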


The users of the blocked site may view this as an attack.

Some sites block links from Slashdot as a basic guard against being SlashDotted.

See also InvoluntaryTransclusion.


Discussion

Deep links are a core feature of the WorldWideWeb. Anyone who does not want URL references to their site should just publish their stuff as PDF or MS Word files, or use an appropriate <META> tag to keep it from being indexed by search engines. --MarioSalzer?

Meta tags and/or the RobotsExclusionStandard are effective ways to avoid being indexed by search engines, and typically superior to referrer-checking. However, they do not help when defending against deep links from places other than search engines. PDF and similar file formats do provide a robust defense against deep links, but they are an excessive response to what is normally a minor problem.


