Therefore, implement a DeepLinkDefense:
A simpler alternative is to make your URLs unstable: embed a session key in each URL. After the session times out, the URLs become worthless, and you can redirect or 404 them as discussed above. This is quite a blunt measure, though, since it also disables bookmarking.
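A minimal sketch of the session-key approach, assuming an in-memory session store and a hypothetical `s` query parameter (real sites would use their framework's session machinery):

```python
import secrets
import time

SESSION_TTL = 30 * 60  # 30 minutes; an illustrative timeout
_sessions = {}  # session key -> creation time (in-memory store for this sketch)

def new_session_url(path):
    """Issue a URL that embeds a fresh session key."""
    key = secrets.token_urlsafe(16)
    _sessions[key] = time.time()
    return f"{path}?s={key}"

def check_session(key, now=None):
    """Return True if the key is known and unexpired.
    Requests with expired or unknown keys get redirected or 404'd."""
    now = time.time() if now is None else now
    created = _sessions.get(key)
    return created is not None and now - created < SESSION_TTL
```

A deep link copied from the address bar carries a session key that soon expires, so it stops working for anyone who follows it later.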
Rather than redirecting somewhere else, you could also show a different view of the same content. For example, if someone links to you from "Supersite", you could change the normal links at the top of the page to read "Welcome, supersiters", and link to a page explaining the difference between your site and Supersite. You might also display the content with fewer features: on a wiki you might disable the EditThisPage link, for example.
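The alternate-view idea can be sketched by dispatching on the HTTP Referer header; the host name "supersite.example" and the returned view dictionary are hypothetical:

```python
from urllib.parse import urlparse

BLOCKED_HOSTS = {"supersite.example"}  # hypothetical referring site

def view_for_referrer(referrer):
    """Choose a page variant based on the HTTP Referer header.
    Visitors arriving from a blocked host see a greeting banner and
    lose the EditThisPage link, instead of being redirected away."""
    host = urlparse(referrer or "").hostname or ""
    if host in BLOCKED_HOSTS:
        return {"banner": "Welcome, supersiters", "edit_link": False}
    return {"banner": None, "edit_link": True}
```

Note that the Referer header is optional and easily spoofed, so this should degrade gracefully: visitors with no referrer get the normal view.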
The users of the blocked site may view this as an attack.
Some sites block links from Slashdot as a basic guard against being SlashDotted.
See also InvoluntaryTransclusion.
Deep links are a core feature of the WorldWideWeb. Anyone who does not want URL references to their site should just publish their content as PDF or in MS Word format, or use an appropriate <META> tag to avoid being indexed by search engines. --MarioSalzer?
Meta tags and/or the RobotsExclusionStandard are effective ways to avoid getting indexed by search engines, and typically superior to referrer-checking for that purpose. They do not help, however, when defending against deep links from other sites. PDF and similar file formats provide a robust defense against deep links, but are an excessive response to what is normally a minor problem.
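For reference, the two indexing defenses mentioned above look like this (the `/private/` path is illustrative):

```
<!-- In the page's <head>: ask crawlers not to index or follow -->
<meta name="robots" content="noindex, nofollow">

# In /robots.txt at the site root (RobotsExclusionStandard):
User-agent: *
Disallow: /private/
```

Both are purely advisory: well-behaved crawlers honor them, but they place no obstacle in front of a human following a deep link, which is why they are no substitute for the referrer-based measures above.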