Google has created a standard to semantically mark the quality of outbound links using rel attributes.
- rel="nofollow" Tells the search engine not to follow this link. Use it to link to a specific page while avoiding association with it (because the page is unreliable, untrustworthy, or unfamiliar).
- rel="sponsored" Tells the search engine this link was generated through monetary incentives, like an ad or affiliate link.
- rel="ugc" Tells the search engine the link was user-generated, and thus may be untrustworthy or spam, but also that it has organic value.
Last updated April, 2021
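A sketch of how these three rel values might appear in a page's markup (the URLs below are placeholders, not real sites):

```html
<!-- Paid placement: disclose the monetary relationship -->
<a href="https://example.com/product" rel="sponsored">our sponsor</a>

<!-- Link submitted in a comment: user-generated content -->
<a href="https://example.org/post" rel="ugc">commenter's blog</a>

<!-- A link you do not vouch for at all -->
<a href="https://example.net/dubious" rel="nofollow">dubious source</a>

<!-- Values may be combined in one attribute -->
<a href="https://example.org/spam" rel="ugc nofollow">untrusted comment link</a>
```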
LinkSpammers like ShotgunSpam flood open conversations on the Internet as a way of raising their profile on Google, MSN, Yahoo!, and other SearchEngines. Google's PageRank algorithm is famously about counting references to a page rather than keywords on it, and so it makes sense to increase the number of references to your client's page by throwing links to it as far and wide as possible on the Internet. There is an economic incentive to do this because a higher PageRank leads ostensibly to more sales or more traffic or whatever is of value.
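The incentive described above can be seen in a minimal sketch of the PageRank idea: a page's rank derives from the pages linking to it, not from its own keywords, so every extra inbound link helps. The graph, damping factor, and iteration count here are illustrative assumptions, not Google's actual implementation.

```python
# Minimal power-iteration sketch of PageRank (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base rank...
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        # ...and passes the rest of its rank along its outbound links.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# A spammer planting links from many pages to "client" inflates its rank.
graph = {
    "client": [],
    "blog1": ["client"],
    "blog2": ["client"],
    "blog3": ["client"],
}
ranks = pagerank(graph)
```

The spammed page ends up ranked above the pages that merely host the spam, which is the whole point of the exercise for the spammer.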
Thus, in the vein of NotIndexed, SixApart? and Google have proposed the simple EconomicSolution of flagging all outbound links on blogs with rel="nofollow", which tells the SearchEngine spiders to ignore these links in their data sets. Without the economic incentive, spammers in theory will stop spamming, since it's costly and pointless.
This strategy will be a failure on several grounds.
- Invisibility. Spammers will not be aware of what blogs or wikis or BulletinBoardSystems have implemented this or not. In theory they could check the page to see whether it is the case, but in practice most spammers don't deeply understand the Internet and the Web. Their objectives are simply to create links to their client's site. A blog is a blog as a wiki is a wiki. Even if savvy spammers stop, naive spammers will still keep coming.
- Partial coverage. Even if some spammers became aware that some blogs and wikis implemented this, many will not, particularly the thousands of older blogs and wikis that are GhostTowns waiting to be spammed. Thus, the incentive to spam blogs and wikis will never go away from this strategy alone as only newer or active fora will be upgraded. With a large pool of potential targets where the vast majority are worth the effort, many spammers will just hit all of them.
- DelayAction. A common solution, at least on wikis, is to make all links that have been created within a certain period of time, say two weeks, inactive in some way, such as forwarding them through an ExternalRedirector? or through a NoFollow tag. However, many abandoned GhostTowns that are not properly configured will happily let link spam lapse through this quarantine period and become live. Thus, there remains the incentive to just spam all wikis. After all, some will stick, and that's all that matters.
- Indirect action. Link spam does not have a direct and obvious impact on the final PageRank of a client site, only an indirect one that is opaque (cf. OpenProcess#badvogato). One does not know what the actual impact of one specific link is to a given SearchEngine, so one may presume it is above zero. Spammers may conclude that SearchEngines ignore the NoFollow hint just as they ignore the meta keywords they created years ago. After all, search engines rely on links to rank pages. As such, there is value in spreading LinkSpam around even in uncertain circumstances. That's all it will take to induce spam once again. Unfortunately most spammers don't take the time to see if a site is using nofollow, so there is really no effect from implementing the nofollow tag. I have found that comment moderation and time delays for posting are the best way to combat spam. [Nofollow]
- Decapitation. Blogs, wikis, and other discussion fora that form the TwoWayWeb serve a critical function as a BalancingForce to traditional power centres. As the CluetrainManifesto asserts, the discussions amongst TheAudience are more powerful than the voice of TheAuthor alone. Yet implementing NoFollow or other techniques to remove discussions from the data sets of SearchEngines decapitates the very purpose of these fora in the global battle for attention. What's the point of blogs if they do not compete for power and attention in Google? Without making outward links count, how can criticism of say [Union Carbide] ever hope to raise the Bhopal disaster to the #2 position?
- The outcome of this is not simply to say to bloggers and other amateur commentators buying service from cooperating providers or downloading software built by collaborating developers, "Retreat!" It's not just that poor third world spammers have decimated the vox populi of the rich first world latté set. The outcome is also to bias the SearchEngines towards the owners and controllers of the static web: the non-discursive, traditional power centres that have dominated the world since the Industrial Revolution. Brochureware and other non-critical messages will increasingly dominate the rankings, leaving the rest of us run over by the Cluetrain that we were supposedly on just a decade ago.
- Admittedly, bloggers will continue to make links that count in their own postings. The loss will only be in the comments. Thus, the actual effect for blogs is not so great. For wikis, however, there is no distinction between TheAudience and TheAuthor, and thus the impact is much greater.
- PageRank abuse. Due to the way that Google's PageRank algorithm works, if you want to increase your page rank, you have lots of internal links and few external links. Modify your software to nofollow all external links and your PageRank goes up. However, this invites a TragedyOfTheCommons: everyone does this, and suddenly no one is linking to anyone else.
I strongly disagree with this pessimistic view. It is left to the implementations to use the feature wisely. Wikis will of course not use nofollow for interwiki links. It can't be the job of google to tell good from bad links. If we value the impact of our links in search engines, it is our own job to check them and remove the (default) nofollow attribute again. The question is not why it can't work but how to integrate it into our AntiSpam? efforts. -- FlorianFesti?
The single strongest reason why the proposal will fail is that the spammers do not read the sites they spam, and never will. This has been demonstrated countless times, as sites have implemented spam-crippling technologies and announced them loudly to absolutely no effect. The spammers will keep spamming because they have the automatic systems to do so. Whether the spam links have NoFollow or not, they will still appear on the sites being attacked, and that is all the spammers want. And, as Sunir points out above, their attacks will always be successful on the GhostTowns. To suggest that this proposal will be a SilverBullet, as the scads of blogs TrackBacked on SixApart?'s site and other places are, is remarkably naïve. -- EarleMartin
The only silver bullet against spam is [this]. The success of the nofollow attribute can only be to limit the effectiveness of spam - to make spamming as a whole less attractive. Of course we have to continue all our other anti-spam measures. But nofollow fits in well with a lot of things we already have.
I'm half in agreement here. It's not going to hurt us to implement it. But spamming is a bulk activity which is already predicated on very little feedback. This isn't a big disincentive. -- PhilJones
``flagging all outbound links on blogs with rel="nofollow"'' - actually it flags all outbound links in *comments* with rel="nofollow"; links in posts remain unaltered. -- GeorgeHotelling?
I agree with much of the above, but when you say "to ignore these links in their data sets", is that definitely stated? I thought Google said that it would give no credit for these links to the site linked to. I have not seen Google confirm it would not, nonetheless, count these links in diluting other links on the page. I find it bizarre that Google should offer such an obvious way for websites to give a different impression to spiders than to human visitors, and I smell something fishy going on. --AndrewCates
Moved from above.
- Abuse. Take the raging competition of the arch-liberal vs. arch-conservative political bloggers. If you are busy criticizing a nemesis, you will need to link to them, but you don't want to help them at the same time. As a result, you will make their links nofollow as well. 
Surely this is not abuse, but just an alternative use. It's not the reason google provides the rel=nofollow facility, but it is a bonus. Now people can link to websites which they don't like, without google making its usual assumption that linking is positive. -- HarryWood
I think there is public value in raising the PageRank of a nemesis. If you are criticizing someone, they become a subject of public interest, and thus should be accessible. But I defer to your point and moved this bullet down into the discussion. -- SunirShah
I think a better idea would be to use semantic attributes such as "ugc" for UserGenerated? content, and let the SearchEngine do its job of ascertaining whether a link is spam or not. The "nofollow" attribute was good in the time it was introduced, but nowadays it's no longer as useful as before since we have better technology these days (it's 2021 after all). Google could for example store an ExternalLink marked as "ugc" in some database, and keep watch of that link. If it doesn't get removed from the webpage in 14 days, then that link is good and should now influence PageRank. 14 days should be enough time for good blogs to remove spam. This, I believe, is a much better approach than the indiscriminate nature of "nofollow". -- JobBautista
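The 14-day probation idea above could be sketched as follows: a crawler records when it first saw a ugc-tagged link and only lets it influence ranking once it has survived the window without being moderated away. All names here (should_count, PROBATION_DAYS, first_seen) are hypothetical illustrations, not anything Google has described.

```python
# Hypothetical sketch of a probation window for ugc-tagged links.
from datetime import datetime, timedelta

PROBATION_DAYS = 14
first_seen = {}  # (page_url, link_url) -> datetime of first crawl

def should_count(page_url, link_url, still_present, now=None):
    """Return True if a ugc link has survived probation and may rank."""
    now = now or datetime.now()
    key = (page_url, link_url)
    if not still_present:
        first_seen.pop(key, None)  # removed by moderators: forget it
        return False
    seen = first_seen.setdefault(key, now)  # record first sighting
    return now - seen >= timedelta(days=PROBATION_DAYS)
```

A link seen on day 0 would not count, the same link still present on day 14 would, and a link the blog removed in the meantime never would.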
Google may already weight nofollow links positively.