1. v.,n. [From the Usenet group alt.folklore.urban] To utter a posting on Usenet designed to attract predictable responses or flames; or, the post itself. Derives from the phrase ‘trolling for newbies’ which in turn comes from mainstream ‘trolling’, a style of fishing in which one trails bait through a likely spot hoping for a bite. The well-constructed troll is a post that induces lots of newbies and flamers to make themselves look even more clueless than they already do, while subtly conveying to the more savvy and experienced that it is in fact a deliberate troll. If you don't fall for the joke, you get to be in on it. See also YHBT.
2. n. An individual who chronically trolls in sense 1; regularly posts specious arguments, flames or personal attacks to a newsgroup, discussion list, or in email for no other purpose than to annoy someone or disrupt a discussion. Trolls are recognizable by the fact that they have no real interest in learning about the topic at hand — they simply want to utter flame bait. Like the ugly creatures they are named after, they exhibit no redeeming characteristics, and as such, they are recognized as a lower form of life on the net, as in, “Oh, ignore him, he's just a troll.” Compare kook.
Trolls generate emotions in others while investing none of their own. In real life you cannot escape the emotions of your peers: if they are angry, you are physically insecure, so there is a BalancingForce; in CyberSpace, however, you can feel safe. Additionally, electronic communication is known to amplify emotions. These two features combine to make trolling so destructive. When analyzing this phenomenon, you must also take into account that very angry people have their thinking under an AngryCloud, so a troll with a cool mind always has the advantage over the angry community members.
Trolls are of two kinds: trolls by intent, which are made purposely, and trolls by result, which were not intended as trolls in the first place.
Sometimes trolls feel that they serve a community purpose by shaking things up, stirring up discussion, playing the DevilsAdvocate, or being the CourtJesters, but they really don't. They are a perversion of all those purposes. Nonetheless, you are never going to get rid of them. Wherever there's a crowd there's an attention seeker; it's human nature. The only known way to avoid a troll's effect is not to bite the bait in the first place. It is often believed that withholding attention will rid you of trolls (cf. DissuadeReputation). While this does help with some trolls on a local scale, it changes nothing on the global scale, and it is sometimes even counted as a victory by the troll's author. To deal efficiently with trolls by intent, one has to understand what trolling is, how it works, why it works, and moreover what makes the person behind it do it.
Trolls are often insane. They are created when the insane jump from one community to another looking for a home; after a while they learn that they will only get attention if they attack.
Their most common strategy on a wiki is to start a ForestFire. Collective sanity must prevail.
A good troll is like a bad driver. They're never in an accident, but they see a lot in their rear view mirror.
Baker, P. (2001). Moral panic and alternative identity construction in Usenet. Journal of Computer-Mediated Communication, 7(1). Retrieved January 23, 2004 from http://www.ascusc.org/jcmc/vol7/issue1/baker.html
Bruckman, A. (1994). Approaches to managing deviant behavior in virtual communities. Proceedings of Computer-Human Interaction '94, Boston, Massachusetts, April 24-28.
Ciffolilli, A. (2003). Phantom authority, self-selective recruitment and retention of members in virtual communities: The case of Wikipedia. First Monday, 8(12). Retrieved March 10 from http://firstmonday.org/issues/issue8_12/ciffolilli/index.html
Collins, M. (1992). Flaming: The relationship between social context cues and uninhibited verbal behavior in computer-mediated communication. Retrieved March 10 from http://www.emoderators.com/papers/flames.html
Dibbell, J. (1998). A rape in cyberspace (Or TINYSOCIETY, and how to make one). Retrieved March 10 from http://www.juliandibbell.com/texts/bungle.html
Donath, Judith S. (1999). Identity and deception in the virtual community. In M. A. Smith and P. Kollock (eds.), Communities in cyberspace. (pp. 29–59) London: Routledge.
Herring, S., Job-Sluder, K., Scheckler, R., and Barab, S. (2002). Searching for safety online: Managing ‘trolling’ in a feminist forum. The Information Society, 18, 371–384.
Miller, M. (1990, February 8) FOADTAD. Message posted to news:alt.flame. Retrieved January 25, 2004 from http://groups.google.com/groups?selm=131460%40sun.Eng.Sun.COM&oe=UTF-8&output=gplain
The On-line Hacker Jargon File, version 4.4.7. (2003a). Retrieved January 25, 2004 from http://www.catb.org/~esr/jargon/html/S/September-that-never-ended.html
The On-line Hacker Jargon File, version 4.4.7. (2003b). Retrieved January 25, 2004 from http://www.catb.org/~esr/jargon/html/T/troll.html
Phillips, D. J. (1996). Defending the boundaries: Identifying and countering threats in a Usenet newsgroup. The Information Society, 12, 39–62.
Phillips, D. J. (2002). Negotiating the digital closet: Online pseudonymity and the politics of sexual identity. Information, Communication & Society, 5(3), 406–424.
Suler, J. (1997). The bad boys of cyberspace: Deviant behavior in online multimedia communities and strategies for managing it. In J. Suler, The psychology of cyberspace. Retrieved January 21, 1999 from http://www.rider.edu/users/suler/psycyber/badboys.html
Suler, J. (2003). Life at The Palace: A cyberpsychology case study. In J. Suler, The psychology of cyberspace. Retrieved March 10 from http://www.rider.edu/~suler/psycyber/palacestudy.html (article orig. pub. 1996)
Van Gelder, L. (1996). The strange case of the electronic lover. In C. Dunlop and R. Kling (eds.), Computerization and controversy: Value conflicts and social choices. (2nd ed.) (pp. 533–546), San Diego, CA: Academic Press. [AlexAndJoan]
Alexander, C. (1977) A pattern language. Oxford: Oxford University Press. [PatternLanguage]
Google (2001). Google acquires Usenet discussion service and significant assets from Deja.com. Retrieved March 10 from http://www.google.com/press/pressrel/pressrelease48.html
Leuf, B. and Cunningham, W. (2001). The Wiki way: Quick collaboration on the Web. Boston: Addison-Wesley Longman. [TheWikiWay]
Although the earliest reference to trolling in the Google Usenet archive was Miller (1990, February 8), trolling has only recently been described in the literature, initially by Donath (1999), who used several anecdotal examples from various Usenet newsgroups in her discussion. She provides a concise overview of what trolling actually means socially to an online community:
However, since Donath (1999), only Baker (2001) has studied another Usenet case, that of a homosexual secretly playing the homophobe in the newsgroup news:alt.tv.melrose-place, and Herring, Job-Sluder, Scheckler, and Barab (2002) have described a Usenet-like feminist discussion forum invaded by an anti-feminist troll. Earlier literature has described troll-like behaviour without using this common term, preferring the more generic term for online disputes, flaming (Collins, 1992; Bruckman, 1994; Phillips, 1996; Suler, 1997). Finally, there are the two canonical cases of online drama, AlexAndJoan (Van Gelder, 1996) and MrBungle (Dibbell, 1998). It is surprising that such a common aspect of online life remains under-theorized in the literature.
Most of the existing discussion centres around Usenet (RFC 1036) or other similar discussion boards. The Usenet model of discussion is simply a series of time-ordered messages by individual authors, broadcast to all other authors. Each message is formed much like a memorandum, with author, date, subject, message body, and signature. Messages are sent out to newsgroups, which are how Usenet catalogues discussions. Newsgroups are arranged in a hierarchy, with top-level names like soc (social, society, sociology), rec (recreation), sci (science), and comp (computing), which are controlled by a central authority on Usenet, and the fully open alt (alternative) hierarchy, where anyone can create a group. Newsgroups are syndicated around the world by means of decentralized relay passing: when one posts a message to a newsgroup, it is uploaded to one's server, which copies it to its neighbour servers, which in turn copy it to their neighbours, until the message has crossed the entire network. This greatly reduces anyone's ability to control a newsgroup, although it is possible to create ‘moderated’ newsgroups in which a designated person must approve each and every message posted to the group. Newsgroups can only become moderated when they are created; otherwise, they remain open for anyone to post to for the rest of their existence. Finally, within newsgroups, messages are grouped by their subjects into threads, ordered first by date and most likely in a hierarchy reflecting which message replied to which (Udell, 2001).
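The memorandum-like article structure described above can be sketched in a few lines. This is only an illustration of the header/body/signature shape of a netnews article, not a real newsreader; the header names follow the RFC 1036 convention, but all values here are invented.

```python
# Sketch of an RFC 1036-style Usenet article: memorandum-like headers,
# a body, and an optional signature separated by the conventional "-- ".
# All addresses and values are hypothetical examples.
from email.message import EmailMessage

def make_article(author, newsgroups, subject, body, signature=""):
    """Compose a netnews-style article from its memorandum parts."""
    msg = EmailMessage()
    msg["From"] = author
    msg["Newsgroups"] = ",".join(newsgroups)  # several groups = cross-posting
    msg["Subject"] = subject
    msg["Date"] = "Mon, 08 Feb 1990 12:00:00 GMT"  # fixed for reproducibility
    text = body + ("\n-- \n" + signature if signature else "")
    msg.set_content(text)
    return msg

article = make_article("user@example.edu", ["alt.folklore.urban"],
                       "trolling for newbies", "Just a test.", "A. Nonymous")
print(article["Newsgroups"])  # → alt.folklore.urban
```

In the real network, such an article would then be relayed from server to neighbouring server until every node carries a copy, which is why no single party can suppress it.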
The structure of Usenet affords the would-be troll many opportunities to harangue an existing community. For one thing, participation in a Usenet newsgroup is completely open, given its decentralized nature. Anyone can read any newsgroup provided their connection to the network gets a copy of it. Further, for newsgroups that are not moderated, anyone can post to the newsgroup at will; there is no way to block them at the source. This means that a community based in a newsgroup must tolerate any troublemaker who enters its midst. One solution that arose to combat this problem was the KillFile (Donath, 1999), which allows each reader to single out an author to ignore. However, this must be done on a per-reader basis. Newcomers to the newsgroup will not know to do this, and they may be susceptible to the troll's advances.
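The killfile mechanism is easy to sketch: each reader keeps a private list of authors to ignore, and filtering happens only on that reader's side, never at the source. The article representation below is a toy stand-in, not any real newsreader's API.

```python
# A minimal per-reader killfile, as described above. Nothing is blocked
# at the source: every article still reaches every server; each reader
# merely hides matching authors locally. All names are illustrative.
killfile = {"troll@example.com"}

articles = [
    {"from": "alice@example.edu", "subject": "Re: pet care FAQ"},
    {"from": "troll@example.com", "subject": "You are all wrong"},
]

visible = [a for a in articles if a["from"] not in killfile]
print([a["from"] for a in visible])  # → ['alice@example.edu']
```

This also makes the weakness obvious: the `killfile` set exists per reader, so a newcomer with an empty set still sees everything the troll posts.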
Also, significantly missing from Usenet and other discussion boards is an authenticated letterhead or other conventional signs of status (Collins, 1992) or identification (Donath, 1999). Social context cues are more implicit, such as the reputation of the author over time, or the content of the message body. Donath (1999) suggests that the domain names, or originating hosts, of the authors can play a significant role in how a message is perceived. For instance, posting from a .mil domain would lend more credibility on the subject of warfare than posting from a .com domain. The most disparaged domain is aol.com, as America Online (AOL) users have routinely demonstrated ‘culture clash’ (Suler, 1997), whereby they behave inappropriately in other communities (Donath, 1999). Donath further points out that before the commercialization of the Internet in 1993, users were at least identifiable through their institutions (such as universities and government agencies), yet afterwards, when users could subscribe to commercial throwaway accounts, they could be perfectly anonymous. This afforded levels of identity deception unheard of before (although it was still possible), and thus burgeoned a flood of disruptive individuals, popularly known as TheSeptemberThatNeverEnded (TJF, 2003a). Commercial developments since then, like anonymous proxies (Donath, 1999) and Zero Knowledge's Freedom (Phillips, 2002), have made such anonymity even easier.
However, Usenet has one advantage for research that many other highly controlled, closed, private, commercial spaces do not. In 1995, the DejaNews? service began archiving and indexing all of Usenet. Later, DejaNews? was purchased by Google (2001), who extended the archive even further back. While previously available newsgroup history was limited to at most a few weeks, the archive has given researchers the ability to dig through months of newsgroup postings to unobtrusively tease apart the story of a particular conflict in that space. Moreover, conflicts can be found by searching the entire Usenet archive for key words. This explains why research into trolling is so recent: there simply has not been much material available to base an analysis on.
Of course, there have previously been accounts, but these have been highly personal and idiosyncratic. For instance, the 1982–1983 saga of AlexAndJoan from the CompuServe? discussion forums (which were structured very similarly to Usenet, except that they were owned and managed by one private commercial company) was recounted (Van Gelder, 1996) by a reporter for Ms magazine in a rhetorical, narrative style, as she tried to describe her emotional response to being betrayed. In this story, Alex, a shy Jewish psychiatrist from New York in his fifties, pretends to be a highly bombastic, anti-religious, wheelchair-bound, mute woman named Joan, disabled after a car accident, initially in order to better relate to his female patients. After two years, ‘Joan’ had constructed quite an array of deep emotional relationships, which only began to fall apart after ‘Joan’ coaxed an online friend of hers into an affair with Alex. Written from the point of view of the article's author, Van Gelder, the account can only give the reader a feel for the sense of betrayal she felt. As she writes,
Similarly, the 1993 story of LambdaMOO?'s MrBungle, as recounted by Dibbell (1998), is highly idiosyncratic, rhetorical, and narrative. In this story, we hear how a pseudonymous individual, Mr. Bungle, found a way around the conventional LambdaMOO software to ‘take control’ of other players' bodies, like a voodoo doll, and then forced them to say sexually provocative things. On LambdaMOO, the immediate vocabulary for this was ‘rape’. The community responded with anger, debating the nature of their created reality, until finally an administrator had to step in to ban Mr. Bungle. Yet, while the account is highly captivating, it was journalistic in nature.
As mentioned above, Donath (1999) was the first academic study of Usenet trolling. She uses cross-case analysis dating back to events in 1995, the beginning of the DejaNews? archive, to build a picture of what trolling is. She describes how it can be understood as identity deception, with community responses coming as a form of reintroducing the truth, although she does describe some wider strategies, like the killfile. Nonetheless, Donath's analysis was shallow, mostly selecting specific examples for her wider theoretical argument. Thus, it was more a philosophical argument than an attempt to derive new theory.
Baker (2001) also used the DejaNews? archive, but he went into one case in great depth. In particular, he examined a significant number of messages related to the troll, 150 in total, from news:alt.tv.melrose-place, from January to May 1996. This was not a complete sample of the entire conversation, as the archive did not have every single message, but it reflected a large enough proportion of the discussion to draw conclusions from. To further give a picture of the community in the newsgroup, he conducted a survey of the group, to which he received 280 respondents. The survey asked for simple demographic data, such as age, sex, location, education level, and sexual orientation. The latter question was particularly significant since the conflict in question was about the infamous ‘gay kiss’ that never aired on the series Melrose Place; the troll, Macho Joe, pretended to be a deeply homophobic individual, to the chagrin of the rest of the newsgroup, who proceeded to attack him for months. When Macho Joe was finally outed as possibly a homosexual pretending to be deeply homophobic, the thread ended. What marks this study in particular is that Baker managed to interview the troll himself. The final analysis was a breakdown of the community's response strategies to Macho Joe based on evidence from the data sample. From these strategies, Baker concluded that the underlying theory explaining the phenomenon was a case of moral panic: the community did not want to admit its own homophobia, so it attacked Macho Joe to maintain its own elitist superiority.
The most inspired of all three studies on trolling, however, was Herring, Job-Sluder, Scheckler, and Barab (2002). They similarly studied one particular case in great detail. Although this involved a private discussion forum, not Usenet, the circumstances were similar. A feminist discussion board was trolled by a male presenting himself as a feminist, yet radically attacking the very premises of feminism. What made his assault most problematic was that feminism has a history of creating safe spaces for women to talk, but also open spaces that anyone can join, and the troll played the two philosophies against each other for several months until an administrator stepped in to ban him. The important approach here was that Herring et al. used grounded theory methods (Strauss & Corbin, 1998; Glaser & Strauss, 1967) to build a coding system for the discussion, from which they deduced a variety of interaction behaviours, which shaped their discussion.
Not all trolling, or studies of trolling, revolves around the Usenet/discussion board model. One of the best examinations of trolling and other deviant behaviour online comes from Suler's (1997) analysis of ThePalace? (http://www.thepalace.com), a graphical chat environment. The Palace is effectively a two-dimensional chat room, showing a background picture with avatars floating in front, much like desktop wallpaper with icons floating on top. The Palace has many ‘rooms’ that one can enter, which effectively look like many different backgrounds with different people in them. The primary differences between The Palace and Usenet are that The Palace is real-time and synchronous versus the highly asynchronous nature of Usenet (messages could take up to a week to arrive); The Palace was two-dimensional and graphical, whereas Usenet was merely text; readers are visible on The Palace; and The Palace had an active presence of wizards with the power to police the environment as they saw fit. While Suler writes again from personal experience, his intent as a psychologist is to describe a very detailed taxonomy of deviant behaviour and responses at The Palace. Unique problems include having an offensive image as one's icon, waving one's icon all around the screen, and drawing offensive graffiti in the graphical space. Interesting solutions include pinning a user's icon to one corner of the screen if they have a habit of dragging it all around, substituting foul words as users speak (e.g. ‘hell’ becomes ‘thank you’: “What the thank you are you doing”), and creating special rooms with more relaxed social norms for those who feel the need to express themselves. However, Suler emphasizes a number of social solutions as well, such as training the wizards in proper conflict mediation. This fits with Bruckman's (1994) model of online regulation, which describes how a full arsenal of responses should be categorized:
Basic model for online regulation (Bruckman, 1994).
|Authority / Modality||Technological||Social|
|Decentralized||Gagging, killfiles||Feedback from peers|
|Centralized||Banishment, account suspension||Feedback from administrators|
Given the differences between The Palace and Usenet, it stands to reason that other online media will have different trolling patterns and responses. One unique such medium is the wiki. Wikis are communally editable websites, where every word on every page can be modified by each and every person (Leuf & Cunningham, 2001). They are consequently highly collaborative spaces, where no content is explicitly controlled by any one person. Wikis were first implemented in 1995 as the WikiWikiWeb by Ward Cunningham, as a backend to the Portland Pattern Repository. Influenced by Christopher Alexander's (1977) A Pattern Language, wikis are architecturally heavily cross-linked hypertexts, where each node represents one concept (e.g. a pattern). They are identifiable not only by their universal editability, but by their simple "page name is link name" equivalence.
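The "page name is link name" equivalence can be sketched with a regular expression: any CamelCase word in the text is simultaneously a link to the page of that name. The pattern below is a common approximation of the convention, not the exact WikiWikiWeb rule.

```python
import re

# Approximate CamelCase wiki-link rule: two or more runs of a capital
# letter followed by lowercase letters make a word into a page link.
WIKI_LINK = re.compile(r"\b(?:[A-Z][a-z]+){2,}\b")

text = "See AssumeGoodFaith and the KillFile page, but not plain words."
links = WIKI_LINK.findall(text)
print(links)  # → ['AssumeGoodFaith', 'KillFile']
```

Because the page name and the link syntax are identical, writing a page's name anywhere automatically cross-links it, which is what produces the heavily cross-linked hypertext described above.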
Wikis are critically different from discussion boards. First, they are not just asynchronous, but potentially vastly asynchronous. Since all text is editable at all times, one can return to a page edited five years ago and resume the conversation where it left off. Second, the structure is not chronological but a hypertext (a graph), arranged as the authors choose to structure the text. That is, anyone can make a link to anywhere at any time, and that is the only way to relate pages. This makes the entire wiki ‘flat’ in the sense that all pages are on ‘top’ at all times, waiting to be edited, unlike newsgroups, where conversations from several years ago are effectively dead.
Since all pages are editable by anyone at any time, information can be organized by anyone at any time. This keeps the wiki from drowning under information overload, and it also helps avoid repeating the same conversation over and over. For trolls who like to rehash old arguments as their chief means of instigation, this offers an opportunity to silence them.
Naturally, the openness of wikis invites vandalism. However, wikis also have a revision history allowing any editor to undo anyone else's vandalism. Since there are in principle more good people than bad, this should average out in the end (Ciffolilli, 2003). In particular, the exasperation people feel on Usenet and discussion boards trying to get their fellow community members to stop responding to the troll can be better directed at simply deleting the troll's contributions directly. This undermines the chief power in previous models of trolling, whereby the troll could act without intervention from the wider community, and only administrators could step in to stop him. That is, everything done on a wiki must be negotiated with the community.
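The revision-history mechanism that makes reverts cheap can be sketched as a toy page object: every edit is retained, so any editor, not just an administrator, can restore an earlier revision. The class and method names are invented for illustration; real wiki engines store diffs and metadata as well.

```python
# Toy revision history: every edit keeps the old text, so vandalism is
# undone by appending an earlier revision as the newest one. Any editor
# can do this; no administrator is required. Names are hypothetical.
class WikiPage:
    def __init__(self, text=""):
        self.revisions = [text]        # full copies; real wikis may store diffs

    def edit(self, new_text):
        self.revisions.append(new_text)

    @property
    def current(self):
        return self.revisions[-1]

    def revert(self, to=-2):
        """Restore an earlier revision (default: the one before current)."""
        self.edit(self.revisions[to])  # the revert is itself a new revision

page = WikiPage("Trolls are of two kinds.")
page.edit("ALL YOUR PAGE ARE BELONG TO US")  # vandalism
page.revert()                                # any passing editor undoes it
print(page.current)  # → Trolls are of two kinds.
```

Note that the revert is itself an ordinary edit, so even the act of undoing vandalism remains visible and negotiable in the page's history.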
In the world of television, "Brass Eye" is a classic trollish program: http://news.bbc.co.uk/1/hi/uk/1460805.stm
How is Brass Eye trollish? It's satire. --anon.
It puts out programs (such as the paedophilia special) designed to attract predictable responses or flames (such as the culture secretary condemning it and later having to admit that she hadn't seen it). Humour isn't incompatible with trolling, provided it's happening to someone who you don't care for (such as politicians or GodKings).
Moreover, it isn't satire unless it seeks to bring about improvement in its victim through aggravation. Trolls do not seek to change things. They only seek to aggravate.
Herm. OK. I'm glad I've seen it, but it still befuddles me. I don't get trolls. I never have, and I don't really want to. I first saw a list-killer troll in 1991, and while I could see what he was doing, I never got it. Why do trolls troll? -- DaveJacoby
Well, I am a student of the Tao and of Discordia, I prefer chaos over order, I prefer Japanese Negamayki to Chinese Tso, I would rather have 2 in the bush than one in the hand, and of course, I would rather have a bottle in front of me, than a frontal lobotomy. Any More Questions?
Well, there's a few different kind of trolls. The true meaning of troll, before it got distorted by all of that commercialization, err, spamming, is to have fun with the people who reject certain opinions out-of-hand. There's too many zealots out there, and if you can't get rid of them (or make them a bit more open minded), then you might as well have fun with them. -- 11223 of KuroShin
If we define trolling as saying something just to get a reaction, then I find the best way to keep my moderation scores high on KuroShin is to troll just a little bit. Provocative, rhetoric-laden posts get higher scores than informative ones on average. I don't particularly like that. I suspect that if identity was divorced from contribution, trolling would mostly disappear. At worst, it would be limited to bad graffiti. With a wiki, that kind of thing can be deleted. Of course, I troll WikiWiki quite frequently because I find the Wiki:ExtremeProgramming people are in a dangerous cycle of GroupThink that damages the overall quality and community of the site. Maybe I'm trying to provide a Wiki:ZenSlap. Or maybe I just react badly to people who peddle the "One True Way". -- SunirShah
The problem comes in when the distinction between "troll" and "spammer" begins to gray. Spammers produce garbage that is entirely offtopic and disruptive, leaving nothing that could be taken as positive. Trolls, on the other hand, usually have a methodology and/or ideology behind their work, whether it be to press an agenda, reveal flaws in a system through subversive means, or merely provoke vitriol. I optimistically tend to view [some] trolls as akin to counter/subculture artists such as the ObeyGiant? movement. -- ChrisAinsworth
I'd say a Troll is someone who isn't engaging in discourse for the usual reasons. They are not seeking to inform or be informed. They betray us when we AssumeGoodFaith. I don't mind people being controversial, or challenging my basic assumptions. To some extent I enjoy it and like to think that I can meet such challenges. We have insights so old they are commonplace and we forget how things could be different. A Troll isn't trying to provoke deep thought, though. They are like children ringing doorbells and running away. Yes, people come and answer the door when you do that, but so what? Most Trolls I've seen get reasoned responses and I don't see why provoking a reasoned response is "having fun" if there's no intent to learn or teach from it.
I'm not so sure that the JargonFile definition is so good. There are quite a number of different types of trolls and
First step is realising you have a troll on your hands, and making sure everyone knows it. Unfortunately, simply calling someone a troll based on one or two messages usually isn't fruitful, especially if the troll wasn't blatant. Fortunately, some trolls end up leaving little messages behind, and a quick search on Google or Deja turns up all sorts of examples of past trolling in other forums. Always fun when that happens.
Usually they are not fun:
Sometimes there's some confusion between trolls and flamers (WhatIsaFlamer?). If someone continually needles me, saying things they know that I'll find provocative, and cause me to start arguments, they're a troll. If someone continually insults me, calling me names, questioning my mother's pedigree, they're a flamer. Flamers tend to have temper issues, so they respond to slight (even imaginary) provocation, which of course makes them a perfect target for trolls. In any fight between a troll and a flamer, you can tell the troll from the grin on his/her face. Flamers typically don't enjoy such exchanges — they prefer to flame newbies, who have a softer shell.
A community high in both flamers and trolls will die fairly quickly. A community high in flamers will generate a lot of heat, but without trolls to fan the flames, the heat will die out as quickly as it started. A community low in both trolls and flamers is very pleasant and constructive, but some might find it a bit tame and prone to GroupThink. A community high in trolls generates a lot of interesting and controversial content, but doesn't represent mainstream thoughts and subjects very well.
To illustrate the difference between low-flamer low-troll, and low-flamer high-troll, consider a mailing list devoted to a subject which is largely uncontroversial, but has elements that are highly controversial. For example, caring for one's pet. A low-troll list will spend a lot of time discussing important and practical elements of petcare, and might well generate a great FAQ on the subject, and other symbols of productive discussion. A high-troll list will touch on the same issues, but will spend a disproportionate amount of time discussing bestiality, whether keeping pets at all is a breach of animal rights, cloning, etc. Without the presence of flamers, the high-troll list will be polite and courteous, but it will also be wildly disproportionate to the overall topic.
I'd postulate a LifeCycle?, starting at F- T-
An excellent test for flamers is to put them in a place full of trolls. Similarly, an excellent test for trolls is to put them in a place full of flamers. Could this be exploited? Also, could we exploit the tendency of trolls to bind to flamers by making both harmless? -- MartinHarper
Speaking of exploiting it, I've been thinking about the point of the Young Liberal message boards in the grand scheme of political life in Canada. Nothing of merit ever happens there. It's just a lot of flaming, posturing, trolling, crushing, and the rest of the range of bullshit behaviour that the Young Liberals are infamous for. Asking some prominent Young Liberals why they participate, all I could get was that they just want to know what's going on with the other prominent Young Liberals, and to see the gossip, or whatever. That really didn't make a lot of sense — it sounded more like a rationalization — but now I think it serves as a necessary outlet for the peacocks (and peahens) to troll and flame each other while the rest of us ignore them. Literally that's what it's for. Otherwise, they would take their need to hurt each other into more mainstream CommunicationChannels, which would be disruptive to the rest of us. I think for any PoliticalAction online, there is a need to create these back channels for those who are emotionally unstable.
Extrapolating, and I'm sure it's trivial to find datapoints to corroborate this, I think all communities need some sort of escape valve for these types of people, as most communities do not have the capability of dealing with them in a more proactive, engaged way in the time required to do so. Online we have the examples of Badvogato vs. AdvoGato, Adequacy vs. KuroShin, GreenCheese vs. WhyClublet vs. WikiWiki. In real life, we have politics! (seriously)
I find it interesting that trolls and flamers naturally separate themselves from the rest of the population. Perhaps after cutting their teeth on the softer, trusting variety that is the rest of us, they can only find the necessary challenge in those more skilled in their particular art of tongued swordplay. -- SunirShah
Trolls occasionally pose as complete newbies by continually asking "Really Stupid Questions." Since there is no way to determine whether such a person is a troll or a newbie, we AssumeGoodFaith and answer the questions. This, however, affords the troll the ability to continually waste people's time, something the troll might enjoy. The solution is simply the wiki solution of only answering each question once. The stupid questions the troll asks are just CommunityLore you never bothered to write down and organize. So, maybe it's not a disservice after all.
You don't decide they are a troll. The first time they ask a stupid question, you point them to the FAQ. The next time they ask a stupid question, you ask if they have read the FAQ. The third time, they have demonstrated that they are irresponsible and therefore they aren't worth your time. Even if they are newbies, they are still responsible for themselves. If they cannot take the time to learn, they certainly should not take your time to learn. Everyone is expected to be an adult. -- SunirShah
(see WhatIsaTroll <-- that's not a very handy name to link to, does Sunir feel like exercising admin powers to rename?)
Just pretend Sunir moved to Nepal this morning. Now, how would you rename the page? Besides, Sunir doesn't have administrator powers.
Well he might feasibly be in Nepal -- he's popping up all over the place ;-) We can copy to a new page & leave a redirect. Remains to see if others agree on a name-change and what a new name should be!
Technically, you capitalize each word, so it should be WhatIsATroll?, right?
Right, but let's just remake the page into a pattern called DisruptiveRole?; this is more than just a DisruptivePersonality?, but someone who actually takes on the (self-)responsibility to disrupt the community. Any objections?
I thought vaguely of TroubleMaker?, but perhaps that's one form of DisruptiveRole?. I do like "role" over "personality" because it's about the actions, not the motives. I also want ClassicTroll?, for trolling prior to the mass market.
The angry troll has lost the ability to give the PrincipleOfFirstTrust and AssumeGoodFaith, and so automatically enters a ConflictCycle instead of trying to find a collaborative solution to the problems he or she perceives. The question is: why the lack of faith in fellow humankind? That could get psychoanalytical, although the wiki way should demonstrate the general upright morality of human nature given a positive, NonViolent environment. Prior cases have shown that maintaining your integrity in the face of intense criticism can often change the person's mind, although this requires a significant investment of time and emotion.
Relax. There are no trolls. I've been living in Norway for a year, I know, there are no trolls, not even there. So what is the use in listing them? These "bad people". These "enemies". Maybe better things could be done, instead of negative lists? And can such a listing in any way protect you from them? Just be nice to them, treat them as humans, with respect. You'll see: There are no trolls. Relax. -- MattisManzel
Haven't we? -- MattisManzel
May I translate Mattis into plain English: trolls use the same free space as "anarchical social activists" (like Mattis). The threat of closing this free headroom through cultural regulation creates understandable panic: the vision of an open internet might die. The paradox is that neither "cooperating with the conservatives" nor "allying with the trolls" will help. The only alternative would be "to muffle the trolls" to slow the communities' allergic reactions, but to control the trolls (from a friendly position) seems hardly doable in the long run. So this remains: "pleading for mercy".
What we must do is understand and accept the needs of social idealism and find a way to integrate it into the developing online community culture. But the necessary dialogue is hard to start. Perhaps we would need some kind of RepresentativeRole? holding special freedoms, so certain groups or minorities could be sure to be heard. Perhaps at some point there will even be communities of representatives only. -- HelmutLeitner
I am still a little confused by the concept of "troll" (and I've been on the internet since 1995!) In particular, I don't believe I have ever encountered a troll in real life. I have certainly seen people who create trouble within groups because they have a hidden (e.g. political) agenda, and use various "tricks" to sustain their trouble-making. I have probably even seen some people do things like, yes, infiltrate a feminist group pretending to be feminist but in the end just "trolling" — however, they were motivated by a political opposition to feminism. They chose the group carefully, and wanted to disrupt.
But on the internet, people will "troll" ridiculous things: e.g., they will troll a discussion group about brands of wall paint. Why? Is it some internal hatred of success? (The troll sees a well-functioning group, it makes him mad, he wants to destroy it — sort of like the discussion equivalent of a vandal?) I guess what I'm saying is that I've never seen a clear-cut real-life example of a "troll". Is it because trolls are incredibly rare? Because I am surrounded by effective institutions that exclude trolls? Or am I missing the trolls in my midst?
I see what you're saying. In this sense, a troll is deliberately trying to induce people to break various rules they've (presumably informally) come to agree on for how to behave — if possible, with minimal effort, and again if possible, to get the breaking of one rule to "propagate out" to the breaking of others. An "obvious questions" troll who keeps asking things that he should have read in the FAQ is trying to induce people to be rude to newbies — and perhaps then to get people to be less civil to each other when they criticize each other's response, etc. etc.. It's kind of strange that the answer is that subtle, actually. The "hidden" nature of the troll helps in this, and indeed, the fear of trolls can itself function equivalently to a troll. Thanks!