ProductiveControversy


Abstract

Group wikis can be said to be an exercise in managing controversy to gain productive outcomes. This talk will demonstrate that controversy is not something to be feared, but rather harnessed. Controversy, as it turns out, generates the highest quality outcomes possible.

This contrasts with the usual view that controversy is destabilizing and should be avoided. Traditional approaches, such as Britannica's, fear instability and so recommend installing a single "authoritative" editor to quell controversy. Yet research presented elsewhere at Wikimania demonstrates clearly that the articles with the highest number of authors have the highest quality. Rather than relying on the genius author, it is better to learn how to manage controversy so that it is productive rather than destabilizing.

This talk will describe how to manage controversy constructively using a wiki, focusing on well-known approaches in a variety of settings.

Copyright

Creative Commons Attribution-ShareAlike ([cc-by-sa])

Link

Original publication:

http://en.wikibooks.org/wiki/Wikimania05/Presentation-SS1

This article will also live at

http://www.usemod.com/cgi-bin/mb.pl?ProductiveControversy

Presentation:

http://sunir.org/meatball/ProductiveControversy/Wikimania05.ppt

Overview

Brändle (2005) demonstrates clearly in [Wikimania05/Paper-AB1] that the best Wikipedia articles are the ones touched by the most authors. This outcome is not unique; group decisions in general benefit from the widest range of contributors. However, we all know that the more cooks in the kitchen, the greater the potential for food fights.

Controversy, particularly negative controversy, frequently destabilizes group processes, including wikis. Brändle's research indicates that controversy is slightly negatively correlated with quality. Worse, negative controversy scares away would-be contributors--a [concern frequently raised by Larry Sanger] (2005).

"We're deciding what people are going to think." -- Wendy Doniger, Board Member of Encyclopedia Britannica (as quoted in [Ferkenhoff, 2005])

The belief that groups inevitably destabilize without strong controls leads many to criticize wikis, especially Wikipedia, which has the largest and most diverse group of all. The often-recommended solution is a strong set of editors who know best, or in one Britannica board member's words, "some of the smartest people on earth." These smart people eliminate controversy by having the first and final say.

This solution of the isolated expert flies in the face of many Internet projects that have shown that the more people involved, the better. James Surowiecki's (2004) popular press book The Wisdom of Crowds demonstrates that this phenomenon is wide-ranging in group behaviour. He argues that the larger and more diverse the crowd, the better. Diversity and size are not enough on their own, though; also critical is a way to aggregate and combine everyone's input into a common outcome. Often the outcome is better than anything an individual could put forward alone, demonstrating that the group is often greater than the sum of its parts.

There is a productive type of controversy

Brändle (2005) measured controversy simply as the intensity of the discussion, and it had no impact on the final quality of the article. However, in more common terms a controversy is a dispute or debate. Latour's (1987) seminal social description of science, Science in Action, outlines controversy as the process by which new facts sink or swim. In Science, diverse opinions in a controversy attacking a point can actually strengthen it in the end; the more tests, trials, and attacks a point can defend against, the harder a fact it becomes. The harder the fact, the more people adopt it as truth, and it quickly becomes commonly held knowledge.

This suggests that controversy comes in two types: a negative controversy that destabilizes a group, and a positive controversy that results in a group outcome. Put another way, controversies are either divergent or convergent.

What's the difference? Simply put, a convergent controversy is focused on ideas rather than feelings, in what is called a [healthy conflict] (Eisenhardt, Kahwajy, & Bourgeois, 1997).

In a divergent controversy, people put their egos on the line. Once they put forward their position, in order to save face, they have to harden their position against attack. Further, it is in their interest to attack other positions, thereby adding more and more tests, trials, obstacles, and concerns to overcome. Rather than attack ideas, people attack each other in order to disrupt the introduction of new ideas. Sometimes they attempt to disrupt the medium of communication itself, such as with edit wars on wikis.

Convergent controversy introduces facts and considers only facts. Facts reduce the gulf separating positions: some positions suddenly become untenable in the face of the facts, making movement possible. As long as people are willing to give up their positions, this process works.

Of course, if they are not, they have to resort to alternative attacks such as personal attacks or physical attacks on the medium, and you have divergent controversy.

Convergent controversy works best in the most diverse environment possible, as the widest variety of facts and positions attempting to cover those facts can be introduced. The only thing necessary is a mechanism to fairly integrate all of those inputs into a final group outcome that everyone can adhere to.

Stability

Why resolve controversies at all? What do we gain out of a controversy? When a controversy ends, whether divergent or convergent, we reach a state of equilibrium, which means the action is over. There are, however, two types of equilibrium. A divergent controversy ends in an unstable equilibrium, like a ball resting on the summit of a mountain. Although the ball, and the discussion, is at rest, on a wiki it sits there festering like an open wound or a [landmine] waiting to be tripped over, rediscovered and reopened at any time. One little tap and the ball is set in motion again, accelerating towards the ground until it smashes.

In a stable equilibrium, on the other hand, the ball is at rest in a valley. You can disturb it as much as you want, but it will come back to rest in its valley. Rather than releasing energy when it moves, it takes a lot of energy to move it somewhere else, perhaps to a better, prettier, deeper valley. A convergent controversy leads towards a stable equilibrium. The final outcome is strong and resilient to disturbance because it is built from facts that are hard to contradict.
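
To put the physical metaphor in symbols (our gloss, not part of the original talk): if the ball sits in a potential landscape U(x), an equilibrium x* is a point where the slope vanishes, and the sign of the curvature decides whether it is a summit or a valley.

 U'(x^*) = 0, \qquad
 \begin{cases}
   U''(x^*) > 0 & \text{a valley: stable, small pushes die out} \\
   U''(x^*) < 0 & \text{a summit: unstable, the slightest push grows}
 \end{cases}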

So strong and resilient, it forms a new [stable base] upon which new work can be built. Latour (1987) describes this process as "ready-made science" supporting "science in the making," just as Newton so famously was standing on the shoulders of the giants that came before him. Whether you are writing an article that builds off another article by clicking on a red link, or you are building on top of a finalized contract with a client, things only really get done by one convergent controversy building on top of another.

Wiki and controversy

Wikis are controversy machines. An old adage on the [original wiki] was "Everything is wrong." Because conversations are never really dead on a wiki, and are always accessible, people can reopen a conversation--often a controversy--at any time. An individual with a fresh, new perspective--or even you yourself after a few years of mental gestation--can take exception to anything said before. One small comment, one small argument, can raise the interest and ire of the RecentChanges junkies and thus create a heated conversation. Very quickly, a docile page can become the centre of a hot debate. All too frequently, however, the wiki is left at the end with the burnt-out shell and bomb craters of a discussion that got too hot.

On a productive wiki, though, these conversations stabilize. They're exciting. They are what attracts people. They are in fact what builds the wiki, as they reflect a team of people working together. The controversy, rather than descending into personal attacks, ascends into teaching, learning, and building something together that seems like more than the sum of its individual parts.

The art of hosting a successful wiki is essentially the art of managing convergent controversy. Whether you run a small team wiki inside your company or Wikipedia itself, the essential problem is the same: how do you get people to contribute what they know and then integrate it with others' contributions peaceably?

If you compare wikis to the process of Science, they have important similarities. First, there is a community of practice in both: the RecentChanges junkies in one, the Academy in the other. Second, nothing is truly stable: pages are always available to be reanimated with a new perspective, just as an old theory can be knocked down with new evidence. Third, both get better the more people are involved.

The crucial difference is that we would not expect a heated flame war to erupt across the pages of the New England Journal of Medicine. (Although that happens.)

Many approaches have evolved to close this gap. They all benefit from, and in many ways could not exist without, the elegance of wiki's [soft security], a social, flexible, reactive, contingent method of securing a wiki. Soft security responds by bending to an attack in order to deflect it, making it resilient to a wide range of attacks; traditional "hard" security tries to stop the attack directly, and therefore when it breaks, it breaks completely. Since soft security is flexible, it is open to diverse, unpredictable communication, even while allowing room for reconciliation afterwards.

Energy

Chris Purcell has described [motivation, energy, and community] as the three pillars of soft security. Quoting the page from MeatballWiki:

"Wikis work if the energy of the community vastly outweighs the energy of malfeasants; this is a pillar of SoftSecurity. Traditionally, however, we have assumed malfeasant motivations are unstable, either inherently (e.g. teenage vandals who quickly get bored) or as a result of community action (e.g. peace-making, et cetera). The new "threat" to wikis . . .is simply one with a stable motivation."

The danger is a divergent controversy that 'stabilizes' into a flame war. Flame wars happen when people put too much negative energy into a controversy. As the name implies, a flame war is a contest for unilateral victory rather than mutual gain. This means the negative energy is part of a power game rather than a valid complaint; in effect, it becomes an arms race. The negative energy (the attacks) forces the other participants to respond with equal or greater energy in order to maintain or advance their positions. A scorched battlefield may be the result, making future communication impossible as the [personal relationship] has been severed and cauterized.

When individuals become enraged and estranged, they might even decide to attack the community, such as with an [edit war]. These people have decided that productive conversation is futile. They believe that the only way to make a difference is through pure action. That doesn't have to be true; you could be very accommodating and still be attacked. The difficulty is that you have no way of convincing the attackers that you are going to listen to them, because they aren't listening to you.

This means the only way to communicate is with a counter-action or reaction. You revert them, for instance. But they revert you too. In an egalitarian society, everyone can hurt everyone else the same amount. The final winner is the one willing to put the most energy into the battle. That means a controversy that initially destabilized only one part of the community will grow and grow until it destabilizes the whole community.
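
As a toy illustration of this arms race (a sketch of ours, not from the original talk, with purely made-up names and numbers), a revert war behaves like a war of attrition: both sides burn energy at the same rate, the "winner" is simply whoever was willing to spend more, and the whole community pays the cost.

 # Toy war-of-attrition sketch of an edit war; all names and numbers are illustrative.
 def revert_war(energy_a, energy_b):
     """Both editors spend one unit of energy per revert until one gives up."""
     burned = 0
     while energy_a > 0 and energy_b > 0:
         energy_a -= 1   # A reverts B
         energy_b -= 1   # B reverts A right back
         burned += 2     # the community absorbs both edits
     if energy_a > 0:
         winner = "A"
     elif energy_b > 0:
         winner = "B"
     else:
         winner = "nobody"
     return winner, burned

 print(revert_war(energy_a=10, energy_b=7))   # ('A', 14): A "wins", everyone pays

The point of the sketch is only that the outcome is decided by stamina, not by the merit of either position.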

The only solution is to "recan the worms". If you do this early, while people are still willing to talk, you can salvage the situation. If you do this late, you have to resort to traditional "hard" security and banish the attacker.

Wikis and stability

Wikis are not just a technology, but a way of people working with a technology. The essence of maintaining stability and productive, convergent controversy on a wiki is to ensure that speech is action enough. That means people must listen actively, incorporate what they have heard (fair process), and refrain from personal attacks. As long as everyone sticks to this process, it is unlikely anyone will suddenly cease to be civil. The difficulty is that some people will arrive who are not civil, and their incivility may be enough to push otherwise civil people into incivility. The real issue, fundamentally, is how to deal with these ["difficult people."]

Peer review

Many people, particularly academics, Encyclopedia Britannica (Ferkenhoff, 2005), and Larry Sanger (2005), have claimed that what Wikipedia really needs is peer review. Many hold similar views of wikis in general, forgetting that wikis of course already have peer review. We have been saying this for years, such as on [MeatBall:PeerReview]. RecentChanges is how peers review each other. As it says on [WikiDesignPrinciples], a wiki must be "Observable - Activity within the site can be watched and reviewed by any other visitor to the site."
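
As a minimal sketch of what "Observable" means in practice (our illustration; this is not MeatballWiki's or any real wiki engine's code), every edit lands on a shared RecentChanges log that any peer can watch and review:

 # Hypothetical, minimal "observable" wiki store; all names are illustrative.
 from dataclasses import dataclass, field
 from datetime import datetime, timezone
 from typing import Dict, List

 @dataclass
 class Edit:
     page: str
     author: str
     summary: str
     when: datetime

 @dataclass
 class TinyWiki:
     pages: Dict[str, str] = field(default_factory=dict)
     recent_changes: List[Edit] = field(default_factory=list)

     def edit(self, page, author, text, summary=""):
         """Save the new text and log the change where every peer can see it."""
         self.pages[page] = text
         self.recent_changes.append(Edit(page, author, summary, datetime.now(timezone.utc)))

 wiki = TinyWiki()
 wiki.edit("ProductiveControversy", "Sunir", "draft", summary="first cut")
 wiki.edit("ProductiveControversy", "Bayle", "draft plus comments", summary="questions")
 for e in wiki.recent_changes:   # peer review happens by watching this list
     print(e.when.isoformat(), e.page, e.author, e.summary)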

The accusation that wikis don't have 'peer review' really implies one of two complaints: either it's the wrong "peers" (i.e. not the accusers themselves, or some authority) or the wrong "review" process. This gives us two points to consider.

Social boundary

All social organizations have a social boundary that separates the in-group that characterizes and runs the organization from the out-group with whom the organization interacts. People can move between the in-group and the out-group at any time, for any reason, sometimes for no reason at all. The social boundary gives the in-group the possibility of controlling who can join, and thus of preventing unnecessary damage.

To use a simplistic analogy, the social boundary operates much like a cell membrane. Without a cell membrane, it would be impossible for the cell to self-organize, and consequently impossible for the cell to develop complex structures. The in-group, meaning the proteins, genetic material, organelles, and other material inside the cell membrane, describes what the cell does and ensures its survival. Materials outside the cell membrane interact in a controlled way through the cell membrane. The cell can emit things through the membrane for the use of external entities, like other cells. The cell can also absorb materials it needs to survive through the membrane, like nutrients and sugars. The cell membrane also rejects materials that are hazardous to the cell, like toxins.

Also much like cell membranes, the social boundary can come in many sizes and permeabilities, depending on the nature of the social organization. As mentioned above, Sanger (2005) prefers a tight social boundary to limit control to those with academic credentials. This stance is a very conservative one, borne of an honest and valid fear that quality will degrade if less well "known" participants are let into the process. Academia, as Latour (1987) describes, is itself very conservative, as resources are scarce. The risk of putting a relative neophyte in charge of a $100 million lab is immense, particularly since that may be the only such lab constructed for several years, and it has to be enough to satisfy the research agenda of the funders. With so few opportunities for so many academics, it's no wonder they spend a lot of time trying to knock each other out of the ring on paper before any of them gain access to critical resources. Indeed, they are so competitive that they devise formal methods for ascertaining quality (scientific validity), a process known as disciplinization. The social boundary is quite rigid: a credential proving discipline (the Ph.D.).

In better-funded environments, like business, the risk of letting in unvetted participants is lower. There are ample resources available to let many competitors enter a market, so people do not always need to rigorously verify that a candidate follows a proven methodology; often you simply fire them later if they fail. That being said, business is still risky when a lot of money is on the line. From the interview to the employment contract to legislation to the right to fire, employers maintain a relatively tight social boundary. In this way, businesses ensure that employees will both toe the line and contribute to the bottom line. Successful businesses strive to keep profitable members inside the company and to deflect or eject unprofitable individuals.

The problem with translating these social boundaries to most wikis boils down to a simple point: the risk of failure is so low that it is cheaper to be laissez-faire. If problems arise, they can be edited or reverted. Few people depend on a typical wiki to the extent that incorrect or harmful information would incur large costs.

Thus, while Wikipedia may not be rigorous and it may contain errors, the project is not very risky. The cost of an error is very low, as few people hang critical decisions on Wikipedia articles. If they did, it might be possible to raise funds to pay for the economically and socially expensive process of deciding the in-group from the out-group. However, this does not fit Wikipedia's essence. Wikipedia comes late to the party: it focuses on recording and organizing facts long after the controversies have settled, at a time when the potential capital rewards are minimal (Latour, 1987). It is the sum total of commodified information. Since Wikipedia deals in knowledge widely held in the population, it makes sense to let a wide range of the population contribute, so that more facts can be added. We can say that Wikipedia has a very wide social boundary.

In contrast, some wikis, like MeatballWiki, are not always working with facts created long after the controversies have settled. They may be involved in creating and deciding new ideas, and thus may be embroiled in new controversies. Unlike in academia, the risk of performing badly on these wikis is not very high: an argument will cut someone off not from a lucrative grant, but at most from an interesting conversation or a useful development project. A better analogy for places like Meatball might be a house party. People are openly invited to come, even those unknown to the host, but obnoxiousness may result in the host asking you to leave. You will also likely not want to come to the party if the people there do not interest you. We can say that places like MeatballWiki have a permeable social boundary, but not a very wide one.

Common, clear process

Wikis in action

Traditional model (C2).

People love this!

Why is this important?


CategoryConflict

I lost my list in an 'edit conflict'. Argh! --Sunir

References

Brändle, A. (2005). Too many cooks don't spoil the broth. In Proceedings of Wikimania, 2005. Available from [Wikimania05/Paper-AB1]

Eisenhardt, K., Kahwajy, J., and Bourgeois III, L.J. (1997). Managing conflict: How management teams can have a good fight. Harvard Business Review, July-August, 77-85.

Ferkenhoff, E. (2005). "Venerable encyclopedia seeks just the facts." Boston Globe, July 21, 2005. Available from http://www.boston.com/news/nation/articles/2005/07/21/venerable_encylopedia_seeks_just_the_facts/

Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.

Sanger, L. (2005). The early history of Nupedia and Wikipedia, Part II. Slashdot. Available from http://features.slashdot.org/article.pl?sid=05/04/19/1746205&tid=95

Not to mention his numerous other writings on the same topic, like his 2004 kuro5hin.org article, [Why Wikipedia must jettison its anti-elitism]. -- Sunir

Surowiecki, J. (2004). The wisdom of crowds. New York: Doubleday.


Discussion

I like this article. The following is meant as constructive criticism:

This isn't a definition, it is a description of properties that SoftSecurity has. It's a very fluffy, metaphorical description to boot. Someone who doesn't already know what SoftSecurity means won't gain anything from this description.

I don't think the second sentence follows from the first. I guess you might say that "any zero-sum game is a power game". But that's not what you're saying; you're saying that a flame war is part of a larger context, and that larger context is a power game --- that is, you're saying something about the motivations of the flame war. But I think it's possible for two groups of people to get into a flame war even if there is no "power game" going on; that is, I think it's possible for two groups, even if neither one desires "power", to get into a flame war. Even if you think otherwise (if you really do think that flame war implies power game), I don't think that's a priori obvious, so perhaps you could make this assertion on your part more explicit. Or, perhaps you don't want to go into this further in this paper, and so you want to leave it as it is. -- BayleShanks

Quick points:

I have to think about what else you said with more seriousness. There is a tautology in my description as well, where convergence creates facts, but facts are used for convergence. I think this tautology is resolved in Latour's work by noting that the facts aren't the same. Old facts are used to create new facts. But I think I have to be more careful if I were to make this an actual paper. -- SunirShah

