PeerReview


Because collaborations are social works, social problems like dealing with malicious persons or even just errors and inconsistencies must be dealt with socially. At least ultimately. Technology certainly can help (e.g. AuditTrail), but it isn't a panacea.

The best way to ensure that the information is correct is through PeerReview à la academic journals. This ensures that actual peers, people with the appropriate expertise, do the reviewing. There are different forms of academic peer review, however: single blind (authors don't know the reviewers), double blind (reviewers and authors don't know each other), and occasionally triple blind (editors, reviewers and authors don't know each other). Since it is hard to be truly anonymous in a small academic niche (think writing style), the blinding doesn't always work: power structures remain in place. Transparent academic peer review is practiced by some OpenAccess? sites such as http://ArXiv.org.

A page that is universally editable (such as on wikis), while universally destroyable, is universally fixable. Hence, wikis practice extreme peer review. And since wikis have a public AuditTrail (PageHistory, RecentChanges), it's extreme transparent peer review.

Other review systems involve annotations, like DiiGo and FireTrail (CritDotOrg, currently defunct), or threaded discussion, like SlashDot. However, these suffer from forcing readers to read both good and bad information before reaching a synthesis.

Ultimately, a collaboration works best because peers make up for each other's weaknesses and mistakes. It's not even necessary that all peers do review work; only the TeethToTailRatio must be maintained. Even with a few reviewers, the group as a whole is then very strong. This is exactly why the Wiki:ScientificMethod is so successful.

Strong yes. But in a good sense? Again, successful in what sense? --RichardDrake

Successful in the sense that Science is remarkably free of totally bogus ideas. Some may linger, but they will likely be eradicated in time. Strong in the CollectiveIntelligence sense.

See also Wiki:PeerReview, and ReversibleChange for some (reversible) means of PeerReview. For a case study in how you can end up with an infinite recursion of PeerReview if you're not careful, see MetaModeration.

PeerReview consists of two parts, the PeerPart and the ReviewPart.

See also AuditTrail, EnforceResponsibility, AcademicPeerReview, and the GuildModel of peer review.

CategorySoftSecurity


I believe this focuses almost exclusively on the ReviewPart, which is the actual content of the review. The other significant part of peer review is selecting (or assessing) the reviewers, or the PeerPart. This is, in my opinion, given short shrift here, and is a large part of what I've been trying to address with systems such as the ScoopEngine. -- KarstenSelf 8 April 2001


One technical option to ensure the community is being diligent with PeerReview is to check that each edited page is matched with at least one non-author view. This could be done through the access_log alone. However, since the access_log only tracks IPs, it is easy to dupe (edit at work, view at home). This would violate AvoidIllusion, perhaps encouraging people to be lax in their review of others.
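
As a rough illustration only: the following Python sketch assumes a Common Log Format access_log, a "/wiki/PageName" URL scheme, and a hypothetical list of (page, editor IP) pairs taken from the wiki's own change records; none of these names or formats come from any actual wiki engine, and, as noted above, matching on IP alone is trivial to dupe.

 import re
 from collections import defaultdict

 # Hypothetical edit records (page, editor IP); a real check would read
 # these from the wiki's own PageHistory or RecentChanges data.
 edits = [
     ("PeerReview", "192.0.2.10"),
     ("AuditTrail", "198.51.100.7"),
 ]

 # Minimal Common Log Format matcher: client IP and requested wiki page.
 LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET /wiki/(\S+) HTTP')

 def views_by_page(access_log_path):
     """Collect the set of client IPs that viewed each page."""
     views = defaultdict(set)
     with open(access_log_path) as f:
         for line in f:
             m = LOG_LINE.match(line)
             if m:
                 ip, page = m.groups()
                 views[page].add(ip)
     return views

 def unreviewed_pages(edits, views):
     """Pages whose only recorded viewer is the editor's own IP."""
     return [page for page, editor_ip in edits
             if not (views.get(page, set()) - {editor_ip})]

 # Example: print(unreviewed_pages(edits, views_by_page("access_log")))

Even if the log parsing were made robust, the check only establishes that some other IP loaded the page, not that anyone actually reviewed it, which is the AvoidIllusion worry above.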


I'm not a member, but apparently on TheWell they have a system where a host can hide a comment, though the author can unhide it if they choose.


See also DelayAction, AcademicPeerReview.

[CategoryOnlineCommunity] [CategorySoftSecurity]


Discussion

LionKimbro -- Fri Oct 15 21:05:03 2010

I just saw this article, [Lies, Damned Lies, and Medical Science], on the failings of Peer Review. It cites Nature:

Nature, the grande dame of science journals, stated in a 2006 editorial, “Scientists understand that peer review per se provides only a minimal assurance of quality, and that the public conception of peer review as a stamp of authentication is far from the truth.”

The lesson of the article (that I take away, at least) is that it's not enough to have peer review. You have to actually be making a concerted effort to find out: "Is this thing true? Is this thing not true?"

It reminds me of the ease of saying, "Oh, the wiki will just clean everything up," that comes up so often in situations involving reworking. No, the wiki won't just clean everything up -- if a clean wiki is what is wanted, then we have to actually arrange for that. Similarly, I suspect that in Science, we can't go, "Oh, Peer Review will just sort it out." No, peer review will not just sort it out. You need to actually be very explicit about: We are searching for the truth of things, and we are organizing our search. What I speculate is that we may actually need a scientific program, with actual buy-in "this is important," in order to find out these kinds of truths.

