PhonyFlood


In spaces without some method of accountability, content has limited credibility. For instance, in a completely anonymous system, there is no accountability. All interactions are "one-time identities," so there is nothing but a whiff of a shadow to hold responsible. Alternatively, in a weak pseudonymous space, there is no way of guaranteeing the pseudonym isn't being co-opted by different people at different times. In a strong pseudonymous space, there is no way of holding the underlying person accountable in the RealWorld. Moreover, the Internet loses ParaLanguage and BodyLanguage, so it's easier to lie. (See ImportanceOfIdentityInOnlineCommunities for more on this.)

Those who understand this will attempt to corroborate important information on the Internet by cross-checking it against other information. Usually, they will cross-check it against other information on the Internet first, because that's easiest. However, as noted above, information on the Internet is not validated. Worse, information on the Internet moves rapidly, à la memes (see VirianLexicon), because it is easy to copy, so duplicated information doesn't necessarily mean independently derived information. Finally, the Internet is not an edited forum; anyone can publish and republish, which means it's possible to create an idea at the "grassroots" level without having it marshaled through fact-checkers. All one needs to do is appear convincing. If one is really good at this, one may induce mass hallucination, a self-reinforcing fiction along similar lines to FollowYourNeighbour?.

Therefore, to thoroughly mislead the public, one need only create a seemingly trustable yet false account. Flood the space with phony facts, disinformation that taken together seems to reinforce whatever idea one wishes to create. Furthermore, because in these spaces there is no external traceability, it is also trivial to assassinate someone's character by stuffing words into their mouth.

Note that while it is trivial to disconnect nodes physically flooding the network in a short period of time (those perpetrating a DenialOfService), a PhonyFlood is semantic, and it can be built over time. If a hostile entity (say, a repressive government) wishes to discredit an individual, the network may decide to disconnect both the hostile entity and the individual frantically proclaiming her innocence, treating them both as confused nodes. This would be a win for the hostile entity. At worst, the disinformation will be permitted to remain on the network, thoroughly confusing the audience and also scoring a win for the hostile entity.

Also, this scenario doesn't require a complex architecture like FreeNet. It is possible on the Internet as it is today, as was blatantly demonstrated by the crash of TWA Flight 800. LawrenceLessig recounts this well in CodeAndOtherLawsOfCyberspace, p. 171:

On July 17, 1996, TWA Flight 800 fell from the sky ten miles off the southern coast of Center Moriches, New York. Two hundred and thirty people were killed. Immediately after the accident the United States launched the largest investigation of an airplane crash in the history of the National Transportation Safety Board (NTSB), spending $27 million to discover the cause of the crash, which eventually was determined to have been a mechanical failure. [1]

This was not, however, the view of the Internet. From the beginning stories circulated about missiles--people said they saw a streaking light shoot toward the plane just before it went down. There were also stories about missile tests conducted by the Navy seventy miles from the crash site. And then there were reports of a cover-up by the U.S. government to hide its involvement in one of the worst civil air disasters in American history.

The government denied these reports, yet the more the government denied them, the more contrary "evidence" appeared on the Net. [2] [3] [4] There were repeated reports of sightings of missiles by witnesses on the ground. These reports, writers on the Net claimed, were being "suppressed" by the government. The witnesses were being silenced. And then, as a final straw in the story, there was a report, purportedly by a government insider, claiming that indeed there was a conspiracy--because evidence suggested that a friendly fire had shot down TWA 800.

A former press secretary to President John F. Kennedy believed it. In a speech in France, Pierre Salinger announced that his government was hiding the facts of the case, and that he had proof.

Of course, he had no proof. He was duped by misinformation (perhaps disinformation?) on the Internet.

A similar instance occurred in the days following the destruction of the World Trade Center on September 11, 2001. Many Arab news organizations began reporting a story, based on a dubious e-mail citing [Information Times], that the Israeli government had warned 4000 New York Jews not to show up to work that morning. As more and more news organizations reported this, Russia's Pravda did as well--although, to their credit, they pulled the article hours after they posted it. [5]

Ultimately, there is no way to embed semantic notions of trust in a system that actively breaks social structures. Any attempt to do so will necessarily expose the network's members to some degree. And without traceability to the RealWorld, the exposure will not be sufficient to defend against a dedicated PhonyFlood.

But of course this is doable in the RealWorld as well. There are countless occurrences of urban myths in high school textbooks, for instance. And much journalism is just dressed-up gossip.

All you need is a medium and anonymous or near anonymous sources. Sources are effectively anonymous when nobody bothers to follow them up. The phony flood effectively prevents people from making an informed decision. Take UFO stories as an example. There are magazines dedicated to the topic and sources remain nameless. People tire of trying to verify the claims and thus the topic as a whole is discredited. Confronted with the lack of trusted sources, people usually just DefendAgainstParanoia -- effectively shutting out information that doesn't fit the current world model.

Nonetheless, the point remains that it is impossible to secure a network against semantic attacks and that anonymity makes this easier.

Then again, even with accountability to the RealWorld, trust in the public should only go so far. Systems like the WebOfTrust require trust in the masses at large, but this trust is often thwarted. On AdvoGato, people reflexively rated a new user identifying him or herself as "esr" as a Master without verifying it was in fact EricRaymond. Spyware programs on the Internet proliferate because the guy down the hall happens to use Gator or Hotbar. Stocks get pumped not consciously but merely on "momentum", like Nortel. It doesn't require a conscious effort to flood the network with falsehoods, as we've learnt that many things in the world do not require conscious effort, just emergent behaviour.

It seems to me that WebOfTrust doesn't always require trust in the masses at large. It does when there is a single special "point" in the trust network, but not as much when everything is relative to every user. For example, in AdvoGato, "AdvoGato" itself may as well be a node in the network, and the user levels might reflect how much "AdvoGato" trusts each user (transitively through endorsements of other users). If, instead, each user only saw posts trusted by them (and transitively trusted by users whom they trust), then each user could keep a tighter lid on things if they chose (or they could choose to emulate the present system by trusting the AdvoGato node very fully).

It's not much of a web of trust if you only trust your immediate circle of friends. To put your faith into the web, you have to trust that the median person's idea of what is trustable aligns with your own. But there are reasons why our circle of friends is a circle: people are incongruous, which is normally a good thing. Moreover, trust isn't transitive, so friendship isn't transitive. Also, if your interest requires specialized technical knowledge, such as understanding what spyware is, it's not worthwhile to poll the public, who normally lean towards surface issues, as those are the only ones they can relate to.

This can be seen as an extreme case of the circle of friends, one in which your friendship is infinitely transitive. What I am saying is that WebOfTrust includes useful intermediate cases, such as where you trust your friend's friend's friends, but no more. You can have a web that is larger than the circle of people whom you actually know, but smaller than the AdvoGato style of web.

One interesting question here (and maybe I should move this to WebOfTrust) is: do the SmallWorldNetwork? properties mean that there is some small number of max transitivity hops after which you may as well just include everyone? And, how much does this number vary with the other parameters of transitivity?

Yes, probably after five or six steps along acquaintance transitivity, one would reach global coverage. If you only follow "trusted" paths, this would probably instead soon fragment into many smaller sets of transitively trusting people.
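
To make the preceding discussion concrete, here is a minimal sketch of depth-bounded transitive trust, written in plain Python over a made-up toy graph (the function and user names are illustrative assumptions, not part of AdvoGato's actual trust metric). A user's web of trust is whoever they can reach in at most a given number of endorsement hops; a small hop limit keeps the web close to your actual circle of friends, while a large one approaches the everyone-included case raised above.

 from collections import deque

 # A hypothetical toy model of a WebOfTrust, not any real system's trust metric.
 # trust_graph maps each user to the set of users they directly endorse.
 def trusted_set(trust_graph, start, max_hops):
     """Return the users reachable from start within max_hops endorsement hops."""
     seen = {start}
     frontier = deque([(start, 0)])
     while frontier:
         user, hops = frontier.popleft()
         if hops == max_hops:
             continue  # do not follow endorsements past the hop limit
         for endorsed in trust_graph.get(user, ()):
             if endorsed not in seen:
                 seen.add(endorsed)
                 frontier.append((endorsed, hops + 1))
     return seen - {start}

 # Toy endorsement graph: Alice endorses Bob and Carol, Bob endorses Dave, and so on.
 graph = {
     "alice": {"bob", "carol"},
     "bob": {"dave"},
     "carol": {"eve"},
     "dave": {"mallory"},
 }

 print(trusted_set(graph, "alice", 1))  # immediate circle only: bob, carol
 print(trusted_set(graph, "alice", 2))  # friends of friends: adds dave and eve
 print(trusted_set(graph, "alice", 9))  # effectively unbounded: everyone reachable

Because each extra hop multiplies coverage by roughly the number of endorsements per user, even a modest branching factor makes five or six hops plausibly enough to reach nearly everyone, which is the SmallWorldNetwork? intuition behind the question above.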


The Case of Gorgeous Guy

After a photo of him, taken by a would-be suitor at a bus stop and captioned "Gorgeous Guy," appeared in the CraigsList "Missed Connections" forum, Don Baca engineered his own stardom by faking a crowd of interest. He eventually started getting calls from CNN and The Tonight Show.

http://www.museumofhoaxes.com/gorgeousguy.html

From the website...

No one knows who posted the original picture of Baca, but Cassel discovered that the majority of the initial follow-up messages that drew attention to the picture had been posted by Baca himself. He had created an array of online personalities to convey the sense that a crowd of people were talking about him. This strategy eventually succeeded in attracting the attention of a real crowd. No one ever sought him out at the bus stop.

Baca demonstrated how easy it was to vault yourself into the upper strata of public attention without depending on the traditional stuffed shirts of big media to take you there.

CategoryCase


UrbanLegend?s are one instance, and the phenomenon certainly is independent of the Internet. ConspiracyTheory? is another example, as are PropagandaLie?s disseminated during wartime. There are others as well; more later.

There seem to be several elements required for a successful PhonyFlood:

  1. Iteration. The story is repeated and reinserted many times. It is a frequently repeated lie.
  2. Appeal. There has to be a fundamental appeal to the story being told -- a desire to think it is true.
  3. Plausibility. Though not required, it helps for such stories to be at least on the border of plausible reality. If not itself believable, the contrary evidence should be weak or diffuse.
  4. Vague origins. The origins of the story should be somewhat hazy. "Friend of a friend" or other undefined but plausible source.
  5. Hazy facts. Either the facts are few, disputed, or hazy, or their negating arguments are in a similar position. It's hard (though not impossible) to espouse a viewpoint in the face of overwhelming evidence. It's easier to pick a situation in which there's some ambiguity to exploit.
  6. Emotional investment. Many of these issues are a counterpoint of disputed facts and a strong emotional factor, whether it's paranoia, prestige, patriotism, religion, or similar.

There's probably one ultimate exemplar of this phenomenon, a story which simply transcends proof and relies instead on faith and repetition for belief: religion. I don't mean this as an assessment of religion as true or false -- in fact, as I said, it transcends this. Religion is ultimately something taken on faith -- you believe or...you don't.

What's the ultimate answer? "Truth is a battle." No rest for the wicked -- you've got to get your story out early and often. -- KarstenSelf


Email hoax viruses (like Good Times) survive by virtue of the same weaknesses that a PhonyFlood would exploit - right?
The longevity of hoaxes, lies, and conspiracy theories is a symptom of a culture based on lies. From advertising to politics to religion, the lie is the foundation of civilization. Did the US government shoot down TWA 800 by mistake? I don't know. If they did, would Bill Clinton lie about it? Of course he would. He would deny it as long as he could, and if pressed into a corner, he would finally admit that it happened, but claim that, technically speaking, he wasn't lying when he said it didn't happen.


[The Case of Alexandra Polier], the woman at the centre of a false rumour of an affair with John Kerry, Democratic candidate for the 2004 Presidency, speaks to how a PhonyFlood can be emergent. Here, an explosion of misinformation and disinformation about the alleged (and untrue) affair, spawned by a WebLog and amplified by the DrudgeReport?, led to the destruction of Polier's reputation.

As I began to trace the rumor, I learned that the vaguer it was, the easier it was to spread. Without a specific intern’s name attached, the story was initially impossible to disprove, something Rick Davis, the manager of Senator John McCain?’s 2000 campaign, remembers well from his time fielding rumors that McCain? had fathered a black child out of wedlock. In fact, McCain? had adopted a Bangladeshi baby. In an episode that presaged the Kerry story, a professor at Bob Jones University had sent out an e-mail to thousands of people claiming McCain? had “chosen to sire children without marriage.”

“When the media asked what evidence the professor had,” says Davis, “he said McCain? had to prove that he didn’t. Wow! How do you deal with that?”

Politics was like a scary game of telephone. During the last election, people had discussed rumors that Bush had taken cocaine, a not entirely illogical jump from his wild days with alcohol. This time, Kerry’s dating record between marriages might have led people to assume he’d be up for an affair. (p. 4)

Missing from her analysis was the sheer hunger the public has for such affairs: a damning rumour so salacious that people want it to be true, and so treat it as true, even though it isn't.

CategoryCase


American politics is a constant PhonyFlood; there is so much bad stuff flying around that no one believes anything. So when Michael Moore releases a documentary slamming Bush, the conservative half of the country are already inoculated against what he says, so they ignore it. When you cannot believe anything anyone says, how do you make an informed decision? The goal is not to release an infinite number of bad things, but to focus on high quality believable ideas that can achieve traction. -- SunirShah


