SocialContract

MeatballWiki | RecentChanges | Random Page | Indices | Categories

SocialContract theory is mostly defined in chapters 13 through 15 of Hobbes' Leviathan. The essential premise is that while we would all naturally be murderous thieves if left to our own devices, none of us enjoys this constant state of strife, so we band together. As a society, since we recognize that unjust deeds will face retribution from others, we refrain from unjust deeds. Moreover, we also recognize that if we were permitted to steal from our neighbours, our neighbours would similarly be permitted to steal from us. This would spiral us down into that state of universal strife.

The SocialContract, then, is a web of overlapping and universal contracts between each agent in society that say, "I won't do bad things to you if you won't do bad things to me." This is essentially the GoldenRule cooked into the spaghetti of relationships that constitute society.

Nonetheless, one major failing of Hobbes' theory is that humans really aren't self-interested good-for-nothings; it is really a worst-case theory. The second major failing is that it costs a lot to attack someone, since you're likely to meet a defense (and thus risk), so it's unlikely people would attack anyway--especially in a libertarian, guns 'r' us world.

One common but bogus response is the notion that "It's not illegal if you don't get caught," so people will steal anyway. However, even if you aren't caught, the effects are still felt. If everyone had this reactive sense of morality, then it would become more commonplace that random uncaught thieves would pickpocket you too. This is similar to FixBrokenWindows.

SoftSecurity has a large basis in SocialContract theory.


I somehow dislike this concept of SocialContract as a model. If we see the human being as a product of evolution, then there must be a transition from non-social (physical, deterministic) to social behaviour. A falling tree is not murderous. An animal taking some food is not stealing, because it has no concept of property and perhaps no freedom of choice. People never actually made such a contract, and they can't remember one, so why should they feel bound to it?

I prefer to think in a different way. Anything (physical object, deterministic animal, free human) is looking for advantage. Things move according to statistical mechanics towards lowest potential energy. Animals act according to their genetic program to maximize their life chances. Humans do too, but they can also reflect, develop alternatives, and choose. In this case the human has to make a trade-off between advantages at different times (now, tomorrow, maybe in a few years). Social behaviour seems to start when you include the future. That's the difference between the physical world and the social world. There is no way that a physical thing can react to something in the future. Humans can build models of the future and optimize their behaviour with regard to future advantages.

The vision of a common future, of synergy and BarnRaising doesn't need a contract nor does it change our logic. We are looking for advantages. We have maximum advantages if we act together and share advantages. It's easiest if we agree to fairness and transparency. The social logic of community seems to be a logic of CommonAdvantageOptimization?. This means that the community has to build environments (culture) where individual advantage optimization at the cost of damaging others (violence) doesn't pay.
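The logic of CommonAdvantageOptimization? can be illustrated with a toy payoff table in the style of the Prisoner's Dilemma. This is only an illustrative sketch, not part of the original discussion; the specific payoff numbers are assumptions chosen to show that mutual cooperation yields the largest total advantage, even though one-sided exploitation is individually tempting:

```python
# Illustrative sketch: Prisoner's-Dilemma-style payoffs (numbers are
# assumptions for illustration, not from the original text).
# Each entry maps (my_move, your_move) -> (my_payoff, your_payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # shared advantage (BarnRaising)
    ("cooperate", "defect"):    (0, 5),  # I am exploited
    ("defect",    "cooperate"): (5, 0),  # I exploit you (violence pays)
    ("defect",    "defect"):    (1, 1),  # universal strife
}

def total_welfare(my_move, your_move):
    """Sum of both players' payoffs for one interaction."""
    mine, yours = PAYOFFS[(my_move, your_move)]
    return mine + yours

# Acting together and sharing advantages maximizes the community total:
print(total_welfare("cooperate", "cooperate"))  # 6
print(total_welfare("defect", "defect"))        # 2
```

A community "building environments where violence doesn't pay" then amounts to adjusting the payoffs so that defecting against a cooperator is no longer the individually best move.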

-- HelmutLeitner


I've been wondering about how we could make a legally binding contract for social purposes similar to the way the GNU General Public License is a 'real' SocialContract because Copyright law gives it legal footing.

Humans need a society-enforced SocialContract because our laws of PrivateProperty? are not dynamic enough to solve the tricky problem of DirectDemocracy? governance over joint PhysicalSources?.

-- PatrickAnderson


CategorySociology

