The SocialContract, then, is a bunch of overlapping, universal contracts between each agent in society that say: I won't do bad things to you if you don't do bad things to me. This is essentially the GoldenRule cooked into the spaghetti relationships that constitute society.
Nonetheless, one major failing of Hobbes' theory is that humans really aren't self-interested good-for-nothings; it is a worst-case theory. The second major failure of social contract theory is that attacking someone is costly, since you're likely to meet a defense (and thus take on risk), so it's unlikely people would attack anyway--especially in a libertarian, guns 'r' us world.
One major bogus response is the common notion that "it's not illegal if you don't get caught," so people will steal anyway. But even if you aren't caught, the effects are still felt: if everyone had this reactive sense of morality, it would become more commonplace for uncaught thieves to pickpocket from you too. This is similar to FixBrokenWindows.
SoftSecurity has a large basis in SocialContract theory.
I somehow dislike this concept of SocialContract as a model. If we see the human being as a product of evolution, then there must be a transition from non-social (physical, deterministic) to social behaviour. A falling tree is not murderous. An animal taking some food is not stealing, because it has no concept of property and perhaps no freedom of choice. People never physically made such a contract, and they can't remember making one, so why should they feel bound to it?
I prefer to think in a different way. Anything (a physical object, a deterministic animal, a free human) looks for advantage. Things move, according to statistical mechanics, towards the lowest potential energy. Animals act according to their genetic program to maximize their life chances. Humans do too, but they can also reflect, develop alternatives, and choose. In that case the human has to make a trade-off between advantages at different times (now, tomorrow, maybe in a few years). Social behaviour seems to start when you include the future. That's the difference between the physical world and the social world: there is no way a physical thing can react to something in the future, but humans can build models of the future and optimize their behaviour with regard to advantages in it.
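The trade-off between advantages at different times can be sketched as a toy discounting calculation. This is only an illustration, not part of the original argument; the payoff numbers and the discount factor (how much an agent values the future) are made-up assumptions:

```python
# Toy sketch: an agent choosing between a payoff now and steady payoffs later.
# All numbers, and the discount factor, are hypothetical.

def present_value(payoffs, discount=0.9):
    """Sum of payoffs over time, each discounted by how far away it is."""
    return sum(p * discount**t for t, p in enumerate(payoffs))

# "Grab now": take 10 today, nothing afterwards (e.g. exploit and be excluded).
grab_now = present_value([10, 0, 0, 0])       # 10.0
# "Act socially": take 4 each period (e.g. share and keep cooperating).
cooperate = present_value([4, 4, 4, 4])       # 4 + 3.6 + 3.24 + 2.916 = 13.756

# An agent that ignores the future entirely (discount=0) prefers grabbing:
print(present_value([10, 0, 0, 0], discount=0.0))  # 10.0
print(present_value([4, 4, 4, 4], discount=0.0))   # 4.0
# An agent that includes the future prefers the social option:
print(grab_now, cooperate)
```

The point matches the text: the moment the model includes the future, the optimum shifts away from the one-shot grab.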
The vision of a common future, of synergy and BarnRaising, needs neither a contract nor a change in our logic. We are still looking for advantage. We have the maximum advantage if we act together and share the gains, and it's easiest if we agree on fairness and transparency. The social logic of community seems to be a logic of CommonAdvantageOptimization?. This means the community has to build environments (a culture) where individual advantage optimization at the cost of damaging others (violence) doesn't pay.
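One way to make "violence doesn't pay" concrete is a toy iterated game. The sketch below uses a standard prisoner's-dilemma payoff table (an assumption for illustration, not from this page): in a community of reciprocators who mirror your last move, exploiting others pays once, while steady cooperation pays more over time:

```python
# Toy iterated prisoner's dilemma. A community of reciprocators ("tit for tat")
# is the environment that makes pure exploitation unprofitable.
# The payoff values are the standard textbook ones, assumed for illustration.

PAYOFF = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3,  # both cooperate (BarnRaising): both gain
    ("C", "D"): 0,  # I'm exploited
    ("D", "C"): 5,  # I exploit: violence pays once
    ("D", "D"): 1,  # mutual conflict
}

def score(my_strategy, rounds=10):
    """Play against a reciprocator that starts nice, then mirrors my last move."""
    total, their_move = 0, "C"
    for r in range(rounds):
        mine = my_strategy(r)
        total += PAYOFF[(mine, their_move)]
        their_move = mine  # tit for tat: they copy what I just did
    return total

print(score(lambda r: "D"))  # always defect: 5 + 9*1 = 14
print(score(lambda r: "C"))  # always cooperate: 10*3 = 30
```

In an environment of reciprocators, the exploiter wins the first round and loses every round after, which is exactly the kind of culture the text says a community has to build.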
Humans need a society-enforced SocialContract because our laws of PrivateProperty? are not dynamic enough to solve the tricky problem of DirectDemocracy? governance over joint PhysicalSources?.