There are some theoretical foundations in the book: GOMS analysis and some cognitive psychology on attention and the locus of attention, etc.
In analysing the book, it's important to separate the general ideas from the specifics, as the former are far more helpful.
People develop habits. Habits are good because they let you focus on other tasks. Thus asking for confirmation is bad: users will either develop a habit of confirming the command, or find the interface obnoxious to use. Instead, commands must be revertible. Furthermore, adaptive user interfaces are bad if the adaptation interferes with habits already formed.
Modes are bad because they cause errors, even when there is an indicator for the mode: people focus on the task, not on indicators. This also applies to hardware (power switches are bad) and to applications (restarting an application should put you back where you left off, not in another mode). Quasimodes are good — a mode that is active only while a key (such as Shift, Control, Alt, or Meta) is physically held down is fine, because the sustained touch provides continuous feedback that the mode is on.
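The distinction can be sketched in code. This is a minimal illustration of my own (the class and method names are invented, not from the book): a Caps Lock-style mode persists invisibly after the keypress, while a Shift-style quasimode exists only while the key is held, so the user's own muscle tension acts as the mode indicator.

```python
class ModalEditor:
    """Caps Lock style: a toggle the user must remember."""
    def __init__(self):
        self.caps_mode = False

    def press_caps(self):
        self.caps_mode = not self.caps_mode  # state persists invisibly

    def type_char(self, c):
        return c.upper() if self.caps_mode else c


class QuasimodalEditor:
    """Shift style: the mode exists only while the key is held down."""
    def __init__(self):
        self.shift_held = False

    def key_down(self, key):
        if key == "shift":
            self.shift_held = True

    def key_up(self, key):
        if key == "shift":
            self.shift_held = False

    def type_char(self, c):
        return c.upper() if self.shift_held else c
```

With the modal editor, typing "a" long after pressing Caps Lock still yields "A" — the error Raskin describes. With the quasimodal editor, releasing Shift instantly returns you to the base state.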
Monotony is good. If there is only one way of doing things, then people develop habits and feel secure and in control. If there are too many choices, it is difficult to teach, and difficult to use because you have to decide how you will do something.
Customization is bad because people tinker too much with the system. There is no proof that after customization, people are more productive. Furthermore, the software is harder to test, and harder to debug. And there is no reason to believe that users will be better usability engineers than the authors of the software. Plus a lot of choice makes the system non-monotonous (see above).
Icons are bad because as soon as you have more than a very small number of them, they need explaining again.
A Zooming Interface displays a summarized overview of the computer's contents, and allows the user to zoom in on specifics.
Filesystems are bad. Consider the hassle of having to learn about files, directories, filenames, etc., when all the user wants to do is type text and print it.
Modes are bad. Therefore LeapMode is good.
Customization is bad. Therefore allow no customization whatsoever.
Icons are bad. Therefore, stick to text.
Zooming Interface: the ZoomingInterfaceParadigm (ZIP) is intended to replace applications, the desktop, browsers, etc. All content is displayed on an infinite virtual plane; as you zoom in closer, documents become editable.
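The mechanics of such a view come down to simple coordinate math. This is a sketch of my own (the function names are not from the book): content sits at fixed positions on the infinite plane, and the view is just a pan offset plus a zoom factor. The interesting detail is zooming toward a point, e.g. the cursor, which must stay put on screen while everything scales around it.

```python
def world_to_screen(wx, wy, pan_x, pan_y, zoom):
    """Map a point on the infinite plane to screen coordinates."""
    return ((wx - pan_x) * zoom, (wy - pan_y) * zoom)


def zoom_at(pan_x, pan_y, zoom, sx, sy, factor):
    """Zoom by `factor` while keeping the screen point (sx, sy) fixed,
    as when zooming toward the cursor. Returns the new (pan_x, pan_y, zoom)."""
    new_zoom = zoom * factor
    # Find the world point currently under (sx, sy) ...
    wx = pan_x + sx / zoom
    wy = pan_y + sy / zoom
    # ... and choose the new pan so it stays under (sx, sy).
    return (wx - sx / new_zoom, wy - sy / new_zoom, new_zoom)
```

Repeatedly applying `zoom_at` at the cursor gives the familiar "dive toward what you're pointing at" behaviour of zooming interfaces.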
Filesystems are bad. Instead, just provide an interface where the user can type text. If not the zooming interface, then perhaps the old CanonCat interface — one huge text with document separation characters. If it is easy to select text and print it — i.e. it is easy to mark the text between two document markers — then files are not useful.
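The "one huge text" idea is almost trivially small in code. This is a sketch of how I read it (the separator character is my own choice, not necessarily the Canon Cat's actual one): the whole workspace is a single text stream, and a "document" is nothing more than the span between two markers, so "opening a file" reduces to selecting that span.

```python
SEP = "\x1c"  # ASCII file-separator control character, standing in for a document marker


def documents(workspace):
    """Split the single text stream into its documents."""
    return workspace.split(SEP)


def select_document(workspace, n):
    """'Open' document n by selecting the text between two markers --
    the operation that, on this model, replaces opening a file."""
    return documents(workspace)[n]
```

Printing a document is then just printing a selection; no filenames, directories, or save dialogs are involved.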
The hardware prototype of The Humane Interface was the CanonCat, while the software prototype is called Archy (derived from RCHI, the acronym of the institute) and formerly known as The Humane Environment. See HumaneEnvironment for details and commentary, or the official site at http://raskincenter.org.
Monotony is a standard technique in UI design, one that's been proven in, for instance, the Mac OS over the years. A UI has to map to the contours of the user, but people are people: many UI decisions can (and should) be made scientifically, based on the study of human-computer interactions. Customizing those things away is therefore provably detrimental. In other places, customizability can come with a hidden cost. How useful would it be in UNIX, for instance, to rename "cd" to something else? Sure, you can do it, but you break everything under the sun and make it impossible for people to help you.
On the other hand, in a well-designed operating system, many arbitrarily-chosen things can be customized with negligible testing or debugging overheads, things like backdrops, colour schemes and window sizes: imagine how horrible it would be to use a system where every window was of pre-chosen size and location on the screen because "customization is bad".
It's well-known that hierarchic filesystems suck. (Known to whom?) They do not accurately model how we think. (They don't model how you think. Don't presume to know how I think.) Classification schemes always end up eschewing hierarchies. (Wrong. Simply and entirely wrong. Look up how libraries are organized some time.)
However, the solution taken by Raskin is not an improvement. Consider how many more things we actually do with our computers. How much of it can be hammered into a text-only interface? How is a linear ordering any better than a hierarchic one? Files serve a far more useful purpose than do filesystems.
Fortunately, modern operating systems are moving away from the limitations of hierarchy, albeit in more measured steps. Apple's Spotlight technology allows files to store and expose metadata, allowing the user to rapidly locate a file by a simple search. Google Desktop allows us to search our online memories. Several OSes are trying to swap out the old filesystem metaphor for a more database-like system, with varying degrees of success. Media players have long hidden the filesystem behind a metadata-driven interface.
Currently, the filing system is actually working against you; you are just acclimatised to it. (No. Wrong. Do not presume to know me.) Ideally, a system should let you just type, or play a tune, or whatever, and get back to the file later via a simple spatial, search-based or metadata-based interface. Instead of picking a filename and trying to shoehorn the file's metadata into a hierarchy (do I put Project as the root directory? Date? Author?), one specifies the metadata explicitly.
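The contrast with a hierarchy can be made concrete with a toy metadata store (the field names and sample data here are purely illustrative): each item carries all of its metadata as equal attributes, and any combination of them can drive a query, so there is no need to decide up front whether Project, Date, or Author is the "root".

```python
store = [
    {"text": "draft chapter", "project": "book", "author": "jef", "year": 2005},
    {"text": "budget notes", "project": "office", "author": "ann", "year": 2005},
]


def find(**criteria):
    """Return every item whose metadata matches all the given criteria.
    No attribute is privileged the way a root directory is."""
    return [item for item in store
            if all(item.get(k) == v for k, v in criteria.items())]
```

Here `find(project="book")` and `find(author="ann")` are equally cheap queries; in a hierarchy, whichever attribute you did not pick as a directory level requires a full tree walk.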
Computers have, of course, had a zooming interface for decades: the filesystem. Each folder zooms in to show the files within it, which then zoom in to show their contents. If we ever move away from this hierarchic system, an important part to keep will doubtless be the zooming paradigm, in some alternative form.
However, simply placing every file onto an infinite plane willy-nilly is not a substitute for the organization of the filesystem. A user will not want to spend six hours rearranging files to get things looking neat. A new zooming UI must be a significant improvement on what we have now.
An anonymous user has edited the above paragraphs, changing their meaning in a way that I think is interesting and should be discussed. It's hard to discuss changes in the text itself, so I'm restoring the original version and putting the changed paragraphs here:
I think the first paragraph touches on a very important point: people do make mistakes, both from lack of skill and knowledge, and from simple slip-ups. Computer programs are full of them; they are often called "bugs". Programmers have devised ways of coping with that, however: bug reporting, triaging, patches, security fixes, bugfix revisions, testing, the whole release cycle. Usability problems are defects, no different from security holes, program crashes, or non-standard behaviour that breaks communication with other software -- they should be reported and fixed, and fixed globally, so that all users can benefit. I see little sense in allowing people to customize the syntax of HTTP headers in their web browser: not taking the time to read the specification and test against it is hardly a sign of humility, and pushing the task onto the users doesn't seem like a good idea, even if the author's knowledge of the subject is worse than that of most users of the software (and usually it isn't that bad). I think it's the same with user interfaces: "I can't design a good interface, so I will leave it to the users" is not a sign of humility but of incompetence. Refusing to fix usability problems, saying "here, I made it customizable, fix it yourself", doesn't sound like humility either.
The other changes seem to express a lack of trust in solutions that differ from the established ones. I can't really argue with statements like "the utility of all these changes is frequently questionable", because there is simply no information in them, apart from a suspicion that the author is probably indeed frequently questioning them. The fact that changing something makes it more useful for one thing and less for another can't really be disputed either, although the form in which it was noted carries a lot of emotion. It's not necessarily a bad thing to make a music player better at playing music, even if that makes it worse at file management or kernel debugging -- even if it means that some users will have to use a real kernel debugger from now on.
A last thing: in the comment to a change, the author remarked that Jef Raskin "lost for a reason". I don't think he lost: he did tremendous work researching human behaviour and evaluating different elements of user interfaces, and helped us all better understand how to design interaction to make it easier and less painful. He also introduced many solutions that are used today, and more of them appear in the user interfaces of our programs as new technologies and audiences open the way. -- RadomirDopieralski
At this point the only thing I can do is recommend reading the book; it's actually pretty fun to read, apart from the obvious benefit of the introduction to formal usability testing that it gives. -- RadomirDopieralski