The Rise of ``Worse is Better''
By Richard Gabriel
I and just about every designer of Common Lisp and CLOS have had extreme exposure to the MIT/Stanford style of design. The essence of this style can be captured by the phrase ``the right thing.'' To such a designer it is important to get all of the following characteristics right:
- Simplicity--the design must be simple, both in implementation and interface. It is more important for the interface to be simple than the implementation.
- Correctness--the design must be correct in all observable aspects. Incorrectness is simply not allowed.
- Consistency--the design must not be inconsistent. A design is allowed to be slightly less simple and less complete to avoid inconsistency. Consistency is as important as correctness.
- Completeness--the design must cover as many important situations as is practical. All reasonably expected cases must be covered. Simplicity is not allowed to overly reduce completeness.
I believe most people would agree that these are good characteristics. I will call the use of this philosophy of design the ``MIT approach.'' Common Lisp (with CLOS) and Scheme represent the MIT approach to design and implementation.
The worse-is-better philosophy is only slightly different:
- Simplicity--the design must be simple, both in implementation and interface. It is more important for the implementation to be simple than the interface. Simplicity is the most important consideration in a design.
- Correctness--the design must be correct in all observable aspects. It is slightly better to be simple than correct.
- Consistency--the design must not be overly inconsistent. Consistency can be sacrificed for simplicity in some cases, but it is better to drop those parts of the design that deal with less common circumstances than to introduce either implementational complexity or inconsistency.
- Completeness--the design must cover as many important situations as is practical. All reasonably expected cases should be covered. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized. Consistency can be sacrificed to achieve completeness if simplicity is retained; especially worthless is consistency of interface.
Early Unix and C are examples of the use of this school of design, and I will call the use of this design strategy the ``New Jersey approach.'' I have intentionally caricatured the worse-is-better philosophy to convince you that it is obviously a bad philosophy and that the New Jersey approach is a bad approach.
However, I believe that worse-is-better, even in its strawman form, has better survival characteristics than the-right-thing, and that the New Jersey approach when used for software is a better approach than the MIT approach.
Let me start out by retelling a story that shows that the MIT/New-Jersey distinction is valid and that proponents of each philosophy actually believe their philosophy is better.
Two famous people, one from MIT and another from Berkeley (but working on Unix) once met to discuss operating system issues. The person from MIT was knowledgeable about ITS (the MIT AI Lab operating system) and had been reading the Unix sources. He was interested in how Unix solved the PC loser-ing problem. The PC loser-ing problem occurs when a user program invokes a system routine to perform a lengthy operation that might have significant state, such as IO buffers. If an interrupt occurs during the operation, the state of the user program must be saved. Because the invocation of the system routine is usually a single instruction, the PC of the user program does not adequately capture the state of the process. The system routine must either back out or press forward. The right thing is to back out and restore the user program PC to the instruction that invoked the system routine so that resumption of the user program after the interrupt, for example, re-enters the system routine. It is called ``PC loser-ing'' because the PC is being coerced into ``loser mode,'' where ``loser'' is the affectionate name for ``user'' at MIT.
The MIT guy did not see any code that handled this case and asked the New Jersey guy how the problem was handled. The New Jersey guy said that the Unix folks were aware of the problem, but the solution was for the system routine to always finish, though sometimes an error code would be returned that signaled that the system routine had failed to complete its action. A correct user program, then, had to check the error code to determine whether to simply try the system routine again. The MIT guy did not like this solution because it was not the right thing.
The New Jersey guy said that the Unix solution was right because the design philosophy of Unix was simplicity and that the right thing was too complex. Besides, programmers could easily insert this extra test and loop. The MIT guy pointed out that the implementation was simple but the interface to the functionality was complex. The New Jersey guy said that the right tradeoff had been selected in Unix--namely, implementation simplicity was more important than interface simplicity.
The MIT guy then muttered that sometimes it takes a tough man to make a tender chicken, but the New Jersey guy didn't understand (I'm not sure I do either).
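For concreteness, the Unix convention in the story survives to this day: in POSIX terms, a slow system call interrupted by a signal before it transfers any data fails with errno set to EINTR, and it is the caller's job to retry. Here is a minimal C sketch of the ``extra test and loop'' the New Jersey guy had in mind (the wrapper name read_retry is mine, purely for illustration):

    #include <errno.h>
    #include <unistd.h>

    /* Read from fd, retrying whenever a signal interrupts the call.
       The kernel simply reports EINTR rather than transparently
       resuming the operation; the burden of resuming falls on the
       user program, one loop like this per call site. */
    ssize_t read_retry(int fd, void *buf, size_t count)
    {
        ssize_t n;
        do {
            n = read(fd, buf, count);
        } while (n == -1 && errno == EINTR);
        return n;
    }

The kernel stays simple; the price is that every careful caller carries this loop, which is exactly the interface complexity the MIT guy objected to.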
Now I want to argue that worse-is-better is better. C is a programming language designed for writing Unix, and it was designed using the New Jersey approach. C is therefore a language for which it is easy to write a decent compiler, and it requires the programmer to write text that is easy for the compiler to interpret. Some have called C a fancy assembly language. Both early Unix and C compilers had simple structures, were easy to port, required few machine resources to run, and provided about 50%--80% of what you want from an operating system and programming language.
Half the computers that exist at any point are worse than median (smaller or slower). Unix and C work fine on them. The worse-is-better philosophy means that implementation simplicity has highest priority, which means Unix and C are easy to port to such machines. Therefore, one expects that if the 50% functionality Unix and C support is satisfactory, they will start to appear everywhere. And they have, haven't they?
Unix and C are the ultimate computer viruses.
A further benefit of the worse-is-better philosophy is that the programmer is conditioned to sacrifice some safety and convenience, and to put up with some hassle, to get good performance and modest resource use. Programs written using the New Jersey approach will work well both on small machines and large ones, and the code will be portable because it is written on top of a virus.
It is important to remember that the initial virus has to be basically good. If so, the viral spread is assured as long as it is portable. Once the virus has spread, there will be pressure to improve it, possibly by increasing its functionality closer to 90%, but users have already been conditioned to accept worse than the right thing. Therefore, the worse-is-better software first will gain acceptance, second will condition its users to expect less, and third will be improved to a point that is almost the right thing. In concrete terms, even though Lisp compilers in 1987 were about as good as C compilers, there are many more compiler experts who want to make C compilers better than want to make Lisp compilers better.
The good news is that in 1995 we will have a good operating system and programming language; the bad news is that they will be Unix and C++.
There is a final benefit to worse-is-better. Because a New Jersey language and system are not really powerful enough to build complex monolithic software, large systems must be designed to reuse components. Therefore, a tradition of integration springs up.
How does the right thing stack up? There are two basic scenarios: the ``big complex system scenario'' and the ``diamond-like jewel'' scenario.
The ``big complex system'' scenario goes like this:
First, the right thing needs to be designed. Then its implementation needs to be designed. Finally it is implemented. Because it is the right thing, it has nearly 100% of desired functionality, and implementation simplicity was never a concern so it takes a long time to implement. It is large and complex. It requires complex tools to use properly. The last 20% takes 80% of the effort, and so the right thing takes a long time to get out, and it only runs satisfactorily on the most sophisticated hardware.
The ``diamond-like jewel'' scenario goes like this:
The right thing takes forever to design, but it is quite small at every point along the way. To implement it to run fast is either impossible or beyond the capabilities of most implementors.
The two scenarios correspond to Common Lisp and Scheme.
The first scenario is also the scenario for classic artificial intelligence software.
The right thing is frequently a monolithic piece of software, but for no reason other than that the right thing is often designed monolithically. That is, this characteristic is a happenstance.
The lesson to be learned from this is that it is often undesirable to go for the right thing first. It is better to get half of the right thing available so that it spreads like a virus. Once people are hooked on it, take the time to improve it to 90% of the right thing.
A wrong lesson is to take the parable literally and to conclude that C is the right vehicle for AI software. The 50% solution has to be basically right, and in this case it isn't.
But one can conclude only that the Lisp community needs to seriously rethink its position on Lisp design. I will say more about this later.