I believe – and I am not entirely alone in saying this – that today’s Computer Science paradigms and philosophies are completely deranged in a great many ways.
For starters, computer architectures have become a bloated, complex and unintelligible bundle of hacks designed to perpetuate the same wrong ideas about computer programming and design. As John Backus said in his 1977 Turing Award lecture (35 years ago!):
Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.
In that same lecture, entitled ‘Can Programming Be Liberated from the von Neumann Style?’, he outlines possible alternatives to the von Neumann paradigm, along with their advantages and drawbacks, such as the functional and applicative approaches. Ironically enough, one year after this writing the Intel 8086 architecture was introduced to the market, and it remains the dominant architecture to this day, mainly because of the need for backward compatibility with legacy software. After a while, the new and exciting ideas proposed by Backus lost focus, and all the effort was concentrated on getting the most out of x86 and the von Neumann architecture. With it came the dominance of the C programming language, designed to work closely with the way the computer architecture behaved, all nuts and bolts considered. The exciting research about thinking differently simply vanished from the world.
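To make the contrast concrete, here is a loose sketch in C (not in Backus’s FP language) of his favorite example, the inner product. The first version is word-at-a-time von Neumann style; the second approximates his ‘combining forms’ idea, building the same program by composing whole-array operations. The helper names `map2` and `reduce` are my own illustration, not anything from the paper.

```c
#include <stdio.h>

/* Word-at-a-time von Neumann style: the program is a plan for
 * shuttling individual words between the store and the ALU. */
double ip_von_neumann(const double *a, const double *b, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

static double mul(double x, double y) { return x * y; }
static double add(double x, double y) { return x + y; }

/* Apply f pairwise across two whole arrays. */
static void map2(double (*f)(double, double),
                 const double *a, const double *b, double *out, int n) {
    for (int i = 0; i < n; i++) out[i] = f(a[i], b[i]);
}

/* Fold f over a whole array, starting from init. */
static double reduce(double (*f)(double, double), double init,
                     const double *xs, int n) {
    double acc = init;
    for (int i = 0; i < n; i++) acc = f(acc, xs[i]);
    return acc;
}

/* Applicative-style sketch: pairwise multiply, then sum-reduce.
 * No word addressing appears in the program text itself. */
double ip_applicative(const double *a, const double *b, double *tmp, int n) {
    map2(mul, a, b, tmp, n);          /* pairwise × */
    return reduce(add, 0.0, tmp, n);  /* / + (sum-reduce) */
}
```

The point of the second version is not efficiency (the loops are still there, hidden inside the forms) but that the programmer composes larger conceptual units instead of planning word traffic.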
The reasons why the von Neumann architecture, and its most prominent example, the x86, became the dominant way of thinking about computing are technical from some points of view and inertial from others. But the most important reason, I believe, is neatly demonstrated in a famous article called “The Rise of ‘Worse is Better’”, by Richard Gabriel. The article should be read in its entirety, but I bring to light an excerpt that exemplifies the difference between what he called the ‘worse is better’ approach and ‘the right thing’:
Two famous people, one from MIT and another from Berkeley (but working on Unix) once met to discuss operating system issues. The person from MIT was knowledgeable about ITS (the MIT AI Lab operating system) and had been reading the Unix sources. He was interested in how Unix solved the PC loser-ing problem. The PC loser-ing problem occurs when a user program invokes a system routine to perform a lengthy operation that might have significant state, such as IO buffers. If an interrupt occurs during the operation, the state of the user program must be saved. Because the invocation of the system routine is usually a single instruction, the PC of the user program does not adequately capture the state of the process. The system routine must either back out or press forward. The right thing is to back out and restore the user program PC to the instruction that invoked the system routine so that resumption of the user program after the interrupt, for example, re-enters the system routine. It is called “PC loser-ing” because the PC is being coerced into “loser mode,” where “loser” is the affectionate name for “user” at MIT.
The MIT guy did not see any code that handled this case and asked the New Jersey guy how the problem was handled. The New Jersey guy said that the Unix folks were aware of the problem, but the solution was for the system routine to always finish, but sometimes an error code would be returned that signaled that the system routine had failed to complete its action. A correct user program, then, had to check the error code to determine whether to simply try the system routine again. The MIT guy did not like this solution because it was not the right thing.
The New Jersey guy said that the Unix solution was right because the design philosophy of Unix was simplicity and that the right thing was too complex. Besides, programmers could easily insert this extra test and loop. The MIT guy pointed out that the implementation was simple but the interface to the functionality was complex. The New Jersey guy said that the right tradeoff has been selected in Unix: namely, implementation simplicity was more important than interface simplicity.
The MIT guy then muttered that sometimes it takes a tough man to make a tender chicken, but the New Jersey guy didn’t understand (I’m not sure I do either).
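The ‘extra test and loop’ from the anecdote is still with us today as the EINTR idiom, which every careful Unix program must carry around. A minimal sketch (`read_retry` is my own name for the wrapper):

```c
#include <errno.h>
#include <unistd.h>

/* The burden placed on the user program: a slow system call such as
 * read() may be cut short by a signal, returning -1 with errno set to
 * EINTR, and it is the caller's job to notice and try again. */
ssize_t read_retry(int fd, void *buf, size_t count) {
    ssize_t r;
    do {
        r = read(fd, buf, count);
    } while (r == -1 && errno == EINTR);
    return r;
}
```

Modern systems mitigate this with `SA_RESTART` on signal handlers, which makes the kernel restart some interrupted calls automatically, but it does not cover every call, so the loop above survives in portable code. The interface complexity, exactly as the MIT guy predicted, was pushed onto every user program.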
To put it shortly, the worse-is-better approach is simple to design and implement, and therefore potentially advances at a quick pace. However, it places on the programmer most of the burdens that should have been dealt with elsewhere. It gets worse when people start to think that worse is better is the only possible thing, and then decide to improve on top of it, resulting in complex, bloated and insane (the opposite of sane) designs that were not the right thing to begin with and can never be.
Worse is better is all around Computer Science today. It goes as deep as the computer architecture itself (the von Neumann design) and contaminates everything up the chain: the operating systems (Unix-like and C-oriented), the programming languages, the software. The ultimate result is unreliable, unstable, unnecessarily complex computer systems.
As an example of what a sane computing environment should be like, consider reading the article on the ‘Seven Laws of Sane Personal Computing’. Also highly recommended is a famous Alan Kay presentation entitled “The Computer Revolution Hasn’t Happened Yet”.
To close, I think something really important needs to be said: Computer Science is not about what is there. It is not about the tools we use; it is not about the computers themselves. Computer Science is about what could be. It is an abstract science strongly rooted in logic and mathematics, and the only limit is our imagination. The fact that we use computers to realize these visions doesn’t mean we are attached to them, and it doesn’t mean there is only one way to do things. We should be doing real computer science: we should be loosening the strings and starting to think about how everything could be entirely different. We need to start the computer revolution, because alas, it hasn’t happened yet.