
Fred Brooks vs Dijkstra?

In the '60s, Fred Brooks led the development of the IBM/360 system and its OS. The experience led him to write the famous The Mythical Man-Month.

It turns out that Dijkstra also wrote about that system, in his EWD255. And everything he complained about in that system is the standard nowadays! It sounds like he considered that a lot of the complexity that should be handled by the computer was being dumped on the programmer; that the features touted for the IBM/360's OS were in fact partial fixes for the bad design of the computer:

The unbelievable thing is that this machine is now being advertised with the "wonderful operating system" that relieves the programmer of so many tasks! This glosses over the facts that:
  1. a large part of these tasks are necessitated by the hardware and would have been easier or non-existent in a better design; (*)
  2. this operating system implies an alarming overhead (but IBM is happy to supply a faster model from the family);
  3. the tasks have not been obviated, but have been transferred to the computing centre's management, which has to dimension and manage the machine
  4. the creation of this operating system turned out to be a programming task beyond the capabilities of the manufacturer; (**)
  5. that the system has become such a baroque monstrosity that no user can venture to adapt it.
In (*), he criticizes that programs have to address memory instead of information, which dumps the responsibility of memory management onto the programmer.

About (**), Dijkstra comments in EWD1303 that he was not surprised when the OS turned out never to be properly completed:
In order to make errors reproducible they equipped the IBM/360 with a special hardware monitor; in the recording mode it would capture precisely where in the computation each interrupt took place, in the controlling mode it would force the computer to replay exactly the recorded computation, thus making intermediate results available for inspection. It struck me as a crude and silly way of proceeding, and I remember being grateful that lack of money had protected me from making that methodological mistake. I felt grateful and superior, and when a few years later it became clear that IBM couldn’t get its OS/360 right, I was not amazed.
He is dissing as a “methodological mistake” what even now would be considered a pretty good debugger!

So, according to Dijkstra, we are pretty much living in hell. I would love to know what they had back then.

SICP says early on that computer science does not study the computer, in the same way that astronomy does not study the telescope. Maybe we have really started looking at the finger and missed the moon?

But back to Brooks: in 1986 he wrote No Silver Bullet, where the main recommendations are:
  • Buy software, do not build it; in particular, given cheap software, people will adjust to it instead of wanting to adjust the software. So he recommends providing users with a generic word processor, spreadsheet and drawing program, so they are pretty much self-sufficient for anything (instead of building special tools for each company). Cue MS Office.
  • The customer does not know, cannot know, what s/he wants. So he recommends fast turnaround, fast iterative development: cue the Agile movement.
  • He talks about using OOP to remove "accidental difficulty" from the process, “allowing the designer to express the essence of the design without having to express large amounts of syntactic material that add no information content”…
…but he stops at OOP? Makes me think that he was not very informed about functional programming, given that OOP is (1) trivial to add to functional programming languages, (2) dissed by all the greats as uninteresting – or worse.

Interestingly, in 1987 the second AI winter started.

Says Brooks, 
…well over half of the time you spend working on a project (on the order of 70 percent) is spent thinking, and no tool, no matter how advanced, can think for you. Consequently, even if a tool did everything except the thinking for you – if it wrote 100 percent of the code, wrote 100 percent of the documentation, did 100 percent of the testing, burned the CD-ROMs, put them in boxes, and mailed them to your customers – the best you could hope for would be a 30 percent improvement in productivity. In order to do better than that, you have to change the way you think.
70% of the time thinking – but how much of that time is spent breaking down the ideas into small-enough morsels that can be fed to the computer? That is, how much time is spent being what Paul Graham calls "the human compiler"? Wouldn't that be fixed by using higher-level languages?

In this same direction, Brooks writes something that doesn't stand up to scrutiny:
The complexity of software is an essential property, not an accidental one
That's in fact like throwing your hands up and admitting defeat. But that complexity is much finer-grained than that, and the distinctions have a big influence on the whole process.

A trivial example: in 2010 I had to write a piece of software using Borland C++ – an IDE and environment from about 1998. The IDE itself had a crap help system, and so did the bundled libraries; the net was practically devoid of information or help about any of it. Add to that the extra bugs that came from running the still-barely-Windows-95 IDE, and its products, on Windows XP.

Of course I WOULD have used any number of free, current-day systems instead of that IDE. But I HAD to use it. And that mandate made a lot of things take around twice the time and effort they should have taken.
That is an extreme example of forced complexity, yes; it's extreme just to drive the point home clearly. But we are so used to milder examples of arbitrary complexity that I believe we are swimming in a sea of such forced complexity.

For example… Do you already use restricted pointers in your C code?
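If not, here is a minimal sketch of what restrict buys you (the function and its name are made up for illustration): by qualifying both pointers you promise the compiler that dst and src never overlap, so it can keep values in registers, reorder the accesses or vectorize the loop; if the caller breaks that promise, the behaviour is undefined.

    #include <stddef.h>

    /* restrict is a promise to the compiler: dst and src do not overlap,
       so loads from src can be hoisted and the loop vectorized.
       Passing overlapping arrays makes the behaviour undefined. */
    void scale_add(size_t n, double *restrict dst,
                   const double *restrict src, double k)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] += k * src[i];
    }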
Or… Are you at least familiar with the strict aliasing rules that have been in place since C99? You know, 16 years ago: the rules that gcc enforces if you compile as C99 or C11 with optimization on, even though it probably won't warn you when you break them – and will instead happily exploit the resulting Undefined Behaviour.
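For the unfamiliar, a hedged sketch of the classic trap (function names invented for the example): reading a float's bits through an unsigned pointer breaks the effective-type rules, so gcc at -O2 may assume the two lvalues never refer to the same object; memcpy is the well-defined way to do the same thing, and -fno-strict-aliasing is the traditional escape hatch when you can't fix the code.

    #include <string.h>

    /* Violates strict aliasing: a float object is read through an lvalue
       of type unsigned, which the effective-type rules do not allow.
       With optimization on, the access may be reordered or dropped. */
    unsigned float_bits_wrong(float f)
    {
        return *(unsigned *)&f;   /* undefined behaviour */
    }

    /* Well-defined version: copy the bytes instead. Assumes, as on most
       current platforms, that unsigned and float have the same size. */
    unsigned float_bits_ok(float f)
    {
        unsigned u;
        memcpy(&u, &f, sizeof u);
        return u;
    }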
Oh, and did you know that GCC 5 defaults to gnu11 (C11 plus GNU extensions), instead of the gnu89 that was the default up to GCC 4.x?
This transition will be a good moment to notice the hike in complexity. But surely we'll soon get used to it, and the peak will drown in the normal noise level. Because, hey, it's only natural that the compiler needs help from the "human compiler" to optimize things, right?
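If you want to check what your own gcc is defaulting to, here is a small sketch (assuming no -std flag is hiding somewhere in your build): GCC 5 with no flags prints 201112, while a gnu89/C89 build falls through to the #else branch because that standard does not define the macro.

    #include <stdio.h>

    /* Prints the language standard the compiler is actually targeting.
       gcc 5 (default gnu11) prints 201112; gcc 4.x (default gnu89)
       takes the #else branch, since C89 does not define the macro. */
    int main(void)
    {
    #ifdef __STDC_VERSION__
        printf("__STDC_VERSION__ = %ld\n", (long)__STDC_VERSION__);
    #else
        puts("C89/C90 mode: __STDC_VERSION__ is not defined");
    #endif
        return 0;
    }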


A great post on Brooks vs Functional: http://paulspontifications.blogspot.com/2007/08/no-silver-bullet-and-functional.html

And a bit (well, much more than a bit) of OOP bashing:
http://www.smashcompany.com/technology/object-oriented-programming-is-an-expensive-disaster-which-must-end

UPDATE: another paper, from 2006, "Out of the Tar Pit", deals with the subject of "No Silver Bullet" vs functional (and other) programming, and makes a far more convincing point than I could – while delving into functional and logic programming (and more!), and how they can fix things.



Discussion on Hacker News

Comments

  1. > …but he stops at OOP? Makes me think that he was not very informed about functional programming, given that OOP is (1) trivial to add to functional programming languages, (2) dissed by all the greats as uninteresting – or worse.

    The OOP bashing is mostly complaints about the complexity of C++ as a language and the enterprise-imposed ceremony in Java, but nothing really interesting. It's mostly complaints from people who rant, not from the actual greats who program the stuff we use every day.

    In the real world, almost every OS (Windows, OS X, iOS, Android), every desktop application (Word, Photoshop, Chrome, Firefox, Pro Tools, any AAA game, etc.), every major dev tool (except Emacs), every major service you use today (e.g. Google Search, Gmail), and every major programming language has been written using OOP (or more basic procedural) principles. They have adopted this or that not-really-functional concept first pioneered in Lisp and the like (e.g. GC, closures, vectors, etc.), but they are not going functional any time soon.

    Here's how a Lisp expert, functional master, and head of AI at Google puts it: http://norvig.com/Lisp-retro.html

    Replies
    1. "almost every OS (Windows, OS X, iOS, Android)" -> OS X is written in C, like Linux, like Android. The whole Unix-derived world is C.
      So, unless you want to argue that using structs and function pointers is OOP, you're wrong.

      ... and if you DO want to argue that, then you're reducing OOP to using data structures, which 1) no one will tell you is bad in itself, 2) proves how tenuously defined OOP is.

      Maybe you meant the GUIs of those OSes - then I'd argue we need to tighten definitions.

      "In the real world" you say - you mean in the world in which Unix and C won, the "Worse is better" world, the world where we got C's Undefined Behavior and related buffer overflows and insecurity by definition, and in which even C++ is seeing value in adding lambdas.

      So, yes, "the real world" is an intriguing summary of the problem.

      Regarding the link - it summarizes as "Lisp is less popular than other languages". What is the point?

  2. If you're only using objects as a module system with mutable state, as opposed to using object-oriented principles to organize code and reduce repetition through inheritance, you're not writing an object-oriented program. You're writing procedural code with more pointless ceremony. And guess what: Most varieties of functional programming language support that without the need for the ceremony.

