Unix doesn't follow the Unix philosophy

The Unix philosophy is a noble idea, but even Unix doesn’t follow it very closely. As summarized by Doug McIlroy, the philosophy says:

  1. Write programs that do one thing and do it well.
  2. Write programs to work together.
  3. Write programs to handle text streams, because that is a universal interface.

Here is an example from James Hague where the first point has been strained.

The UNIX ls utility seemed like a good idea at the time. It’s the poster child for the UNIX way: a small tool that does exactly one thing well. Here that thing is to display a list of filenames. But deciding exactly what filenames to display and in what format led to the addition of over 35 command-line switches. Now the man page for the BSD version of ls bears the shame of this footnote: “To maintain backward compatibility, the relationships between the many options are quite complex.”

James Hague gives this as only one small example of how programmers have allowed things to become unnecessarily complicated. He concludes:

We did this. We who claim to value simplicity are the guilty party. See, all those little design decisions actually matter, and there were places where we could have stopped and said “no, don’t do this.” And even if we were lazy and didn’t do the right thing when changes were easy, before there were thousands of users, we still could have gone back and fixed things later. But we didn’t.

He’s right, to some extent. But as I argued in Where the Unix philosophy breaks down, some of the growth in complexity is understandable. It’s a lot easier to maintain an orthogonal design when your software isn’t being used. Software that gets used becomes less orthogonal and develops diagonal shortcuts.

Why does ls have dozens of tangled options? Because users, even Unix users, are not overly fond of the first two points of the Unix philosophy. They don’t want to chain little programs together. They’d rather do more with the tool at hand than put it down to pick up new tools. They do appreciate the ideal of single-purpose tools that work well together, but only in moderation.

I agree that “all those little design decisions actually matter, and there were places where we could have stopped and said ‘no, don’t do this.’” Some complexity has come from a lack of foresight or a lack of courage. But not all of it. Some of it has come from satisfying what complex humans want from their software.

Related post:

100x better approach to software?

31 thoughts on “Unix doesn't follow the Unix philosophy”

  1. Hi people

    For the ultimate critique of the Unix philosophy, read The UNIX-HATERS Handbook from 1994. The printed version came with a barf bag (I kid you not): http://www.simson.net/ref/ugh.pdf

    “Modern Unix is a catastrophe. It’s the “Un-Operating System”: unreliable, unintuitive, unforgiving, unhelpful, and underpowered. Little is more frustrating than trying to force Unix to do something useful and nontrivial. Modern Unix impedes progress in computer science, wastes billions of dollars, and destroys the common sense of many who seriously use it. An exaggeration? You won’t think so after reading this book.”

    They even let Ritchie write the anti-foreword: http://www.art.net/~hopkins/Don/unix-haters/DennisRitchie.gif

    He ends it thusly: “The UNIX-HATERS Handbook is a pudding stuffed with apposite observations, many well-conceived. Like excrement, it contains enough undigested nuggets of nutrition to sustain life for some. But it is not a tasty pie: it reeks too much of contempt and of envy. Bon appetit!”

  2. I really want to be able to dismiss these statements, but I can’t. In fact, I find myself reconsidering what it is about Unix that draws me to it. I hope it’s not just experience.

    Very powerful article.

  3. Regarding your statement that “even Unix users, are not overly fond of the first two points of the Unix philosophy”:

    I think that maybe a more accurate statement is that no one likes having *only* small sharp tools to work with. The whole point of having small focused tools is to be able to combine them, but that doesn’t mean the user should have to do that combining. Sometimes it’s nice to have one flexible tool, but that tool doesn’t have to be a monolithic construction. That tool can just as easily delegate to the small, focused tools.

    In that light, one would ask why the developers who evolved ls didn’t rather create a new tool that either sat beneath or wrapped around ls (or both) to introduce this flexibility.

  4. I think the real issue is “where the complexity goes”. What the “Unix Philosophy” appears to advocate is “Compiled programs should be simple; put the complexity in shell scripts.” For instance, one could have made “ls” do just file listing and had several other smaller programs to handle various filtering options and a shell script to control the wiring of the different programs. With “ls” (and other examples) it’s not so much the “philosophy breaking down” as it is “people not adhering to it” by putting the complexity in the C program.
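    The wiring described here can be sketched in shell. This is only an illustration, assuming GNU coreutils (`stat -c`, `sort`, `cut`); the `list_by_mtime` function is hypothetical, recreating the behavior of ls’s -t flag by composing smaller programs:

    ```shell
    # Hypothetical wrapper: list the given entries newest-first,
    # recreating "ls -t" by composing stat, sort, and cut.
    list_by_mtime() {
      for f in "$@"; do
        stat -c '%Y %n' -- "$f"   # emit "epoch-seconds name"
      done |
        sort -rn |                # largest epoch (newest) first
        cut -d' ' -f2-            # drop the timestamp column
    }
    ```

    Here the compiled programs stay simple and the policy (sort order, formatting) lives in the script, which is the division of labor the comment advocates. A real replacement would also have to cope with names containing newlines, which this sketch does not.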

  5. Joel: I like your point about where the complexity goes. There’s sort of a conservation law of complexity, at least a lower bound. Often we say where we don’t want our complexity, so the right question to ask next is where we do want our complexity. We don’t ask that enough.

  6. Please don’t mix up Unix and GNU they are very different things.
    Gnu is Not Unix.

  7. Hi,

    I’ve used Unix for over 30 years now – closing in far too rapidly on 40 years of daily use. The biggest issue with the “Unix Philosophy” to my mind is point 1. What’s unsaid is that a lot of those tools have not only command-line options but full-blown languages, all mutually incompatible, that you have to master in order to do much of anything. It’d be one thing if sed, awk, etc. all had something roughly approaching similar command languages, but they don’t.

    As for the growth of command line options, I blame all the attempts to “unify” SysV and BSD, which mostly came down to making sure that everyone’s favorite options were still available, and any collisions were dealt with “appropriately”.

    And let’s not even get started on the vi/emacs differences….


  8. No, modern UNIX wannabes don’t follow the UNIX philosophy.

    Which is exactly why I use OpenBSD.

  9. Well, “ls” does one thing and does it well; the “well” part came with all the options people seem to be complaining about.

    How could it do “well” for most people with none or only a handful of options?

    Would it be better to have one “ls” command with 30 different options, or two “ls”-like tools each with 15 options, or 30 “ls”-like tools each with a single option?

    It seems like “ls” is a bad example, since it does exactly what the first rule says: it only does one thing, and it does it well. Complaints against it, as far as the first rule goes, would be justified only if it did anything apart from displaying folder contents.

    When it comes to managing complexity, I think one tool with 30 switches is better than 30 tools each with a single option.

  10. Many of the flags in ls are there because of point 3: the text stream. If I want to sort the output of “ls -l” by time, I can’t really pipe into “sort” because the format of the time column isn’t lexically sortable. So you need the “-t” flag of “ls”. If we gave up on text and went with structured records, then you could sort by any column, even undisplayed ones. I believe PowerShell does things this way.
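    To make that concrete: GNU ls can be told to emit a lexically sortable time format, at which point a generic sort does what the -t flag was added for. A sketch, GNU coreutils only:

    ```shell
    # With --time-style=full-iso the ISO 8601 timestamp in columns
    # 6-7 sorts correctly as text, so plain sort orders by mtime.
    ls -l --time-style=full-iso | tail -n +2 | sort -k6,7
    ```

    The `tail -n +2` skips the “total” line. This only works because the timestamp was made sortable; with the default time format the comment’s point stands.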

  11. “Many of the flags in ls are there because of point 3: the text stream.”

    Absolutely. Sometimes text is what you want to process, but most of the time, it isn’t. Programming to the lowest common denominator of text was probably OK 40 years ago, but it’s not very useful for the majority of cases now. This has more to do with the shell than the OS, though.

    “I believe PowerShell does things this way”
    It does, and it’s ridiculously easy to map, filter, reduce, and sort anything.

    Here’s a rant from someone on the xkcd forums about this issue.

  12. I think you are right on the conclusion, but wrong on the reasons.

    It’s not that Unix users were unable or unwilling to chain together a series of small commands. Instead, it’s the text interface. While the text interface is incredibly universal and flexible, it doesn’t scale to complex data.

    Don’t get me wrong: using regular expressions you can parse anything. However, as that complexity increases with additional data, data types, and data relationships, the parsing becomes difficult. These problems are much easier to solve in a language that supports data structures. So, common repetitive tasks were coded in as switches.

    The truth is, it was not 1 & 2 that were the problem; it was 3. In addition to the simplification provided by a standardized text interface for every utility, there needs to be a standardized interface for handling structured data. Pipes in their current form, combined with shell scripts, don’t provide enough power to handle the range of capabilities that would be necessary to replace all of the switches.

    For example, if ls could provide all of its fields through the pipe and you could inline a smallish scripting language, then many switches would be unnecessary. Something like:
    ls | grep { structure.modifiedDate == today } | each { echo ‘${structure.name} was modded today’}

    In the end, it is a mess. I think that the concept is right. It’s just difficult to provide tools that enable writing software without having a programming language.
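    The closest present-day shell spelling of that sketch pushes the predicate into find rather than into a structured pipe. Illustrative only; assumes GNU find for -daystart and -printf:

    ```shell
    # Select entries in the current directory modified today and
    # print a message for each, like the inline-script example.
    find . -maxdepth 1 -daystart -mtime 0 -type f \
      -printf '%f was modded today\n'
    ```

    Note that the filtering logic had to move inside a single tool’s option language, which is exactly the commenter’s complaint about text pipes.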

  13. Ken Olson once said:

    “It is our belief, however, that serious professional users will run out of things they can do with UNIX. They’ll want a real system and will end up doing VMS when they get to be serious about programming. With UNIX, if you’re looking for something, you can easily and quickly check that small manual and find out that it’s not there. With VMS, no matter what you look for — it’s literally a five-foot shelf of documentation — if you look long enough it’s there. That’s the difference — the beauty of UNIX is it’s simple; and the beauty of VMS is that it’s all there.” — DECWORLD Vol. 8 No. 5, 1984

    What Ken failed to realize is that once people hit the limitations of what they could do with Unix (once they needed things like virtual memory, shared libraries, ACLs, memory-mapped I/O, etc.), they didn’t switch to VMS; they hacked those features into Unix and Unix-like operating systems.

  14. Mmm, that must be why the only sane way I’ve found to use `ls` without remembering all the command-line options is to pipe it to `grep`.

  15. The UNIX philosophy was good for the time it was invented: the time of 1 KB files and green-on-black terminals. Now it’s a ridiculous, ugly, and very limited set of “two-letter” programs. Music, video, executables, pictures: none of this is maintainable with grep/awk/perl/whatever. And as said, even ls is outside UNIX principles. So… we’re moving closer and closer to something like Windows! But too late: we already have an ugly-architected piece of sh*t, named MS Windows, occupying all our workstations. It’s time for a new OS, at least something like QNX or Minix.

  16. I wouldn’t say the Unix philosophy is out of date. I just think it needs to be applied loosely with some concession for human nature.

    The basic idea of piping small programs together is alive and well. For example, Microsoft has embraced this idea in PowerShell and in the F# language.

    The idea of text as the universal interface is more complicated. As some comments have indicated here, sometimes text is too unstructured. There can be huge advantages to piping around objects rather than text streams. On the other hand, text streams require less coordination, and so you may be able to get started faster gluing together two things that share text strings. But if you are able to coordinate, objects are a higher-level interface that’s better in the long run.

  17. The phenomenon you are talking about is to some extent the result of the fact that Unix/Linux are the product of hackers. When any programmer can cram any option he thinks is “neat” into a program you wind up with this. I think this shows nicely one of the main problems with open-source software: no one is controlling the design, thus there is no coherent design philosophy.

  18. The reason that ls is so complex is the interaction between the shell and the directories it is listing. If you want a text-based tool that streams to stdout, use find. The question I am asking myself is: is the output of ls in fact fit for stdout piped to stdin? I rarely use ls that way. The forest of options in ls does give me more helpful organization of a file listing than echo *, such as ls -lrt.

    Does that mean it needs to be a different application? Perhaps ls should be broken up into different applications, but a shell alias could also do that for you once you learn how to list what you want. Yanking out the hard-to-learn interactions might be worthwhile, but why has no one written a replacement? It is a one-stop-shopping kind of application where the attributes of files are used to list them, so perhaps that is OK in the end.

    I think Les is on to something: ls was the equivalent of a GUI app by today’s standards; it was just implemented as a ChUI because that is what they had at the time. Unix/Linux is not underpowered for the work it was designed to do; it does much more with less than its contemporaries did. But GUI systems are a completely different paradigm, so compare Linux vs Windows vs Mac vs Amiga for that.

    There is no end in sight for the ways that text can be used to process information. It can be used to prototype many things that then turn into the fundamentals of new programs. When was the last time you prototyped a new GUI using elements of other GUIs chained together? I don’t think Unix/Linux is somehow deficient because developers need a place to put irreducible complexity in the context where it will be used. It is hard to create something that is both a small sharp tool and an inventory of the domain it sits in. You might say that ls has succeeded in achieving this.

  19. 1. Well, there is ls and there is find. Two file listers that do similar jobs but each with its own twist. Now imagine that find and ls were combined. Now that’s complexity!!!

    2. And don’t forget head and tail and xargs to prune and post-process the list. Instead of building all that complexity into ls.

    3. I remember old issues of Byte magazine where they showed examples of DOS batch files to do “useful things”. What could be done in a few clear and elegant lines of sh, ksh, or bash was instead done using a convoluted mess that rambled for a half-page. By comparison, UNIX looked clear, concise, elegant, and even pretty. It’s all relative!

    4. UNIX is fragmented and ugly. It’s the worst OS, except for the alternatives.

    5. UNIX is not a trip to Paris. No software is. Only a trip to Paris is a trip to Paris.
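    The pruning and post-processing in point 2 can be spelled out; a small sketch:

    ```shell
    # Pruning and post-processing a listing with separate tools,
    # instead of building limits and filters into ls itself:
    ls -t | head -n 3      # the three most recently modified entries
    ls | tail -n 2         # the last two names in sort order
    ```

    Each pipeline keeps ls ignorant of the pruning policy; head and tail carry it instead.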

  20. Brian’s fourth point boiled it all down to the fundamental truth… All operating systems suck! *NIX just sucks less than all the rest. It’s a complex, convoluted, ungainly mass of programs littered throughout a filesystem on crack. But the simple fact is that, when compared to every other option, it’s the one you can use for any and all computing needs.

    But that’s just my opinion.

  21. The most non-UNIX tool, afaik, is Git: it is overloaded with flags, and there is more than one way (or even more than two) to do common tasks. No wonder it was made by the father of Linux.

  22. About Unix rule #3: it seems like M.S. really brought in media streams over text-alone streaming. I like the idea of keeping media and text files separate on a computer OS, but media files have no standards and the formats are still ever-growing, so we are in need of a new paradigm!

  23. It has paid the bills, and a bit more, for over 20 years; it’s been interesting and fun, and I’ve met many great people along the way. I would not have swapped Unix as a career choice over any philosophical point.

Comments are closed.