Yesterday I ran across Askar Safin’s blog post The Collapse of the UNIX Philosophy. Two quotes from the post stood out. One was from Rob Pike about the Unix ideal of little tools that each do one job:
Those days are dead and gone and the eulogy was delivered by Perl.
The other was a line from James Hague:
… if you romanticize Unix, if you view it as a thing of perfection, then you lose your ability to imagine better alternatives and become blind to potentially dramatic shifts in thinking.
This brings up something I’ve long wondered about: What did the Unix shell get right that has made it so hard to improve on? It has some truly awful quirks, and yet people keep coming back to it. Alternatives that seem more rational don’t work so well in practice. Maybe it’s just inertia, but I don’t think so. There are other technologies from the 1970s that had inertia behind them but have been replaced. The Unix shell got something so right that it’s worth tolerating the flaws. Maybe some of the flaws aren’t even flaws, but features that serve some purpose that isn’t obvious.
(By the way, when I say “the Unix shell” I have in mind similar environments as well, such as the Windows command line.)
On a related note, I’ve wondered why programming languages and shells work so differently. We want different things from a programming language and from a shell or REPL. Attempts to bring a programming language and shell closer together sound great, but they inevitably run into obstacles. At some point, we have different expectations of languages and shells and don’t want the two to be too similar.
Anthony Scopatz and I discussed this in an interview a while back in the context of xonsh, “a Python-powered, cross-platform, Unix-gazing shell language and command prompt.” While writing this post I went back to reread Anthony’s comments and appreciate them more now than I did then.
Maybe the Unix shell is near a local optimum. It’s hard to make much improvement without making big changes. As Anthony said, “you quickly end up where many traditional computer science people are not willing to go.”
Related post: What’s your backplane?
There are two things that strike me about UNIX, conceptually.
One is its use of text as a uniform bedrock stratum for everything; the idea that text ought to be the underlying reality from which all else is built. UNIX is one of my cases-in-point for my conviction that systems, to be truly stable, must do this, grounding themselves in text: programming languages; TeX/LaTeX; UNIX (to be clear: not just the shell, the whole shebang); HTML; wikis.
The other thing that strikes me about UNIX is its, well, /geometric/ view of the content of a system. My mental model of a Windows system is basically that it’s a pathologically complex tangle of stuff with random connections all over the place; sorry if that sounds harsh, but it’s what it feels like to me, and things like a “registry” just make it all more chaotic/spaghetti-like. But put me in a UNIX shell, and the whole system is a place (remember the term “cyberSPACE”? my emphasis, of course), a sort of building I compellingly feel I can wander around in, and find things housed in rooms and annexes built for them.

From my dorm room (er, home directory) I have stairways leading up to lofts where I stash my stuff of various kinds (subdirectories), and there’s one of those tight winding iron spiral staircases that goes all the way down to the sub-sub-basement (root directory), where the whole structure of the building is visible: stairways leading upward to major wings of the building, like the dormitory wing where the users live (/usr), campus plant services where the devices are housed (/dev), etc. (So help me, it’s like I can hear the heavy machinery thrumming down there.)

It seems as if our push to create more graphical interfaces has paradoxically moved us further /away/ from the cyberspace metaphor. Navigating the system through the UNIX shell is maybe a bit like playing Zork, an interactive text adventure without spiffy graphics to show you what’s going on (it is pitch black; you are likely to be eaten by a grue), but it has always been devilishly hard not to lose the essence of a good book when converting it into a movie.
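For what it’s worth, the whole tour fits in a handful of shell commands. The sketch below is my own illustration, not part of the comment, and on modern systems the users’ wing is usually /home rather than /usr.

    cd /       # the sub-sub-basement: the root, where the whole structure is visible
    ls         # the major wings: bin  dev  etc  home  usr  var ...
    ls /dev    # plant services: where the devices are housed
    cd ~       # back up the spiral staircase to the dorm room (the home directory)
    pwd        # e.g. /home/yourname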
I invite all to listen to Jeffrey Snover talking about his work on PowerShell, from yester-decade or later:
https://channel9.msdn.com/Blogs/TheChannel9Team/Jeffrey-Snover-Monad-explained
Michael: There are things I really like about PowerShell. And I like some of the things Jeffrey Snover and Bruce Payette have said about its design. Similar to Anthony Scopatz’s remarks, they found they had to make some compromises in PowerShell as a language in order to match the expectations of a shell.
I can still recall my two favorite shell items encountered when first using BSD 4.1 back in ’81: “|” and “&” (piping and process management).
Before then we certainly had scripting in the form of literal “batch” files, but output had to be written to temp files which were then read by the next command. Ugh.
To access multitasking we had to ask the job control system to fire multiple batch files for us or (later) have multiple terminal sessions open at once (the shift from multiple physical terminals to ‘screen’ was near-miraculous). And to get these instances talking to each other we had to use (wait for it) more text files. Double-ugh.
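A small sketch of the contrast (the log file and report names are my own illustration, not from those systems): the temp-file relay versus one pipeline, pushed into the background with “&”.

    # The batch-file way: every step parks its output in a temp file
    # for the next step to read
    grep ERROR app.log > /tmp/step1.txt
    sort /tmp/step1.txt > /tmp/step2.txt
    uniq -c /tmp/step2.txt > error_report.txt

    # The shell way: "|" wires the steps together, and "&" runs the whole
    # job in the background while you keep working
    grep ERROR app.log | sort | uniq -c > error_report.txt &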
Accessing similar functionality in a language REPL takes far more than a single character!
I love Python and I love Bash, but Xonsh left me cold. I could be a fossil.
The POSIX shell got at least two things right: efficiency and user empowerment.
The eternal quest in computing is to get something done as efficiently as possible. That’s literally why computers were invented (there was math that needed to be done, and long division is ugly). So having a shell that talks more or less directly to the computer, so that you can make exactly what you want happen, is really powerful. Especially on a POSIX system, you’re writing only a few levels above what the computer is actually doing; it’s the API to your OS.
Secondly, the ability to use pipes and scripts to construct personalised extensions to the commands you’re given is huge. It means that everyday users can find the efficient way to do a series of things, and then make it all happen with one command.
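As a sketch of that kind of personalised extension (the function name and the pipeline are a made-up example, not from the comment): a few piped commands wrapped in a function so that a whole series of steps becomes one word.

    # Hypothetical example: report the largest files under the current
    # directory, wrapped as a shell function so the pipeline becomes one command
    biggest() {
        du -a . 2>/dev/null | sort -rn | head -n "${1:-10}"
    }

    # Now "biggest" is a personal extension of the command set:
    #   biggest       # ten largest files
    #   biggest 25    # twenty-five largest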
That’s it, in a nutshell. I disagree strongly with the statement that the UNIX philosophy is dead, because it’s still at work today, from the cloud to worldwide collaboration and version control. However, I agree that it’s not holy or perfect, so I’m fine with it changing over time. But the core idea is still here, and until something better does come along, I’ll keep it.
For a modern shell, both for scripting and doing OS-level stuff, check out http://www.lihaoyi.com/Ammonite/#Ammonite-Shell
Ammonite shell?
So you need to be running a JVM and layer Scala on top of it?
To get a useful command line?
NO. Just NO.
Why not just port PowerShell to Linux?
The biggest thing IMO is that it made the transition from simple commands to complex scripts *seamless*. You can start with that single command, pipe its output through a filter or two, add a loop to process each item, a few branches to handle special conditions, and before you know it you have an actual script likely worth saving.

This is also its greatest weakness, BTW. It’s why we have tons of scripts that do insufficient error or input checking and do terrible things if, for example, a file name contains whitespace. I’ve never really used it, but I’ve heard that PowerShell took a couple of steps in the right direction here. It’s too bad that most people in the Linux world don’t *want* the right direction, but that’s a different topic.
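A sketch of both the seamless ramp and the trap (the commands and file names are my own illustration, not from the comment): the same task as a throwaway one-liner, as the careless loop that breaks on whitespace, and as the version worth saving.

    # Step 1: a single command
    wc -l *.txt

    # Step 2: "grow" it with a loop -- and hit the classic trap:
    # the unquoted $(ls ...) and $f split on whitespace, so a file named
    # "my notes.txt" becomes two bogus arguments
    for f in $(ls *.txt); do wc -l $f; done

    # Step 3: the script worth saving: let the shell glob, quote every
    # expansion, and handle the no-match case
    for f in ./*.txt; do
        [ -e "$f" ] || { echo "no .txt files here" >&2; exit 1; }
        wc -l -- "$f"
    done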