Unix advocates often say Unix is great because it has all these powerful tools. And yet practically every Unix tool has been ported to Windows. So why not just run Unix tools on Windows so that you have access to both tool sets? Sounds reasonable, but hardly anyone does that. People either use Unix tools on Unix or Windows tools on Windows.
Part of the reason is compatibility. Not binary compatibility, but cultural compatibility. There’s a mental tax for shifting modes of thinking as you switch tools.
I think the reason why few people use Unix tools on Windows is a sort of negative space. Artists use the term negative space to discuss the importance of what is not in a work of art, such as the white space around a figure or the silence framing a melody.
Similarly, part of what makes an operating system culture is what is not there. You don’t have to worry about what’s not there. And not worrying about something frees up brain capacity to think about something else. Having too many options can be paralyzing. I think that even though people say they like Unix for what is there, they actually value what is not there.
24 thoughts on “Negative space in operating systems”
Makes sense: people say they like things because they sit in just the right place for them on the complexity/simplicity scale, in terms of their needs.
The first thing I used to do was to install all the Unix tools on Windows. I’ve stopped that practice in the hopes of giving the default Windows tools a chance.
Oh it is painful. So painful.
I started out programming on Unix. When I moved over to Windows, I brought my favorite Unix tools with me: emacs, grep, etc. I soon decided that it was easier to go native. Every so often I’d try to come up with a best-of-both-worlds synthesis, but it never worked. It’s easier to use one tool chain or the other.
I admit I have not used the fabled PowerShell, but many Unix tools on Windows leave a bitter aftertaste, mostly because they were conceived and implemented in a bash/tcsh/korn context. ipython, for example, feels like a hack. MinGW is not nearly as well integrated into Windows. Try to add libraries: in Linux, adding ATLAS is one click away; in Windows, it’s complicated. I could provide more examples, but you get the point. PowerShell is built from the ground up, hence the difference. However, I venture to say that a lot of people used bash/python/perl for their scripting needs, and very few use PowerShell (I learned about it on this blog, and I work for a big multinational IT company…).
To every OS its role and strengths. Windows is not very friendly toward ported tools, but has the best consumer applications.
A lot of people continued/continue to do stuff from the command line using old DOS commands. I’m not sure whether that’s even still possible in a modern Windows OS. But in, say, Windows 95 it was still often easier to pop open the command window and fix a problem there than to try to navigate the GUI.
It’s funny, but Microsoft has discovered the command line and created an object oriented shell, PowerShell. It’s becoming easier to do administration via scripts, especially things like working with Active Directory and Exchange.
Does PowerShell come with Vista by default? I think not.
You can get Windows to be Unix-like, as you could get any OS to mimic any other OS… however, the devil is in the details. Small things that you will never get working like you need to.
You’re right. Making Window Unix-like or making Unix Windows-like is never going to be satisfying. When in Rome, do as the Romans do.
I’m surprised nobody has brought up OS X. Apple does a great job of managing negative space.
As for PowerShell, it did not ship with Vista by default, but it is supposed to ship with Windows 7. And it ships with the server editions.
“I think the reason why few people use Unix tools on Windows is a sort of negative space”
Yes, Windows is a negative space, especially for the Unix tools. Just have a look at the file(system) naming scheme under Windows, which broke all the standard Unix tools. Cygwin tried to work around such behaviour, but you can forget about moving a script from any Unix-like operating system to Windows. It’s better to stay on Unix.
I didn’t mean “negative” to have a pejorative connotation. “Negative space” is just the term artists use. It’s an unfortunate term; simply saying “space” would be more accurate. I guess it’s “negative” in the sense of a photographic negative, inverting your usual perception. In any case, more is not always better, and what’s not there can be valuable.
I will simply add that a Windows port of a large collection of Unix tools exists at . In order to make many of them work, it was necessary to make a mapping for a lot of underlying concepts (i.e. things like having a “/dev” included in the filesystem).
I see two reasons why many Unix tools feel »wrong« on Windows: First of all, they are abstracted to Unix/Linux, not generally. That means they rely on the shell doing things in a certain way (e.g. globbing) which the Windows shell does not, among other things. Then there are cultural differences and things you’d never do on the other system. I hate poorly-ported programs that insist on creating a .foobar folder in my home directory – those aren’t hidden by default and therefore get in my way. On Windows there is AppData for that stuff, which still resides under my profile, but differently. Likewise I’ve crashed a few makefiles or build scripts by trying to build stuff in a folder that included spaces in its name. Not that that cannot be done on Unix (I did, after all), but nearly no one expects it (whereas on Windows, especially with the awkwardly-named »Documents and Settings«, it is at least something many people are aware of).
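The globbing point is easy to demonstrate: on Unix the shell expands wildcards before a program ever runs, while under cmd.exe a program receives the literal pattern and must expand it itself. A minimal sketch (count_args is a made-up stand-in for any ported tool):

```shell
# Work in a fresh scratch directory with three known files.
cd "$(mktemp -d)"
touch a.txt b.txt c.txt

# Stand-in for a ported tool: reports how many arguments it received.
count_args() { echo "$#"; }

count_args *.txt     # the Unix shell expands the pattern: prints "3"
count_args '*.txt'   # quoted, so no expansion: the tool sees the literal
                     # pattern, as it would under cmd.exe: prints "1"
```

A tool written for Unix simply assumes the first behavior; ported to cmd.exe, it suddenly has to do its own wildcard expansion.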
I think I’m mostly native on Windows, although I have large parts of GNUWin32 installed. However, the only thing I use on a regular basis is wc -l, and it’s not that that couldn’t be approximated with a quick find /c /v "". And poorly-ported programs that literally try opening a file named * force you to wrap for %x in (*) do foobar %x around them. You’ll eventually know this stuff and adapt, but indeed, it gets painful after a while. If PowerShell started faster than 2 seconds I’d probably use it a lot more (well, and I might need to look into the PowerShell integration for Far again since I’m there most of the time, command-line-wise).
I am constantly working in both worlds, Unix and Windows. I find the operating system environments necessary, but annoying, and do a fair bit of scripting and regex-based pattern rewriting to prepare data for analysis. Then I bring things into R and all is well.
While I am trying to do more and more within the R native environment, I’m sure I’ll never leave the inconvenience of operating systems completely behind.
I tried Unix tools ported to Windows, but it was incomplete until I discovered SoftIntegration’s Ch environment and C/C++ interpreter. (I tend to only use the C part.) It has a shell, and you can work Unix commands there to your heart’s content. (Yes, sorry, Ch is not free in its most expansive version.)
Still, having Unix in hand doesn’t solve all the things I need to do, whether a simulation like Ch or Linux. Sometimes I need to write things afresh (in C), and sometimes it’s worth using a good, paid-for product, like a regex-endowed editor, GREP-like facilities, and occasionally a DIFF-like tool. For my work, each of these needs to handle gigabyte-sized files without complaining. Even poor (Debian?) Linux seems challenged by some of the files common these days: 8-gigabyte files make some of the tools just choke.
I use windows.
I use cygwin.
I use mingw.
Here’s the thing: those unix tools in windows are only superficially unix tools.
For example, I often have to deal with permissions issues where windows just won’t work with the permissions set up using the unix tools.
For example, I often have cp fail on a networked drive, because someone added a feature where cp is sensitive to file timestamps and the network drive does not support those semantics.
For example, xargs by default isn’t really even a unix tool (you need gnu xargs -d '\n' for that). And the shell shares a similar issue (try IFS="$(printf '\n')" some time; it’s broken by design).
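The “broken by design” jab is that command substitution strips trailing newlines, so the obvious way to set IFS to a newline silently sets it to the empty string. A minimal POSIX-shell sketch:

```shell
# Command substitution strips trailing newlines, so this assigns
# the EMPTY string, not a newline:
IFS="$(printf '\n')"
[ -z "$IFS" ] && echo "IFS is empty"    # prints "IFS is empty"

# Common workaround: append a sentinel character, then strip it off.
IFS="$(printf '\nx')"
IFS="${IFS%x}"

# Verify IFS is now exactly one character (the newline):
echo "${#IFS}"                          # prints "1"
```

In shells that support it, IFS=$'\n' avoids the dance entirely, but that is not portable POSIX sh.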
But the real problem is that windows has been designed so intensely to be inefficient when used via unix tools. Process creation is tediously slow (process level modularity bad for windows, use threads instead). I sometimes fire up a unix-y process on my multi-core windows box, and then when I get impatient I go fire up a virtual (and otherwise slow) ec2 box which then completes the same process long before the windows instance completes.
So.. yes… negative space… but even more than that: negative time. And by “negative time” I mean “time that you do not have to spend, in unix, that you have to spend, in Windows”.
Apart from the culture, I think the integration is lacking. Even with bash, grep, vim, etc. under Windows, you cannot grep/sed the Windows registry, Windows system programs are not really designed to be piped, etc.
Good point. Unix has a design principle that configuration is in text files by default. Under Windows, some things are text, most are binary, some are XML, …
Although the Windows OS components are not designed to be piped together, the new PowerShell APIs for manipulating them are designed to be piped together.
That said, “pipe” means something different in PowerShell than in Unix. Radically more complex, in my opinion. The Unix system feels something like working with monads. The PowerShell system feels something like working without them.
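To make the contrast concrete, here is a sketch of a Unix text pipeline: each stage re-parses flat text by positional convention, whereas the rough PowerShell equivalent (shown only as a comment; names are illustrative) passes structured objects whose properties are addressed by name.

```shell
# Unix pipes carry bytes. Here, sort has to be told that the second
# whitespace-delimited column holds the number to sort on.
printf 'alice 42\nbob 7\ncarol 19\n' \
  | sort -k2,2n \
  | head -n 1        # prints "bob 7" (the row with the smallest value)

# The rough PowerShell equivalent pipes objects, not text, so the
# property is named rather than inferred from column position:
#   $people | Sort-Object Count | Select-Object -First 1
```

Whether the object pipeline’s extra machinery is worth giving up the “everything re-parses text” convention is exactly the monads-or-not disagreement above.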
When I had to use Windows in a corporate environment, I had cygwin + the Windows port of the blackbox window manager. After two years like that… “Put lipstick on a pig, it’s still a pig…”
Of course, some people like pigs. My cousins once had a pet pig (they named it “Hamlet”)…
I will totally agree with you that some aspects of Linux, and the various Unixes, are so utterly superior to Windows that it’s not even funny. But there’s a lot of talent that’s gone into providing things within the Windows environment, and some of that work has not been equalled elsewhere. (Personally, for example, I think Fiddler is superior to Charles, where it works. Still, Charles can be good enough, and will work on machines where Fiddler is inert and useless.)
For what little my thoughts are worth as a non-developer: I found Linux was much more open and invited the user to learn about it, whereas Windows (which I had used exclusively up to that point) seemed to do its best to ensure you didn’t understand and stayed ignorant. It wasn’t simply that technical skills weren’t necessary; it seemed as if they were actively discouraged, to enable or even induce incuriosity. Linux was the polar opposite, not only inviting learning with its open nature but almost necessitating it.
Now, I utilize shell scripts and terminal on a daily basis simply to optimize my everyday computing experience, in both Linux and in Windows using Cygwin (more accurately a “weekly” basis here, in accord with my relative use of the two OSs). I don’t know if Linux’s command line is objectively more powerful than Windows with Cygwin, but it is where I learned and where I feel most comfortable applying those skills.
Negative space is an interesting way to look at the differences between UNIX and Windows cultures, but I just stumbled across this blog post (love your Twitter feeds, BTW!), and I’m having trouble applying it to the question of why UNIX tools haven’t taken hold on Windows.
Why try to answer it with just two data points? Grab some examples of operating systems that aren’t purely Windows or UNIX, and see how well a particular theory holds. Two examples from my past lives where UNIX tools struggled to get traction: AmigaDOS and VMS. On the other end of the distribution (so to speak), Mac OS X’s Mach/FreeBSD/proprietary blend managed to stay well connected to the UNIX fold, despite unequivocal divergences such as XML and binary plists. BeOS is somewhere in the same category. QNX managed to go from being a wholly un-UNIX-like RTOS to being a strong yet distinct hybrid. In BusyBox, I’ve noticed an extreme kind of uncanny valley that might be enlightening to take as an example.
I’d primarily argue that the failure of UNIX tool uptake on Windows stems from a kind of friction: The extra time and effort required to apply the tools in areas where there are no idioms for their usage, or worse, it eventually becomes apparent “you can’t get there from here”. Not cheaply enough, anyhow. As UNIX and Windows evolve (and grow) independently, there is an ongoing maintenance cost for keeping a UNIX-on-Windows ecosystem, one that might be too high, given today’s alternatives.
Speaking for myself: The early workstation VM environments, combined with shared disks and sysadmins paid to maintain standard PC’s, drew down my reliance on cygwin, which was mostly limited to my personal refuge within Emacs. In a virtual world, you can have almost any UNIX flavor you want, or conversely, just pay the Microsoft tax when you don’t have a better option.
One reason Unix tools haven’t taken off on Windows is that what they do isn’t as valuable on Windows. Unix has excellent text processing tools, but Windows doesn’t depend as much on text processing. On Unix, text processing lets you configure servers. On Windows text processing just lets you process text.
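The “text processing configures servers” point looks like the following sketch on Unix (the sshd Port directive is purely an illustration, and a copy is edited rather than a live config file):

```shell
# Unix services read plain-text config files, so ordinary text tools
# administer them. Create a toy copy of a config:
cat > /tmp/sshd_config.demo <<'EOF'
Port 22
PermitRootLogin no
EOF

# Change the listening port with a plain text substitution:
sed 's/^Port 22$/Port 2222/' /tmp/sshd_config.demo > /tmp/sshd_config.new

grep '^Port' /tmp/sshd_config.new   # prints "Port 2222"
```

On Windows the same setting would typically live in the registry or a binary store, so grep and sed give you no leverage over it, which is the asymmetry the comment describes.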
I assumed when you said Unix, you included OS X as it is Unix. Are there even any other consumer Unixes out there? I said Unix, not Unix-like.
This negative space (things not there to worry about) is what makes mechanical typewriters excellent for writing certain kinds of prose (like essays, memoirs, musings, etc.).