Software that gets used

I’ve been looking back at software projects that I either developed or managed, thinking about which projects produced software that is actively used and which didn’t. Here’s what the popular projects had in common. The software

  1. was developed to address existing needs, not speculation of future needs;
  2. solved a general problem; and
  3. was simple, often controversially simple.

The software used most often is a numerical library. It addresses general problems, but at the same time it is specialized to our unique needs. It has some functions you won’t find in other libraries, and it lacks some functions you’d expect to see but that we haven’t needed.

A couple of the more successful projects were re-writes of existing software that deleted maybe 90% of the original functionality. The remaining 10% was made easier to use and was tested thoroughly. In one project, no one missed the deleted functionality. In the other, users asked us to restore about 1% of what we had deleted.


Simplicity in old age

Quote from Julian Barnes:

There is something infinitely touching when an artist, in old age, takes on simplicity. The artist is saying: display and bravura are tricks for the young, and yes, showing off is part of ambition; but now that we are old, let us have the confidence to speak simply.

HT: Signal vs. Noise

More on simplicity

Simple legacy

Benoit Mandelbrot makes the following observation in The Fractal Geometry of Nature.

Many creative minds overrate their most baroque works, and underrate the simple ones. When history reverses such judgments, prolific writers come to be best remembered as authors of “lemmas,” of propositions they had felt “too simple” in themselves and had to be published solely as preludes to forgotten theorems.

If you’re not familiar with lemmas and theorems, think of a musician who is famous for a short prelude written as an introduction to a longer piece nobody remembers. For example, Rossini’s William Tell Overture is far more famous than the four-hour William Tell opera it introduces.

Returning to famous mathematicians, I remember as an undergraduate hearing of Schwarz’s lemma and waiting for the corresponding theorem that never came. The same applies to Poincaré’s lemma, Zorn’s lemma, and Fatou’s lemma.

We’re all naturally proud of things we work hard for. We expect other people to value our work in proportion to the amount of effort we put into it, but the world doesn’t work that way. It can be discouraging to focus on the big, complex projects we’ve worked on that haven’t been appreciated. On the other hand, it can be very encouraging to think of the potential impact of small projects and simple ideas.

Conflicting ideas of simplicity

Sometimes it’s simpler to compute things exactly than to use an approximation. When you have worked long enough on problems that cannot be computed exactly, you start to assume every problem falls into that category. I posted a tech report a few days ago about a problem in studying clinical trials that could be solved exactly even though it was commonly approximated by simulation.

This is another example of trying the simplest thing that might work. But it’s also an example of conflicting ideas of simplicity. It’s simpler, in a sense, to do what you’ve always done than to do something new.

It’s also an example of a conflict between a programmer’s idea of simplicity versus a user’s idea of simplicity. For this problem, the slower and less accurate code requires less work. It’s more straightforward and more likely to be correct. The exact solution takes less code but more thought, and I didn’t get it right the first time. But from a user’s perspective, having exact results is simpler in several ways: no need to specify a number of replications, no need to wait for results, no need to argue over what’s real and what’s simulation noise, etc. In this case I’m the programmer and the user so I feel the tug in both directions.
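To make the tradeoff concrete, here is a minimal sketch of the same tension on a stand-in problem (not the clinical trial problem from the tech report): a binomial tail probability computed exactly by a short sum, and the same quantity estimated by simulation.

```python
import math
import random

def tail_prob_exact(n, p, k):
    """Exact P(X >= k) for X ~ Binomial(n, p), summed from the pmf."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def tail_prob_sim(n, p, k, reps=100_000, seed=42):
    """The same probability estimated by simulation."""
    rng = random.Random(seed)
    hits = sum(sum(rng.random() < p for _ in range(n)) >= k
               for _ in range(reps))
    return hits / reps

exact = tail_prob_exact(20, 0.3, 10)   # one short sum, no replications
approx = tail_prob_sim(20, 0.3, 10)    # slower, and carries simulation noise
```

The simulation is the more obvious thing to write, but the exact version is simpler for the user: no replication count to choose, no waiting, no arguing over what’s real and what’s noise.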

Try the simplest thing that could possibly work

Classroom exercises always have nice, tidy solutions. So students implicitly assume that all problems have nice, tidy solutions. If the solution isn’t working out simply, you must have made a mistake.

Outside the classroom, applications seldom have simple solutions. So after a while you get jaded and quit trying to find a simple solution. But sometimes real problems do have simple solutions, or at least simpler solutions than seemed possible.

The Extreme Programming folks have a saying: “Try the simplest thing that could possibly work.” If that doesn’t work, then try the next simplest thing that could possibly work. That line of thinking has paid off a few times lately.

I’ve had a couple of math problems that I first assumed had to be approximated numerically that were more easily computed exactly. And I’ve had a couple of programs where I was able to debug a section of code by simply deleting it. Things don’t always work out that well, but it’s fun when they do.

A little simplicity goes a long way

Sometimes making a task just a little simpler can make a huge difference. Making something 5% easier might make you 20% more productive. Or 100% more productive.

To see how valuable a little simplification can be, turn it around and think about making things more complicated. A small increase in complexity might go unnoticed. But as complexity increases, your subjective perception of complexity increases even more. As you start to become stressed out, small increases in objective complexity produce big increases in perceived complexity. Eventually any further increase in complexity is fatal to creativity because it pushes you over your complexity limit.

[Graph: perceived complexity versus actual complexity]

Clay Shirky discusses how this applies to information overload. He points out that we can feel like the amount of information coming in has greatly increased when it actually hasn’t. He did a little experiment to quantify this. When he thought that the amount of spam he was receiving had doubled, he would find that it had actually increased by about 25%. Turning this around, you may be able to feel like you’ve cut your amount of spam in half by just filtering out 20% of it.

A small decrease in complexity can be a big relief if you’re under stress. It may be enough to make the difference between a frazzled mental state and a calm one (moving out of F-state into C-state). If you’re up against your maximum complexity, a small simplification could make the difference between a problem being solvable or unsolvable.

Small simplifications are often dismissed as unimportant when they’re evaluated in the small. Maybe a new term makes it possible to refer to an idea in three syllables rather than six. No big deal if it’s a term you don’t use much. But if it’s a term you use all the time, it makes a difference. That’s why every group has its own jargon.

Suppose one application takes five mouse clicks to do what another can do in three. Maybe that’s no big deal. But if you’re under stress, those two mouse clicks might make the difference between deciding a finishing touch is worthwhile versus not worthwhile.

Suppose one programming language takes five lines of code to do what another language can do in four lines. So what? How long does it take to type one line of code? But multiply that by 10. Maybe you see 40 lines of code on your laptop at once but you can’t see 50. Or multiply by 10 again. Maybe you can hold 400 lines of code in your head but you can’t hold 500. Language features dismissed as “syntactic sugar” can make a real difference.
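In Python, for instance, the difference might look like this (a toy example, not from the original post): the same computation written the long way and written with a list comprehension, the kind of feature often dismissed as syntactic sugar.

```python
# Five lines the long way:
squares = []
for i in range(10):
    if i % 2 == 0:
        squares.append(i * i)

# One line with a comprehension -- the saving is trivial here,
# but it compounds across a whole codebase:
squares_short = [i * i for i in range(10) if i % 2 == 0]
```

Either version is fine in isolation. The argument above is about what happens when the difference is multiplied by a hundred.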

When you’re stressed and feel like only a radical change will do any good, think again. A small simplification might be enough to give you some breathing room by pulling you back down below your complexity limit.

Related post: What happens when you add another teller?

Three quotes on simplicity

It’s easy to decide what you’re going to do.  The hard thing is deciding what you’re not going to do.
Michael Dell

Clutter kills WOW.
Tom Peters

Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius — and a lot of courage — to move in the opposite direction.
Albert Einstein

The simplest thing that might work

Ward Cunningham’s design advice is to try the simplest thing that might work. If that doesn’t work, try the next simplest thing that might work. Note the word “might.”

We all like simplicity in theory, and we may think we’re following Cunningham’s advice when we’re not. Instead, we try the simplest thing that we’re pretty sure will work. Solutions usually get more complex as they’re fleshed out, so we miss out on simple solutions by starting from an idea that is too complex to begin with.

Once you have a simple idea that might work, you have to protect it. Simple solutions are magnets for complexity. People immediately suggest “improvements.” As design guru Donald Norman says, “The hardest part of design … is keeping features out.”

Simple unit tests

After you’ve read a few books or articles on unit testing, the advice becomes repetitive. But today I heard someone who had a few new things to say. Gerard Meszaros made these points in an interview on the OOPSLA 2007 podcast, Episode 11.

Test code should be much simpler than production code for three reasons.

  1. Unit tests should not contain branching logic. Each test should test one path through the production code. If a unit test has branching logic, it’s doing too much, attempting to test more than one path.
  2. Unit tests are the safety net for changes to production code, but there is no such safety net for the tests themselves. Therefore tests should be written simply the first time rather than simplified later through refactoring.
  3. Unit tests are not subject to the same constraints as production code. They can be slow, and they only have to work in isolation. Brute force is more acceptable in tests than in production code.

(Meszaros made points 1 and 2 directly. Point 3 is my interpolation.)
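Point 1 might look like this in Python, with a hypothetical `shipping_cost` function standing in for production code: the production code has two paths, so it gets two straight-line tests rather than one test with branching.

```python
def shipping_cost(weight_kg):
    """Hypothetical production code with two paths:
    flat rate up to 1 kg, a per-kg charge above that."""
    if weight_kg <= 1.0:
        return 5.00
    return 5.00 + 2.50 * (weight_kg - 1.0)

# One straight-line test per path: no loops, no ifs in the tests.
def test_flat_rate_path():
    assert shipping_cost(0.5) == 5.00

def test_per_kg_path():
    assert shipping_cost(3.0) == 10.00

test_flat_rate_path()
test_per_kg_path()
```

If a test needed an `if` to decide what to assert, that would be a sign it was covering more than one path and should be split in two.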

A well-tested project will have at least as much test code as production code. The immediate conclusion too many people draw is that therefore unit testing doubles the cost of a project.  One reason this is not true is that test code is easier to write than production code for the reasons listed above. Or rather, test code can be easier to write, if the project uses test-driven development. Retrofitting tests to code that wasn’t designed to be testable is hard work indeed.