Scaling up and down

There’s a worn-out analogy in software development that you cannot build a skyscraper the same way you build a dog house. The idea is that techniques that will work on a small scale will not work on a larger scale. You need more formality to build large software systems.

The analogy is always applied in one direction: up. It’s always an exhortation to use techniques appropriate for larger projects.

But the analogy works in the other direction as well: it’s inappropriate to build a dog house the same way you’d build a skyscraper. It would be possible to build a dog house the way you’d build a skyscraper, but it would be very expensive. Amateur carpentry methods don’t scale up, but professional construction methods don’t scale down economically.

Bias for over-engineering

There’s a bias toward over-engineering because it works, albeit inefficiently, whereas under-engineering does not. You can use a sledgehammer to do a hammer’s job. It’ll be clumsy, and you might hurt yourself, but it can work. And there are tasks where a hammer just won’t get the job done.

Another reason for the bias toward over-engineering is asymmetric risk. If an over-engineered approach fails, you’ll face less criticism than if a simpler approach fails. As the old saying goes, nobody ever got fired for buying IBM.

Context required

Simple solutions require context to appreciate. If you do something simple, you’re open to the criticism “But that won’t scale!” You have to defend your solution by explaining that it will scale far enough, and that it avoids costs associated with scaling further than necessary.

Suppose a group is debating whether to walk or drive to lunch. Someone advocating driving requires less context to make his point. He can simply say “Driving is faster than walking,” which is generally true. The burden is on the person advocating walking to explain why walking would actually be faster under the circumstances.

Writing prompt

I was using some database-like features in Emacs org-mode this morning and that’s what prompted me to write this post. I can just hear someone say “That won’t scale!” I often get this reaction when I write about a simple, low-tech way to do something on a small scale.

Using a text file as a database doesn’t scale. But I have 88 rows, so I think I’ll be OK. A relational database would be better for storing millions of records, but that’s not what I’m working on at the moment.
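For the sake of illustration, here’s roughly what a text-file “database” looks like in a few lines of Python. The file contents and column names here are made up; the point is that a linear scan is all the query engine you need at this scale.

```python
import csv
import io

# A tiny "database": a delimited text file small enough to reread
# in full on every query. Contents and columns are invented.
records = """name,year
emacs,1976
org-mode,2003
sqlite,2000"""

rows = list(csv.DictReader(io.StringIO(records)))

# A "query" is just a linear scan. O(n) per lookup, which is
# perfectly fine for 88 rows.
hits = [r for r in rows if int(r["year"]) > 1990]
```

With under a hundred rows, rereading the whole file on every query costs essentially nothing; the simplicity is the feature.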

Following an idea to its logical conclusion

Following an idea to its logical conclusion might be extrapolating a model beyond its valid range.

Suppose you have a football field with area A. If you make two parallel sides twice as long, then the area will be 2A. If you double the length of the sides again, the area will be 4A. Following this reasoning to its logical conclusion, you could double the length of the sides as many times as you wish, say 15 times, and each time the area doubles.

Except that’s not true. By the time you’ve doubled the length of the sides 15 times, you have a shape so big that it is far from being a rectangle. The fact that Earth is round matters a lot for a figure that big.

Euclidean geometry models our world really well for rectangles the size of a football field, or even rectangles the size of Kansas. But eventually it breaks down. If the top extends to the north pole, your rectangle becomes a spherical triangle.

The problem in this example isn’t logic; it’s geometry. If you double the length of the sides of a Euclidean rectangle 15 times, you do double the area 15 times. A football field is not exactly a Euclidean rectangle, though it’s close enough for all practical purposes. Even Kansas is a Euclidean rectangle for most practical purposes. But a figure on the surface of the earth with sides thousands of miles long is definitely not Euclidean.
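To put numbers on the breakdown, here’s a Python sketch, with invented patch dimensions, comparing the flat-map (Euclidean) area of a patch of the sphere with its exact spherical area. It uses the standard formula A = R² Δλ (sin φ₂ − sin φ₁) for the region between two parallels and two meridians.

```python
import math

R = 6371e3  # Earth's mean radius in meters

def spherical_patch_area(lat1, lat2, dlon):
    # Exact area of the region between parallels lat1 and lat2 and two
    # meridians dlon radians apart: A = R^2 * dlon * (sin lat2 - sin lat1).
    return R**2 * dlon * (math.sin(lat2) - math.sin(lat1))

def euclidean_area(lat1, lat2, dlon):
    # Flat-map approximation: a rectangle whose width is measured
    # along the lower parallel.
    return (R * (lat2 - lat1)) * (R * math.cos(lat1) * dlon)

def area_ratio(doublings, lat1=math.radians(30.0)):
    # Start with a roughly football-field-sized patch (~100 m by 50 m)
    # and double both sides the given number of times.
    dlat = (100.0 * 2**doublings) / R
    dlon = (50.0 * 2**doublings) / (R * math.cos(lat1))
    lat2 = lat1 + dlat
    return spherical_patch_area(lat1, lat2, dlon) / euclidean_area(lat1, lat2, dlon)
```

At zero doublings the two areas agree to a few parts per million; after fifteen doublings the flat-map answer is off by nearly 20%.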

Models are based on experience with data within some range. The surprising thing about Newtonian physics is not that it breaks down at a subatomic scale and at a cosmic scale. The surprising thing is that it is usually adequate for everything in between.

Most models do not scale up or down over anywhere near as many orders of magnitude as Euclidean geometry or Newtonian physics. If a dose-response curve, for example, is linear based on observations in the range of 10 to 100 milligrams, nobody in his right mind would expect the curve to remain linear for doses up to a kilogram. It wouldn’t be surprising to find out that linearity breaks down before you get to 200 milligrams.
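To make that concrete, here’s a toy Python sketch, with all parameters invented, of a saturating dose-response curve in Michaelis–Menten form: nearly linear over the observed 10–100 mg range, but nowhere near linear at a gram.

```python
# A hypothetical saturating dose-response curve:
# response = RMAX * dose / (K + dose). Parameters are invented;
# the curve is approximately linear while dose << K.
RMAX, K = 100.0, 1000.0  # K in milligrams

def response(dose_mg):
    return RMAX * dose_mg / (K + dose_mg)

# Slope of a straight line through the observed 10-100 mg range:
slope = (response(100) - response(10)) / (100 - 10)

# Extrapolating that line to a 1000 mg dose badly overshoots the curve:
linear_prediction = response(10) + slope * (1000 - 10)
```

The straight line fitted to the 10–100 mg range predicts a response of about 90 at 1000 mg, while the curve itself gives exactly 50.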

“Any sufficiently advanced logic is indistinguishable from stupidity.” — Alex Tabarrok

Customizing conventional wisdom

From Solitude and Leadership by William Deresiewicz:

I find for myself that my first thought is never my best thought. My first thought is always someone else’s; it’s always what I’ve already heard about the subject, always the conventional wisdom. It’s only by concentrating, sticking to the question, being patient, letting all the parts of my mind come into play, that I arrive at an original idea. By giving my brain a chance to make associations, draw connections, take me by surprise. And often even that idea doesn’t turn out to be very good. I need time to think about it, too, to make mistakes and recognize them, to make false starts and correct them, to outlast my impulses, to defeat my desire to declare the job done and move on to the next thing.

Conventional wisdom summarizes the experience of many people. As a result, it’s often a good starting point. But like a blurred photo, it has gone through a sort of averaging process, losing resolution along the way. It takes hard work to decide how, or even whether, conventional wisdom applies to your particular circumstances.

Bureaucracies are infuriating because they cannot deliberate on particulars the way Deresiewicz recommends. In order to scale up, they develop procedures that work well under common scenarios.

The context of Deresiewicz’s advice is a speech he gave at West Point. His audience will spend their careers in one of the largest and most bureaucratic organizations in the world. Deresiewicz is aware of this irony and gives advice for how to be a deep thinker while working within a bureaucracy.

Scalability and immediate appeal

Paul Graham argues that people take bad jobs for the same reasons they eat bad food. The advantages of both are immediately apparent: convenience and immediate satisfaction. The disadvantages take longer to realize. Bad jobs drag down your soul the way bad food drags down your body.

I first read Graham’s essay You Weren’t Meant to Have a Boss when he wrote it three years ago. I read it again this morning when I saw a link to it on Hacker News. I found his thesis less convincing this time around. But he makes two general points that I think I missed the first time.

  1. Watch out for things that are immediately appealing but harmful in the longer term.
  2. Watch out for being part of someone else’s scalability plans.

The first point is familiar advice, but worth being reminded of. The second point is more subtle.

Companies sell bad food for the same reason they offer bad jobs: it scales. It’s easy to create bland food and bland jobs on a large scale. Fresh food and creative jobs don’t scale so well.

When you choose to eat junk food, you more or less consciously choose convenience or immediate satisfaction over long-term benefit. But it may not be obvious when your range of options has been selected for scalability. For example, few students realize how much the educational system has been designed for the convenience of administrators. Being aware of an organization’s scalability needs can help you interact with it more intelligently.

Appropriate scale

“Scale” became a popular buzz word a couple decades ago. Suddenly everyone was talking about how things scale. At first the term was used to describe how software behaved as problems became larger or smaller. Then the term became more widely used to describe how businesses and other things handle growth.

Now when people say something “won’t scale” they mean that it won’t perform well as things get larger. “Scale” most often means “scale up.” But years ago the usage was more symmetric. For example, someone might have said that a software package didn’t scale well because it took too long to solve small problems, too long relative to the problem size. We seldom use “scale” to discuss scaling down, except possibly in the context of moving something to smaller electronic devices.

This asymmetric view of scaling can be harmful. For example, little companies model themselves after big companies because they hope to scale (up). But running a small software business as a Microsoft in miniature is absurd. A small company’s procedures might not scale up well, but neither do a large company’s procedures scale down well.

I’ve been interested in the idea of appropriate scale lately, both professionally and personally.

I’ve realized that some of the software I’ve been using scales in a way that I don’t need it to scale. These applications scale up to handle problems I don’t have, but they’re overly complex for addressing the problems I do have. They scale up, but they don’t scale down. Or maybe they don’t scale up in the way I need them to.

I’m learning to make better use of fewer tools. This quote from Hugh MacLeod suggests that other people may come to the same point as they gain experience.

Actually, as the artist gets more into her thing, and gets more successful, the number of tools tends to go down.

On a more personal level, I think that much frustration in life comes from living at an inappropriate scale. Minimalism is gaining attention because minimalists are saying “Scale down!” while the rest of our culture is saying “Scale up!” Minimalists provide a valuable counterweight, but they can be a bit extreme. As Milton Glaser pointed out, less isn’t more; just enough is more. Instead of simply scaling up or down, we should find an appropriate scale.

How do you determine an appropriate scale? The following suggestion from Andrew Kern is a good starting point:

There is an appropriate scale to every human activity and it is the scale of personal responsibility.

Update: See the follow-up post Arrogant ignorance.

Hanlon’s razor and corporations

Hanlon’s razor says

Never attribute to malice that which is adequately explained by stupidity.

At first it seems just an amusing little aphorism, something you might read on a bumper sticker, but I believe it’s profound. It’s a guide to understanding so much of the world. Here I’ll focus on what it says about corporations.

I hear a lot of complaints that corporations are evil. Sometimes corporations in general, but more often specific corporations like Apple, Google, or Microsoft. I don’t deny that large, powerful corporations have the potential to do harm. But many accusations of malice are mis-attributed frustrations with stupidity. As Grey’s law says, any sufficiently advanced incompetence is indistinguishable from malice.

Corporations aren’t evil; they’re stupid. Not stupid in general, but in a specific way: they don’t handle edge cases well.

Organizations scale by creating procedures to replace human judgment. This is mostly a good thing. For example, electronic devices are affordable in part because companies can hire unskilled teenagers rather than electrical engineers to sell them. But if you have a question or problem that’s off the beaten path, you’re out of luck. Many complaints about evil corporations come from outliers, the 1% that corporations strategically decide to ignore. It’s not that the concerns of the outliers are not legitimate; it’s that they are not profitable to satisfy. When some people say that a corporation is evil, they should just say that they are outside the company’s market.

Large organizations have similar problems internally. Policies written to handle the most common situations don’t handle edge cases well. For example, an HR department told me that my baby girl couldn’t be added to my insurance because she wasn’t born in a hospital. Fortunately I was able to argue with enough people to resolve the problem despite her falling outside the usual procedures. It’s harder to deal with corporate rigidity as an employee than as a customer because it’s harder to change jobs than to change brands.

Stupidity scales

I’m fed up with conversations that end something like this.

Yes, that would be the smart thing to do, but it won’t scale. The stupid approach is better because it scales.

We can’t treat people like people because that doesn’t scale well.

We can’t use common sense because it doesn’t fit on a form.

We can’t use a simple approach to solve the problem in front of us unless the same approach would also work on a problem 100x larger that we may never have.

If the smart thing to do doesn’t scale, maybe we shouldn’t scale.

Little programs versus big programs

From You Are Not a Gadget:

Little programs are delightful to write in isolation, but the process of maintaining large-scale software is always miserable. … Technologists wish every program behaved like a brand-new, playful little program, and will use any available psychological strategy to avoid thinking about computers realistically.

Organizational scar tissue

Here’s a quote from Jason Fried I found recently.

Policies are organizational scar tissue. They are codified overreactions to unlikely-to-happen-again situations.

Of course that’s not always true, but quite often it is. Policies can be a way of fighting the last war, defending the Maginot Line.

The entrance to Ouvrage Schoenenbourg along the Maginot Line in Alsace, public domain image from Wikipedia

When you see a stupid policy, don’t assume a stupid person created it. It may have been the decision of a very intelligent person. It probably sounded like a good idea at the time given the motivating circumstances. Maybe it was a good idea at the time. But the letter lives on after the spirit dies. You can make a game out of this. When you run into a stupid policy, try to imagine circumstances that would have motivated an intelligent person to make such a policy. The more stupid the policy, the more challenging the game.

Large organizations will accumulate stupid policies like scar tissue over time. It’s inevitable. Common sense doesn’t scale well.

The scar tissue metaphor reminds me of Michael Nielsen’s metaphor of organizational immune systems. Nielsen points to organizational immune systems as one factor in the decline of newspapers. The defense mechanisms that allowed newspapers to thrive in the past are making it difficult for them to survive now.

Computer processes, human processes, and scalability

Jeff Atwood had a good post recently about database normalization and denormalization. A secondary theme of his post is scalability: how well software performs as inputs increase. A lot of software developers worry too much about scalability, or they worry about the wrong kind of scalability.

In my career, scalability of computer processes has usually not been the biggest problem, even though I’ve done a lot of scientific computing. I’ve more often run into problems with the scalability of human processes. When I use the phrase “this isn’t going to scale,” I usually mean something like “You’re not going to be able to remember all that” or “We’re going to go crazy if we do a few more projects this way.”