Appropriate scale

“Scale” became a popular buzzword a couple of decades ago. Suddenly everyone was talking about how things scale. At first the term was used to describe how software behaved as problems became larger or smaller. Then the term became more widely used to describe how businesses and other things handle growth.

Now when people say something “won’t scale” they mean that it won’t perform well as things get larger. “Scale” most often means “scale up.” But years ago the usage was more symmetric. For example, someone might have said that a software package didn’t scale well because it took too long to solve small problems, too long relative to the problem size. We seldom use “scale” to discuss scaling down, except possibly in the context of moving something to smaller electronic devices.

This asymmetric view of scaling can be harmful. For example, little companies model themselves after big companies because they hope to scale up. But running a small software business as a Microsoft in miniature is absurd. A small company’s procedures might not scale up well, but neither do a large company’s procedures scale down well.

I’ve been interested in the idea of appropriate scale lately, both professionally and personally.

I’ve realized that some of the software I’ve been using scales in a way that I don’t need it to scale. These applications scale up to handle problems I don’t have, but they’re overly complex for addressing the problems I do have. They scale up, but they don’t scale down. Or maybe they don’t scale up in the way I need them to.

I’m learning to make better use of fewer tools. This quote from Hugh MacLeod suggests that other people may come to the same point as they gain experience.

Actually, as the artist gets more into her thing, and gets more successful, the number of tools tends to go down.

On a more personal level, I think that much frustration in life comes from living at an inappropriate scale. Minimalism is gaining attention because minimalists are saying “Scale down!” while the rest of our culture is saying “Scale up!” Minimalists provide a valuable counterweight, but they can be a bit extreme. As Milton Glaser pointed out, less isn’t more, just enough is more. Instead of simply scaling up or down, we should find an appropriate scale.

How do you determine an appropriate scale? The following suggestion from Andrew Kern is a good starting point:

There is an appropriate scale to every human activity and it is the scale of personal responsibility.

Update: See the follow-up post Arrogant ignorance.

7 thoughts on “Appropriate scale”

  1. The idea of appropriate scale at different levels reminds me of A Pattern Language by Alexander et al.

  2. And this is why I like extensible tools, and I’m very wary of anyone who says extensibility is bloat. If something is extensible, it will exist happily at nearly any scale based on what you extend it with.

    Emacs is the classic example, of course, but compare a laptop to an iPhone. An iPhone may or may not be more minimal than a given laptop in terms of what each can do, but if you need to extend what you have in an odd (or just non-Apple-approved) direction, the laptop will be a far less painful base than the iPhone.
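    A toy sketch in Python of what I mean (hypothetical, nothing to do with Emacs internals): the base tool does almost nothing, and its scale is set entirely by which extensions you choose to load.

        # A minimal extensible core: capability comes from registered
        # extensions, so the same base scales up or down with what you load.
        class Core:
            def __init__(self):
                self.commands = {}

            def register(self, name, fn):
                # Extending = adding a command; scaling down = not loading it.
                self.commands[name] = fn

            def run(self, name, *args):
                return self.commands[name](*args)

        app = Core()                      # bare, minimal configuration
        app.register("upper", str.upper)  # one small extension
        print(app.run("upper", "scale"))  # -> SCALE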

  3. Hi John,

    I’m curious which tools you’re choosing to make up that smaller set.

    Personally, I consider myself a dabbler. On the positive side, I enjoy learning new tools, and that learning brings new ideas. On the negative side, it is not efficient; one can often solve a problem more quickly by knowing one’s tools inside out.

    I have to acknowledge that I tend to worry about whether the solution will ‘scale up’. Most of the time the worry never materializes, although there have been circumstances when I have needed to modify the implemented solution (meaning models and code) because we learned more about the problem.

    I struggle to get very excited about the whole minimalism thing, which seems very USA-centric. It’s as if people are discovering that, despite all their ‘stuff’, they are not very happy except superficially. The whole minimalism movement seems awfully superficial to me.

  4. I am really happy to see that a more experienced person has noticed the same thing I have. I discussed scaling down and networks in a recent blog post:

    http://turingcomplete.blogspot.com/2011/03/networks-and-unnecessary-complexity.html

    Apart from discussing the vast number of protocols for the exact same problem that offer no distinct advantages over one another, I stressed above all that, for the sake of universality, many algorithms used in corporate networking are also used in home routers, where a much simpler algorithm would suffice.

    In theory, this could be linked to circuit complexity, i.e. having different algorithms for different input sizes, although in real life you’d expect to partition input sizes into just a few categories.
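    To make that concrete, here is a hypothetical sketch in Python (mine, using sorting as a stand-in for the networking algorithms): one simple algorithm for the small-input category, and a divide-and-conquer one for everything else.

        # A size-adaptive sort: one algorithm per input-size category,
        # rather than a single universal algorithm for every scale.
        def insertion_sort(xs):
            # Simple and fast enough for small inputs.
            for i in range(1, len(xs)):
                key, j = xs[i], i - 1
                while j >= 0 and xs[j] > key:
                    xs[j + 1] = xs[j]
                    j -= 1
                xs[j + 1] = key
            return xs

        def adaptive_sort(xs):
            if len(xs) <= 32:               # the "home router" category
                return insertion_sort(list(xs))
            mid = len(xs) // 2              # the "corporate" category
            left = adaptive_sort(xs[:mid])
            right = adaptive_sort(xs[mid:])
            merged, i, j = [], 0, 0         # merge the sorted halves
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i]); i += 1
                else:
                    merged.append(right[j]); j += 1
            return merged + left[i:] + right[j:]

        print(adaptive_sort([5, 3, 8, 1]))  # -> [1, 3, 5, 8]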

  5. Hi John,
    I enjoy reading your articles word by word. I’ve always thought about “scaling down” issues but wasn’t sure of the right word for it.

  6. Thought about this again, in the context of “Arrogant Ignorance.” One thing: misjudging the appropriate scale. There may be many instances of this, but, to start with something concrete, consider computation and computational engines. In the world of big computation these days there appears to be a tension between truly large-scale distributed ways of calculating things and smarter, more involved computations run on powerful, perhaps multithreaded, uniprocessors. Not all algorithms are amenable to map-reduce decomposition. The trouble is that, in some quarters, an algorithm’s not being so amenable is a strike against it, even before the needed scale has been identified. To me, there’s a certain ugliness to using simple procedures on many processors to solve problems that could be handled on uniprocessors with just a little numerical sophistication. No doubt a rational apportionment of effort would depend on the relative costs of these resources, including people. Rational apportionments are seldom done, even (?) in private corporations.

    When it comes to solving truly hard problems, perhaps I’m old fashioned, but it seems to me y’need to solve an example of these at a smaller order using an algorithm on a uniprocessor, and then, once there’s confidence and knowledge in it, deal with scaling up to big data sizes across multiple processors. For one thing, numerical results at big scales aren’t reproducible: finite-precision arithmetic isn’t associative, so different orderings of the same algorithm’s parts give slightly different answers. This approach appears to be doubted.
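    A minimal illustration of that reproducibility point (my own sketch, in Python): summing the same numbers in two different orders, the way a multi-worker computation might, gives slightly different answers.

        import random

        # Floating-point addition is not associative, so the same numbers
        # summed in a different order can give a slightly different result.
        random.seed(0)
        xs = [random.uniform(-1e10, 1e10) for _ in range(100000)]

        one_order = sum(xs)
        # Simulate eight workers each summing a strided slice, then combining.
        another_order = sum(sum(xs[k::8]) for k in range(8))

        print(one_order == another_order)      # typically False
        print(abs(one_order - another_order))  # small but nonzero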
