Computer processes, human processes, and scalability

Jeff Atwood had a good post recently about database normalization and denormalization. A secondary theme of his post is scalability: how well software performs as inputs grow. A lot of software developers worry too much about scalability, or they worry about the wrong kind of scalability.

In my career, scalability of computer processes has usually not been the biggest problem, even though I’ve done a lot of scientific computing. I’ve more often run into problems with the scalability of human processes. When I use the phrase “this isn’t going to scale,” I usually mean something like “You’re not going to be able to remember all that” or “We’re going to go crazy if we do a few more projects this way.” 

One thought on “Computer processes, human processes, and scalability”

  1. A lot of HPC codes can exhibit near-linear scalability if the computational tasks are sufficiently fine-grained and can be multi-threaded. But not always. Note that the Sandia representation of latency (the bathtub curve) is analogous to Brooks’ law for the “scalability of human processes.” :)

    As a mathematician, you might find it interesting that scalability can be quantified; see “A General Theory of Computational Scalability Based on Rational Functions.” This unifying model subsumes Amdahl’s law, Gustafson’s law, and the retrograde effects seen at Sandia (as well as elsewhere).
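    A minimal sketch of what such a rational-function model looks like, using Gunther's Universal Scalability Law as one concrete instance of the kind of model the comment describes (the sigma and kappa values below are illustrative, not taken from any measured system):

    ```python
    def speedup(n, sigma, kappa):
        """Relative capacity at n processors under a rational-function model.

        sigma: contention (serial fraction), kappa: coherency delay.
        With kappa = 0 the model reduces to Amdahl's law; with kappa > 0
        throughput eventually *decreases* as n grows -- the retrograde
        effect mentioned above.
        """
        return n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

    # Amdahl-style curve: monotone, approaching the 1/sigma = 20x ceiling.
    amdahl = [speedup(n, 0.05, 0.0) for n in (1, 16, 64)]

    # Retrograde curve: peaks near n = sqrt((1 - sigma)/kappa) ~ 22,
    # then declines as coherency costs dominate.
    retrograde = [speedup(n, 0.05, 0.002) for n in (1, 16, 64)]
    ```

    The point of the rational-function form is that a single family of curves covers both the optimistic (monotone) and pessimistic (retrograde) regimes, depending only on the two parameters.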
