It has always seemed odd to me that computer science refers to the number of operations necessary to execute an algorithm as the “complexity” of the algorithm. For example, an algorithm that takes O(N³) operations is more complex than an algorithm that takes O(N²) operations. The big-O analysis of algorithms is important, but “complexity” is the wrong name for it.
Software engineering is primarily concerned with mental complexity, the stress on your brain when working on a piece of code. Mental complexity might be unrelated to big-O complexity. They might even be antithetical: a slow, brute-force algorithm can be much easier to understand than a fast, subtle algorithm.
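To make that concrete, here is a toy comparison, using the maximum subarray sum problem purely as an illustration: the brute-force O(N²) version reads almost like the problem statement, while Kadane’s O(N) algorithm is faster but takes a moment of thought to see why it is correct.

    def max_subarray_brute_force(xs):
        """O(N^2): try every contiguous run and keep the best sum. Obvious but slow."""
        best = xs[0]
        for i in range(len(xs)):
            total = 0
            for j in range(i, len(xs)):
                total += xs[j]
                best = max(best, total)
        return best

    def max_subarray_kadane(xs):
        """O(N): Kadane's algorithm. Faster, but the invariant is subtler."""
        best = current = xs[0]
        for x in xs[1:]:
            current = max(x, current + x)  # best sum of a run ending at this element
            best = max(best, current)
        return best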
Of course mental complexity is subjective and hard to measure. The McCabe complexity metric, which counts the number of independent paths through a chunk of code, is easy to compute and surprisingly helpful, even though it’s a crude approximation to what you’d really like to measure. (SourceMonitor is a handy program for computing McCabe complexity.)
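If you want a feel for what the metric counts, here is a rough sketch in Python (my own approximation, not SourceMonitor’s exact rules): start with one straight-line path and add one for each branch point the parser finds.

    import ast
    import textwrap

    # Rough cyclomatic complexity: one straight-line path plus one per branch point.
    # This is a simplification of McCabe's definition, not SourceMonitor's rules.
    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

    def mccabe_estimate(source: str) -> int:
        tree = ast.parse(source)
        return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

    sample = textwrap.dedent("""
        def classify(x):
            if x < 0:
                return "negative"
            elif x == 0:
                return "zero"
            return "positive"
        """)
    print(mccabe_estimate(sample))  # 3: the if and the elif each add a path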
Kolmogorov complexity is another surrogate for mental complexity. The Kolmogorov complexity of a problem is the length of the shortest computer program that could solve that problem. Kolmogorov complexity is difficult to define precisely — are we talking Java programs? Lisp programs? Turing machines? — and practically impossible to compute, but it’s a useful concept.
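Since the true Kolmogorov complexity can’t be computed, one common stand-in is compressed length, which gives a crude upper bound on how short a description could be. A small sketch of the idea (my illustration, not part of the definition):

    import random
    import zlib

    def compressed_length(s: str) -> int:
        # Compressed size is a crude upper bound on how short a description of s could be.
        return len(zlib.compress(s.encode("utf-8")))

    regular = "ab" * 5000                # a short "program": repeat "ab" 5000 times
    random.seed(0)
    noisy = "".join(random.choice("ab") for _ in range(10000))  # same length, no pattern

    print(compressed_length(regular))    # small: the structure is easy to describe
    print(compressed_length(noisy))      # much larger: no short description exists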
I like to think of a variation on Kolmogorov complexity that is even harder to define: the length of the shortest program that “someone who programs like I do” could write to solve a problem. I started to describe this variation as “impractical,” but in some ways it’s very practical. It’s not practical to compute, but it’s a useful idealization to think about, as long as you don’t take it too seriously.