Last week I wrote a short commentary on James Hague’s blog post Organization skills beat algorithmic wizardry. This week that post got more traffic than my server could handle. I believe it struck a chord with experienced software developers who know that the challenges they face now are not like the challenges they prepared for in school.

Although I completely agree that “algorithmic wizardry” is over-rated in general, my personal experience has been a little different. My role on projects has frequently been to supply a little bit of algorithmic wizardry. I’ve often been asked to look into a program that is taking too long to run and been able to speed it up by an order of magnitude or two by improving a numerical algorithm. (See an example here.)

James Hague says that “rarely is there some … algorithm that casts a looming shadow over everything else.” I believe he is right, though I’ve been called into projects precisely on those rare occasions when an algorithm *does* cast a shadow over everything else.

Of course, algorithmic wizardry may also be a huge help in managing complexity — having the right algorithm handy often simplifies a task greatly.

Though I must admit that much of the optimization work I’ve done over the years has consisted more of finding existing algorithmic stupidity and eliminating it. Most recently I found an algorithm that was O(N^2) for silly historical reasons; since N < 4 was typical, this had never been noticed, but when we ran into an N = 1000 case it suddenly dominated performance completely.
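For illustration only — this is a hypothetical sketch, not the code from that project — accidental O(N^2) behavior often takes this shape: a linear scan hidden inside a loop, invisible at small N and dominant at large N.

```python
def dedup_quadratic(items):
    # Membership test on a list is O(N); inside an N-iteration
    # loop the total cost is O(N^2). Harmless for N < 4.
    seen = []
    out = []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedup_linear(items):
    # Same logic, but a set makes the membership test O(1) on
    # average, so the whole pass is O(N).
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both functions return the same result; the difference only shows up in running time once N gets large, which is exactly why such bugs survive until an unusual workload arrives.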

Back when I was in school, and interning for an oil and gas company, my boss told me that an intern the previous year had changed the algorithm on a job that ran all weekend to only run in a few hours. “He paid for his internship that day.”

My personal best is 10 hours to 16 minutes. I was told that I made the department look bad. I had initially tried the same approach as the previous developers, and then switched because it was hard to reason about, and testing took way too long. (The final version used threads and a couple of other tricks to speed things up, so it wasn’t all a change of algorithm.)

This post does illustrate a very good point which I think people missed. As engineers, we are trained to solve problems.

As we focus on the ‘how’, we often get stuck on bad problems and forget a very important piece: solving the right problem to start with!

Many success stories are not about having great problem-solving techniques and cracking the hardest problems, but about solving the right problem in a relatively straightforward manner that turns out to help a lot of people!

There is also the opposite approach of developing algorithmic wizardry to the point that a single wizard can solve problems that would otherwise require a whole team of non-wizard developers. This approach is particularly prized in the communities around APL and its derivatives J and K, but it is also present, at least as a myth but sometimes as a reality, in the Lisp community.

It’s ironic that a post about the unimportance of algorithmic wizardry brings down a server.

There is an analogous situation in other fields. When I was in school, I remember that doing the math and stats problems was the hard part. I got it into my head that I would be able to add value to my future employer by knowing more about integrals and eigenvalues and statistical distributions and optimization techniques.

Now I’m a finance guy and I find that the hard part is defining the problem. Most times, once the problem has been identified, it’s not especially hard to solve. If it is, then the next hard part is identifying an easy problem to solve that is a sufficiently close analogy to the actual problem. But none of this is ever as hard as figuring out what question it is that I or my colleagues are trying to answer in the first place.