One variation on Moore’s law says that computer performance doubles every 18 months. Another says performance improves by a factor of 100 every decade. The two are equivalent, though the latter sounds more impressive.
Exponential growth doesn’t mesh well with human intuition. Scott Berkun said, “Technological progress is overestimated in the short term, underestimated in the long term.” This is probably not limited to technology but true of exponential growth in general.
(Why are the two statements of Moore’s law equivalent? With t measured in years, the first says performance grows like exp( log(2) t / 1.5 ) and the second says it grows like exp( log(100) t / 10 ). And log(2) / 1.5 = 0.462 while log(100) / 10 = 0.461, essentially the same growth rate.)
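As a quick sanity check, here is the arithmetic from the parenthetical above in a few lines of Python:

```python
from math import log

# Growth rate implied by "doubles every 18 months" (per year, natural log scale)
rate_doubling = log(2) / 1.5

# Growth rate implied by "100x every decade"
rate_hundredfold = log(100) / 10

print(round(rate_doubling, 3))     # 0.462
print(round(rate_hundredfold, 3))  # 0.461
```

The two rates agree to two decimal places, which is why the two statements of the law are interchangeable in practice.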
Related post: Moore’s law and software bloat