The Lindy effect says that what’s been around the longest is likely to remain around the longest. It applies to creative artifacts, not living things. A puppy is likely to live longer than an elderly dog, but a book that has been in press for a century is likely to be in press for another century.
In a previous post I went into the mathematical details of the Lindy effect: power law distributions and so on. The key fact we need for this post is that if something has the kind of survival distribution described by the Lindy effect, then its expected future lifetime equals its current age. For example, the 100-year-old book in the opening paragraph is expected to remain in press for another 100 years.
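Concretely, the survival distribution behind this rule is a Pareto (power law) tail with exponent 2, and the equal-age claim can be checked with a quick simulation. This is just an illustrative sketch; the specific numbers (the 100-year age, the alpha = 2 Pareto model) are assumptions, not anything from the earlier post:

```python
import random

random.seed(42)
alpha = 2.0   # Pareto tail exponent; alpha = 2 gives "expected future life = current age"
age = 100.0   # current age, e.g. a book that has been in print for a century

# Conditioned on surviving to `age`, a Pareto lifetime is again Pareto
# with scale `age`, so we can sample survivors directly via the inverse CDF.
n = 1_000_000
remaining = [age / (1.0 - random.random()) ** (1.0 / alpha) - age for _ in range(n)]

mean_remaining = sum(remaining) / n
print(f"mean remaining lifetime: about {mean_remaining:.0f} years")
```

The sample mean hovers around 100, though it converges slowly: at alpha = 2 the distribution's variance is infinite, so occasional huge draws make the estimate jumpy.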
Note that this is all framed in terms of probability distributions. It’s not saying, for example, that everything new will go away soon. Everything was once new. Someone watching Hamlet on opening night would be right to speculate that nobody would care about Hamlet in a few years. But now that we know Hamlet has been around for four centuries and is going strong, the Lindy effect would predict that people will be performing Hamlet in the 25th century.
Note that Lindy takes nothing into account except survival to date. Someone might have been more bullish on Hamlet after opening night based on other information such as how well the play was received, but that’s beyond the scope of the Lindy effect.
If we apply the Lindy effect to programming languages, we only consider how long they’ve been around and whether they’re thriving today. You might think that Go, for example, will be around for a long time based on Google’s enormous influence, but the Lindy effect does not take such information into account.
So, if we assume the Lindy effect holds, here are the expected years when programming languages would die. (How exactly would you define the time of death for a programming language? Doesn’t really matter. This isn’t that precise or that serious.)
Language | Born | Expected death
---|---|---
Go | 2009 | 2025 |
C# | 2000 | 2034 |
Java | 1995 | 2039 |
Python | 1991 | 2043 |
Haskell | 1990 | 2044 |
C | 1972 | 2062 |
Lisp | 1959 | 2075 |
Fortran | 1957 | 2077 |
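The table above is just arithmetic: if expected remaining life equals current age, the expected death year is twice the current year minus the birth year. A minimal sketch (taking 2017 as "now," since that reproduces the table):

```python
# Under the Lindy rule, expected death year = now + (now - born) = 2*now - born.
born = {"Go": 2009, "C#": 2000, "Java": 1995, "Python": 1991,
        "Haskell": 1990, "C": 1972, "Lisp": 1959, "Fortran": 1957}

now = 2017  # assumed "current year," chosen to match the table
expected_death = {lang: 2 * now - year for lang, year in born.items()}

for lang, year in expected_death.items():
    print(f"{lang}: {year}")
```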
You can debate what it even means for a language to survive. For example, I’d consider Lisp to be alive and well if in the future people are programming Clojure but not Common Lisp, but others might disagree.
“We don’t know what language engineers will be coding in the year 2100. However, we do know that it will be called FORTRAN.” — C.A.R. Hoare
That’s a cute table, but as you point out in your other blog post, the Lindy effect doesn’t really say that a thing’s remaining life expectancy is equal to its age, as is commonly stated, but just that it is proportional to its age. The proportionality constant is equal to 1 when the alpha is 2. You’d have to demonstrate from the data that the alpha is near 2. I’d actually suspect the proportion is much larger than 1 (i.e., smaller alpha, thicker tail), as programming languages are difficult to kill thanks to the ease of storing information on the net (but I could be wrong).
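To make the comment above concrete: for a Pareto survival curve with tail exponent alpha, the expected remaining lifetime after surviving to age t is t / (alpha − 1), which is finite only for alpha > 1. A small sketch (the function name and the ages plugged in are illustrative):

```python
def expected_remaining(age, alpha):
    """Expected remaining lifetime under a Pareto tail with exponent alpha."""
    assert alpha > 1, "the conditional mean is infinite for alpha <= 1"
    return age / (alpha - 1)

print(expected_remaining(100, 2.0))  # 100.0: the "remaining life = age" rule
print(expected_remaining(100, 1.5))  # 200.0: thicker tail, longer expected life
```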
I suspect you’re right. And I think you’d get a different value of alpha depending on how you define a language to be “dead.”
Suppose you consider a language to be alive if people are writing and speaking about it. Then languages die quickly. But languages like C++ have gone underground, so to speak. I believe the number of C++ programmers has steadily grown, though the proportion of programmers writing C++ has declined.
Well very few people know Fortran, but there are also Fortran codes that may never change (I heard a story about some Fortran code that has been deemed “official” by NIST, so any other code, even if it computes the exact same thing, cannot be used). The same is true for COBOL.
I personally think of Lindy in relative terms. Fortran will likely outlive Java, since it is 3 times older, and Java will likely outlive Go as it is 2.75 times as old. Fortran is older than Lisp, but proportionally not by much (60 vs. 62 years), so you can’t really say much about which will live longer, statistically speaking. I suppose one could compute, for a given alpha, just how likely.
Agreed. I’d only take this seriously in ordinal terms, as you said: Java will likely outlive Go, etc. Even then, I wouldn’t take it too seriously.
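The ordinal comparisons in this exchange can in fact be computed. If two languages' remaining lifetimes are modeled as independent Pareto tails with the same exponent alpha, then given current ages a ≥ b, the probability that the older one dies later works out to 1 − (b/a)^alpha / 2. A sketch with illustrative ages (roughly Fortran vs. Lisp and Java vs. Go at the time of these comments):

```python
def prob_older_outlives(a, b, alpha=2.0):
    """P(the older of two Lindy-distributed things dies later), given ages a and b."""
    a, b = max(a, b), min(a, b)  # ensure a is the older
    return 1.0 - 0.5 * (b / a) ** alpha

# Fortran (~62 years old) vs. Lisp (~60): barely better than a coin flip.
print(round(prob_older_outlives(62, 60), 3))  # 0.532
# Java (~24) vs. Go (~10): a much clearer call.
print(round(prob_older_outlives(24, 10), 3))  # 0.913
```

Equal ages give exactly 1/2, as they should, and the edge grows with both the age ratio and alpha.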
The quote about FORTRAN is from C.A.R. Hoare.
Thanks, Patrick. That makes more sense.
Why didn’t COBOL, Pascal or BASIC make your list?
I picked languages that I considered to be “thriving,” and I don’t consider COBOL, Pascal, or BASIC to be thriving. I debated whether to include Fortran. I readily admit it’s all subjective. But I do think this much is true: All else being equal, languages that have been around longer are likely to be around longer into the future.
Stroustrup said there are two kinds of languages: the ones everyone complains about and the ones nobody uses. I imagine the dull, workhorse languages will outlive most of the languages that are hot on Hacker News.
Hoare is already wrong! I learned FORTRAN in college, but nowadays people program in this newer language they call Fortran. I can imagine a trend in the next 83 years where people will use all lower-case, thus programming in yet another language named fortran.
BASIC, particularly Visual Basic for Applications, is thriving and entrenched, particularly in Government and Financial Services. It’s one of the few IDEs installed (along with Office) on billions of PCs. It’s shocking how much day-to-day VBA uses syntax from BASIC 1.0, like GoSub, Goto, and line numbers.
I’m more interested in programmer working life expectancy.
I know a bunch of languages quite well, along with some major proven talents (including IoT), but finding interesting employment post-60 has been a brick wall.
I may need to take up teaching. (Those who can, do…)
About the C.A.R. Hoare quote:
He referred to year 2000, not 2100. Having learned Fortran IV in my first year of college, I checked out what was called Fortran around year 2000, finding the Fortran 2003 standard – and discovered that Hoare was right :-) First time I read a Fortran 2003 program I didn’t recognize it as Fortran at all; actually, I refused to believe it was Fortran until somebody forced the language definition down my throat.
One point that few young people are aware of today: The context of the quote. Hoare wrote an article in one of the ACM magazines (I guess it must have been either CACM or SIGPLAN Notices) in the late 1970s, starting out something like (I’m quoting from memory, so it may not be 100% exact): “This has been a depressing week. I have been reviewing the four selected proposals for the Fortran 77 language specification.” — He was extremely frustrated over all the crazy, non-Fortran-style, extensions people wanted to include in the new Fortran spec.
Well, don’t take it all literally. Hoare’s quote referred specifically to the year 2000.
Just a thought from a different perspective.
Older programming languages had to have what I’ll call classic design. There was no room for fluff, bells, and whistles that cater to current fashions. They were, of necessity, required to target as robust a solution as possible for as broad a range of problems as possible.
In a way, it feeds back on the original premise: they’ve been handling whatever has been thrown at them for a long time, and with some modification, continue to do so. The young upstarts (i.e., the latest trending languages) typically solve current needs without that look forward (and back) beyond their motivating design criteria.
So they fail and dry up early because they’re admirably filling a niche that is, unfortunately, transient.
Lindy does not take into account the possibility that, within a relatively short time, the programming languages used now may have to be replaced by languages designed for computers with a totally different processing model, perhaps not even binary.