Virtual reality pioneer Jaron Lanier writes in his book You Are Not a Gadget about the lack of creativity in our use of computing power.
Let’s suppose that back in the 1980s I had said, “In a quarter century, when the digital revolution has made great progress and computer chips are millions of times faster than they are now, humanity will finally win the prize of being able to write a new encyclopedia and a new version of UNIX!” It would have sounded utterly pathetic.
The quote specifically alludes to Wikipedia and Linux, but Lanier is critical of web culture in general. I’m not sure what I think about his position, but at a minimum he provides a counterbalance to the people who speak about the web in messianic tones.
11 thoughts on “Underwhelmed with progress”
Actually, I think Wikipedia is a pretty great innovation, making information available to vastly more people.
I cannot imagine the discoveries in genetics ever having come about without high-speed computing. When I was in graduate school, the most common etiology for children with profound to severe disabilities was “unknown”. We have now identified the genes that cause many of those disabilities. You cannot imagine the relief on the face of a mother when I can tell her authoritatively, “You did NOT cause this. It was nothing you did during your pregnancy, during your life. Your child’s disorder is caused by a partial deletion of the nth chromosome. It’s sad, it’s unfortunate but it is not your fault.”
There have been great accomplishments. But, yes, there is also an incomprehensible amount of time devoted to YouTube videos proving just how stupid drunk people can be, searches on the latest celebrity addiction/breakup/pregnancy, and games for killing the greatest number of people/aliens/labradors in the shortest amount of time in the widest variety of ways. I admit I don’t get it.
Genetic research is a great example of computing power put to good use.
One thing that frustrates me is that sometimes it seems software development is running around in circles. Software bloat grows at least as fast as Moore’s Law. We keep solving the same problems, but the APIs change every few years. That’s not entirely true, but sometimes it seems that way.
I heard Jaron Lanier speak about his book at the Harvard Book Store in Harvard Square. I found his presentation and views bifurcated. Mr Lanier has a lot of good points to make about how the Web and Internet are going, consistent with Dr Cook’s observations about how uncreative many applications are. But upon questioning, Mr Lanier claims he loves the Web but only has objections to “Web 2.0 stuff”. Roughly speaking, that’s the Facebook/AJAX/Twitter/IM/SMS ball of wax, although it’s probably hard to discern clean boundaries.
Personally, having seen the Internet and computing grow up (I was in grad school in 1974), I find there’s an awful lot about it that I like. I mean, compared to almost any other “promise of the future” (space travel, undersea exploration, electric cars, widespread robotics), it came true. I have vastly more computing power in hand than I could ever have dreamed.
On the other hand, a lot of this is deeply disappointing. It may be the society and culture more than just the Internet, but I find I have far less time than I would like to really concentrate and understand a few things. I keep getting called in to do distracting jobs and errands at work and such. I find that, compared to, say, the 1980s, the generation of software engineers who grew up through the Internet bubble and during the advent of SAP/PeopleSoft and packaged software solutions is far less disciplined than it should be and fails to know some basic things: the behavior of floating point numbers, for instance, and when you really want fixed point instead, or how to fit a univariate line to points.
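For concreteness, here is a minimal sketch of the two basics mentioned above. The specific numbers and the helper function are illustrative, not taken from any real codebase: binary floating point cannot represent most decimal fractions exactly, and fitting a univariate line is a short closed-form least-squares computation.

```python
# Floating point: 0.1 and 0.2 have no exact binary representation, so a
# naive equality test fails; compare within a tolerance instead.
print(0.1 + 0.2 == 0.3)                 # False
print(abs((0.1 + 0.2) - 0.3) < 1e-12)   # True

# Fitting a line y = a + b*x to points by ordinary least squares.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Points lying exactly on y = 1 + 2x recover intercept 1 and slope 2.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```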
I don’t know why it is, but I have gotten less enchanted with the open source/free software movement over time. Part of that is that the availability of “good enough” free software has caused the population of good programmers to shrink, because the exiles can no longer make a living at it, unless they move to Thailand or some such place. Part of that is that a lot of free software is junky, and it tends to encourage its users to set lower standards for the software they themselves write. And, finally, part of it is that Internet applications are nearly always written as if they had to be done yesterday, and rapidly devolve into fighting and fixing today’s fires, repeated endlessly. The highest good seems to be delivering a solution in blindingly fast time; but, oh, well, yeah, it may crash from time to time, but doesn’t everything?
I hope Lanier’s book is better than his talk was. He talked me out of buying and reading it. He spent a lot of time describing how it was really Timothy Leary who convinced him he should write this book. Hmmmmm.
Jan: Parts of his book were very interesting, some parts not so much. I enjoyed his interview with Moira Gunn and later bought his book. He talked about some themes in the interview that I hoped he would develop in the book, but he didn’t. I imagine he’d be very interesting to talk to.
Hm, saying Wikipedia is just a new encyclopedia is sort of like saying a mobile phone is a new cordless phone. It is true, but it also kind of misses the point.
And as much as I admire – and rely on – Linux, it’s not really a good example. His statement really says that operating systems were a solved problem in the 1980s. The innovation of Linux isn’t what it is – yes, another Unix – but the process by which it is created, and the resulting capability and cost.
If you had said in the 1980s that you’d get a Unix that runs everywhere from inside children’s toys, to “a kind of cordless phone”, to servers, to the largest of supercomputers, and that anybody can get the system – not just a kernel, but drivers, development tools, documentation, userspace apps, the whole enchilada – for free, then that might capture the flavour a little better.
I forgot to add that if Jaron Lanier is right, and he may well be, then Ted Nelson was more right, and earlier, despite his faults in pursuit and implementation.
There is much I could say about this; instead I’ll put Lanier’s statements in perspective. By his logic the telephone was a mundane invention, since it did nothing other than let people talk to each other, something they had been doing for tens of thousands of years prior.
I have to admit I don’t understand why Lanier gets so much play. He’s always struck me as the sort who’s discovered that this contrarian shtick earns eyeballs and ears (or, please correct me, has he ever done anything remarkable since coining the term VR? Show me the code, man.). Frankly, I’ll take Alan Kay, among others, if we’re in need of public intellectuals worth listening to in computing. But they’re too busy actually trying to make a difference.
I’m not crazy about the messianic views of the web either. But there are *so* many people in the middle dissecting this thing in N different ways, looking at it through the lens of their own personal theories, without such glorified tones. Some kind of synthesis will emerge. The problem is that all of that is quite boring where the soundbite and media reporting is concerned.
I’m not entirely sure what Lanier is seeing when he looks at the Internet. How is the ability to process humongous datasets in milliseconds, the availability of pretty much all of human knowledge at your fingertips, and something that connects the entire world NOT a remarkable achievement?
I suppose Lanier believes it could be qualitatively different and not just big.
It *is* qualitatively different. Once you reach a certain size threshold, it is a qualitative jump. (The jump here is the rise of probabilistic algorithms, since deterministic ones often can’t handle the scale.)
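A concrete instance of the kind of probabilistic algorithm meant here (my illustration, not anything specific from the thread) is a Bloom filter: set membership in fixed memory, trading exactness for scale. Lookups may return rare false positives, but never false negatives.

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: fixed-size bit array, k hash positions per item."""

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8 + 1)

    def _positions(self, item):
        # Derive num_hashes positions from slices of one SHA-256 digest.
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.num_hashes):
            chunk = digest[4 * i: 4 * i + 4]
            yield int.from_bytes(chunk, "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
for word in ["wikipedia", "linux", "unix"]:
    bf.add(word)
print(bf.might_contain("linux"))   # True -- added items always hit
print(bf.might_contain("plan9"))   # almost certainly False
```

The deterministic alternative (an exact set) grows with the data; the filter's memory is fixed in advance, which is precisely the trade that makes web-scale problems tractable.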