To those who have posted responses to the essays: What I write here does not acknowledge what you’ve written. Apologies. It’s been an insanely busy week for me and I haven’t had time to read anything but the primary essays.
To David Gelernter
Yes, I should have included Lifestreams as a sterling example of a non-file approach to computing. It is also a fine example of designing computation around human experience. The length of time it takes for ideas like Lifestreams to come into use will be a measure of how bad our legacy problem really is. I hope very much that it will turn out that I was too pessimistic, and that the adoption of ideas like Lifestreams will not be excessively delayed by the crappy software that’s already laid down.
Thank you for stating the obvious about UNIX! For God’s sake, what’s with this Linux worship going on all over the place? The UNIX approach is this old, wretched snapshot of history. Every time some Slashdotter waxes lyrical about its virtues I feel even more strongly that software lock-in has caused idea lock-in.
Regarding specific points of disagreement: I would say that gasoline can be re-used if you smash the engine, just as dinosaur residue can be reused even though the dinosaurs and their ecosystems were smashed. Computer code loses its potency in a way that nothing else does.
Regarding “slaves”: The slaves I spoke of are the end users who engage in “tweakage denial.” We pretend to spend less time being humiliated by our machines than is actually spent.
Here are some recent examples, from the last two days, in fact. Two very prominent computer scientist friends, both founders of prominent computer science departments, deleted a half hour from their lives trying to figure out how to read email from an academic account on the East Coast. These guys have done this a lot, almost certainly more than you, dear reader, whoever you are, but the details keep changing. Then I got personal email not meant for my eyes, having nothing to do with me, forwarded to me by mistake. This was due to a server bug in the computer of an extremely famous technologist who has publicly disputed my contention that software doesn’t improve the way hardware does. I won’t use the names here, but you know who you are. Another example: Earlier this week I had dinner with the chief technologist of one of the biggest tech companies, and he was locked in combat for much of the evening with his phone/PDA gadget trying to pry some data out of it.
During the same period of time, your humble author was plagued by an apparent side effect of a software/security update from Apple, which screwed up my music editing system. I zapped the PRAM, reset the PMU, repaired permissions (damned UNIX!), initialized Open Firmware (FORTH!), and now I can only say I think I fixed it. Any of you doing sound editing on OS X will recognize these terms. They mean absolutely nothing but time wasted on tangled software despite what I am certain are good intentions from the folks over at Apple.
No one wants to admit how much time one spends on mandatory digital tweakage, so we deny it. There’s something, well, unmanly about admitting you sank a bunch of time on this kind of idiocy. So we pretend it’s only the other people who waste time this way. Go ahead, pretend you don’t do it, but you do. We all do, and we are the “slaves” while we tweak, no matter how much we deny it. (An acknowledged software engineer is not a slave at all, of course.)
To John Perry Barlow
The hobbyhorse I insist on riding, or beating, or whatever it is one does with that odd metaphor, is not so much my unhappiness with the command line, or the file, or any other particular thing, but the loss of self-awareness that seems to be taking place. If architecture is politics, then we ought to take notice when our ideas are locked in by software.
Whether these particular ideas are the important ones or not is another question. I am annoyed by the command line, it is true; but what I wish to point out is that the underlying process by which it lives or dies is different from those of other ideas or designs, because of the exaggerated legacy effect of software lock-in. When we fight over the design of digital infrastructure, the fight is extra-important because the cost to reverse a mistake rises over time to infinity.
You worry about intellectual property. Intellectual property as defined by law is tolerable. We know that because we’re all still here even though it’s been around. Intellectual property as it might someday be implemented in a network of rigid software structures would be awful and perhaps intolerable.
Hey, I’m still utopian; this essay just happened to be about other stuff. I don’t like talking about an emergent mind of mankind, because I don’t know how. But I do love talking about how the varieties of connection between people will expand. I have a whole rap about “Post-symbolic communication” that I’ve been presenting for more than twenty years, and that I still get excited about. (Briefly, this means that someday you’ll be able to improvise every aspect of a shared virtual world with other people, so you can make stuff instead of using symbols to talk about stuff. It will be a shared, waking-state, intentional dream.) Yes, I believe in a wonderful future filled with more adventure, challenge, and beauty than people can take in. And I believe technology will be an essential, enabling part of it.
So I join you in utopian reverie, but I’d like to make a comment about timeframe. I’m delighted by people who believe in the coming of the Messiah, the Apocalypse, Ragnarok, the emergent mind of the Net, or post-symbolic communication, but only so long as they don’t expect any of these things very soon. When someone expects any of that stuff within our lifetimes, there’s usually a problem. People get nutty, extremist. This is how I feel about some of the Singularity believers, for instance.
Just because I think we have a long haul ahead of us to fix digital systems, maybe to be measured in centuries, please don’t worry that I’ve lost my utopian cred.
To Glenn Reynolds
Good point about space travel. Actually my private theory about how we overcome the software legacy problem in the far future is that information systems which are spread out over deep space become desynchronized because of the long latencies involved. Finally, some of them untangle themselves from MIDI, Windows, command lines, and so on. Distance will provide not only survivability, but also a level of diversity that isn’t found in mere “Globalization.”
To Eric Raymond
Raymond claims my term ‘Antigora’ is just a replay of one of his terms, but then uses my term differently than I do! Such sensitivity about whether his accomplishments are recognized! It doesn’t detract from Raymond’s reputation that other people also have thoughts.
(He seems to be accusing me of ripping off his rants. If it matters, I was ranting about this stuff noisily from around 1980. My first rant about the problems of UNIX and the command line was in 1978. Many of the ideas in the essay appeared earlier, in places like this.)
I’ll respond to a couple of specific points:
The frequency of unviable fetuses is irrelevant to the argument. There are indeed deadly mutations, but the point is that there are so many survivable ones that are similar to one another. There’s a signal-to-noise ratio for the feedback loop of any adaptive system—just like for any signal. Biological systems have enough signal. This comes about because there’s a modicum of smoothness in the system. Small changes in causes create small changes in results often enough to allow adaptation. Simple, right? The claim on the table is that software as we know it does not have enough signal in the noise, because small changes lead to big changes too often.
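The smoothness argument above can be made concrete with a toy simulation. Everything below is invented for illustration (the function names, the genome, the landscapes are not from the essay): a greedy adaptive loop makes steady progress on a “smooth” fitness landscape, where small mutations cause small fitness changes, but gets nowhere on a “brittle” one, where any imperfect variant scores zero, the way one flipped bit can crash a program.

```python
import random

def hill_climb(fitness, genome, steps=2000, rng=None):
    """Greedy adaptation: keep a small random mutation only if it improves fitness."""
    rng = rng or random.Random(0)
    best = fitness(genome)
    for _ in range(steps):
        i = rng.randrange(len(genome))
        mutant = list(genome)
        mutant[i] += rng.choice([-1, 1])   # one small change
        if fitness(mutant) > best:
            genome, best = mutant, fitness(mutant)
    return best

target = [7, -3, 5, 0, 2]

def smooth(g):
    # Small changes in the genome cause small changes in fitness,
    # so the feedback loop carries usable signal.
    return -sum(abs(a - b) for a, b in zip(g, target))

def brittle(g):
    # Any single wrong gene zeroes everything out: no gradient to follow.
    return 1 if g == target else 0

start = [0, 0, 0, 0, 0]
print(hill_climb(smooth, start))   # climbs all the way to 0, a perfect match
print(hill_climb(brittle, start))  # stuck at 0; every mutation looks equally dead
```

The brittle landscape isn’t harder because its mutations are deadlier; it’s harder because nothing survivable resembles anything else, so the feedback loop has no signal to adapt on.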
While I love the free market system, and could not imagine that a weirdo like me would thrive in any other type of society, I have little sympathy for Panglossian free market fanaticism. Raymond’s analysis of seemingly every event that has occurred is that it was the result of competition in a free market, and that therefore the outcome was for the best.
He applies his one-size-fits-all theory to TCP. Wow. I remember when TCP/IP had rivals. The process by which it won was not purely libertarian. In the interest of defending the reputation of one of the few useful politicians we’ve had in our arena, I must remind everyone that there was this thing called the Gore Bill. Government stepped in to help bring order to our online communities for the mutual good. Of course there were many factors aside from that, but this idea that TCP/IP was out there in a price war is a poor telling of history. The point Raymond raises about how monopolies can die from overreach is sound, but in Microsoft’s case, I fear the code will survive the company’s fate, whatever it is.
Next up for Panglossian treatment is what I call “leaky” systems. He says the reason we don’t have them is that they lost out in the marketplace. That must be why book authors are suing Google and Sony has become a malicious PC cracker. Someone offered a solution and everyone thought it would be cheaper to suffer. In truth, a solution has not yet appeared. One of the reasons is that it will be technically difficult to create it.
Regarding the holy relics of the file and the command line: My point wasn’t about the text aspect of the command line, but about timing and interactivity, which are different issues entirely, with cognitive implications, and also not in any sense uniquely connected to virtual reality. You can’t play a violin with a command line. You also wouldn’t be advised to operate UNIX with a violin.
See my thoughts on the problems VR must still overcome. Motion sickness is probably well enough understood for prime-time, but there are plenty of other challenges. The point is that video game interactions are based on discrete interaction events instead of continuous engagement, in the way that sports, sex, music, language, eating, and all non-computer activities are continuous. I hope to see more profound levels of interaction in future games. Whoever tries to make that happen will have to battle against a bunch of entrenched code.
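The distinction between discrete interaction events and continuous engagement can be sketched in code. This is a toy contrast, with all names invented for illustration, not any real game engine’s API: in the first loop, meaning lives in separate on/off events; in the second, it lives in the shape of a dense signal over time, the way it does with a violin bow.

```python
def discrete_loop(events):
    """Typical game input: a stream of separate button events."""
    jumps = 0
    for event in events:
        if event == "BUTTON_A_DOWN":   # the press either happened or it didn't
            jumps += 1
    return jumps

def continuous_loop(samples, dt=0.01):
    """Violin-like input: a densely sampled signal where nuance lives in
    the whole trajectory (timing, pressure), not in on/off events."""
    energy = 0.0
    for pressure in samples:           # e.g. bow pressure sampled 100 times a second
        energy += pressure * dt        # every sample contributes
    return energy

print(discrete_loop(["BUTTON_A_DOWN", "BUTTON_B_DOWN", "BUTTON_A_DOWN"]))  # 2
print(continuous_loop([0.2, 0.5, 0.9, 0.5, 0.2]))  # about 0.023
```

In the discrete loop, reordering or delaying events by a few milliseconds changes nothing; in the continuous one, the entire curve is the message, which is why timing and interactivity are cognitive issues and not just interface cosmetics.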
I hope one of the multiple-universe hypotheses is true and there’s an alternate universe out there where some of the most ambitious, weird, early computer science projects actually happened. In one of those universes, Ted Nelson’s Xanadu exists as it was imagined early on, streaming between nodes, alongside my old dream of the Mandala operating system from 25 years ago, and Jef Raskin is making minimalist user interfaces that are widely used. I am confident that in all those universes all of us are arguing all the time. Raymond sees files as the only possible idea, but he is a creature of this universe, this narrow little place of ours that we’ve made a bit narrower.
About Jef Raskin: he was doing text-based work all along. He was never the grand old man of the GUI, but of UI minimalism. If there ever was a grand old man of graphics UI, maybe he was Ivan Sutherland (still youthful in every way that counts and productive, by the way) or Alan Kay (same). For those who don’t know, Jef passed away recently and he is sorely missed.
Raymond claims that what I dreamed up as a “farce” is to him a “hopeful” possibility. What I wrote up as farce was people abandoning the species to become Gods because of Wal-Mart.
I am curious whether a cultural gulf between Raymond and me is so profound that my words are impenetrable to him, or whether he just skimmed and made assumptions about what I said.
The main point for me is noticing the warmth and generosity of what’s happened with the net. The problem with capitalism is that it works too well and can distract people from noticing beautiful things. If you think of absolutely everything in creation, or even just in human affairs, as capitalism-in-action, then you live in an impoverished universe of your own reduction. I point out that the same is true for the spectacular economic successes of the new economy.