
our programs. Each time we depressed a key, the teletype would bash out a
letter on the paper in front of us, so we could read what we'd typed; but at
the same time it would convert the letter into a set of eight binary digits, or
bits, and punch a corresponding pattern of holes across the width of a paper
tape. The tiny disks of paper knocked out of the tape would flutter down into
the clear plastic hopper, which would slowly fill up with what can only be
described as actual bits. On the last day of the school year, the smartest kid
in the class (not me) jumped out from behind his desk and flung several
quarts of these bits over the head of our teacher, like confetti, as a sort of
semi-affectionate practical joke. The image of this man sitting there, gripped
in the opening stages of an atavistic fight-or-flight reaction, with millions of
bits (megabytes) sifting down out of his hair and into his nostrils and mouth,
his face gradually turning purple as he built up to an explosion, is the single
most memorable scene from my formal education.
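For readers who want the encoding step made concrete, here is a minimal sketch, in modern Python, of what the teletype was doing with each keystroke. The choice of seven ASCII bits plus an even-parity bit to fill out the eight is my assumption, for illustration only; the machine's actual character code may have differed.

    def punch_row(ch: str) -> str:
        """One tape row per character: 7 ASCII bits plus an even-parity bit (an assumption)."""
        seven = format(ord(ch), "07b")        # the character's 7-bit ASCII code
        parity = str(seven.count("1") % 2)    # even parity fills out the eighth bit
        bits = seven + parity
        # 'O' marks a punched hole (a 1 bit), '.' an unpunched position
        return bits.replace("1", "O").replace("0", ".")

    for ch in "RUN":
        print(ch, punch_row(ch))

Each printed row stands for one rank of holes across the width of the tape--and for one more pinch of confetti in the hopper.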
Anyway, it will have been obvious that my interaction with the computer was
of an extremely formal nature, being sharply divided up into different
phases, viz.: (1) sitting at home with paper and pencil, miles and miles from
any computer, I would think very, very hard about what I wanted the
computer to do, and translate my intentions into a computer language--a
series of alphanumeric symbols on a page. (2) I would carry this across a
sort of informational cordon sanitaire (three miles of snowdrifts) to school
and type those letters into a machine--not a computer--which would convert
the symbols into binary numbers and record them visibly on a tape. (3)
Then, through the rubber-cup modem, I would cause those numbers to be
sent to the university mainframe, which would (4) do arithmetic on them
and send different numbers back to the teletype. (5) The teletype would
convert these numbers back into letters and hammer them out on a page
and (6) I, watching, would construe the letters as meaningful symbols.
The division of responsibilities implied by all of this is admirably clean:
computers do arithmetic on bits of information. Humans construe the bits as
meaningful symbols. But this distinction is now being blurred, or at least
complicated, by the advent of modern operating systems that use, and
frequently abuse, the power of metaphor to make computers accessible to a
larger audience. Along the way--possibly because of those metaphors, which
make an operating system a sort of work of art--people start to get
emotional, and grow attached to pieces of software in the way that my
friend's dad did to his MGB.
People who have only interacted with computers through graphical user
interfaces like the MacOS or Windows--which is to say, almost everyone who
has ever used a computer--may have been startled, or at least bemused, to
hear about the telegraph machine that I used to communicate with a
computer in 1973. But there was, and is, a good reason for using this
particular kind of technology. Human beings have various ways of
communicating to each other, such as music, art, dance, and facial
expressions, but some of these are more amenable than others to being
expressed as strings of symbols. Written language is the easiest of all,
because, of course, it consists of strings of symbols to begin with. If the
symbols happen to belong to a phonetic alphabet (as opposed to, say,
ideograms), converting them into bits is a trivial procedure, and one that
was nailed, technologically, in the early nineteenth century, with the
introduction of Morse code and other forms of telegraphy.
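Just how trivial the conversion is can be sketched with a fragment of International Morse code (the handful of letters in the table below is arbitrary, chosen purely for illustration):

    MORSE = {
        "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
        "H": "....", "L": ".-..", "O": "---", "S": "...", "T": "-",
    }

    def to_morse(text: str) -> str:
        """Encode letters as dots and dashes; one space between letters, ' / ' between words."""
        words = text.upper().split()
        return " / ".join(
            " ".join(MORSE[ch] for ch in word if ch in MORSE) for word in words
        )

    print(to_morse("hello"))    # -> .... . .-.. .-.. ---

Run on the word "hello" this yields ".... . .-.. .-.. ---": a string built from two symbols, which is to say, for all practical purposes, bits.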
We had a human/computer interface a hundred years before we had
computers. When computers came into being around the time of the Second
World War, humans, quite naturally, communicated with them by simply
grafting them on to the already-existing technologies for translating letters