ENTERTAINING THE FUTURE

Vol. 32 No. 1, February 1998
ACM SIGGRAPH



Beware of the QWERTY



Mike Milne
FrameStore



Since anniversaries are in the air, our thoughts naturally turn to what has happened in the past 25 years. I had intended to do just that: review the changes of the last two and a half decades in film effects technology (lots) and ask whether they had changed film content (not much). Then the plan was to lay out my vision of the next 25 years of computer graphics in the entertainment industry -- complete with virtual film studios, total performance capture, synthetic recreation of dead actors from computer analysis of archive footage, and all the rest of that familiar chorus of future-babble that echoes around the fringes of our industry.

After wrangling with that for a few days, I came to the conclusion that, once again, there was Something On My Mind that had little to do with crystal-ball gazing and a lot more to do with hands-on experience. "Hands on what?" you might ask. Well, that's the root of the matter, as it happens. The particular topic that I'd like to (metaphorically) chew on is that of the interface hardware: the bits that allow the computer to tell us what it's doing, and the bits that allow us to tell the computer what we'd like it to be doing instead.

A Diversion

But first, a little diversion -- four decades or so backwards in time should be about right.

Once, when I was a small boy on holiday with my parents in Rome, they decided to visit an old friend who was an art collector, journalist and critic. They were forced to take me along, through lack of anywhere suitable to park me for a few hours, and our host suggested I might like to look at his collection of antique typewriters to keep me amused for a while. Some hours later, when it was time to go and my parents came to find me, I could not be persuaded to leave -- I had found my perfect toy. This was one of the first typewriters, and it was a strange thing indeed.

The rear of the machine was similar to the more familiar machines of later years, with a roller for carrying the paper, and some arrangement of metal hammers for striking the paper with a piece of type through an ink-impregnated ribbon. But the strange part was the bit at the front -- what would now be called "the interface." There was only one key (which might have been labeled "press," or "enter," or maybe just "type" -- I don't remember now), and a long brass pointer that hung down vertically from an arm. On the flat base of the machine were engraved all the letters of the alphabet, with the usual assortment of figures and punctuation marks. The mode of operation was simple -- with one hand, the pointer was moved until its point lined up with the required letter on the base, and with the other hand the typing key was struck. Hey presto! The letter appeared on the paper, as if by magic! Of course, typing speed was a problem -- a proficient operator could only achieve about 10 or 15 words a minute.

Now why should this technologically primitive device have caught my imagination so thoroughly? After all, even in those long-gone days of my youth, I had seen what was then considered to be a "modern" typewriter -- in my own home. I had played with it many times, and had even written actual documents with it (albeit very short: "Dere Grannn, thNanks foR hte preSsent, loVe frM Me" would have been about the longest). No, what was fascinating about the antique machine was this: the letters were laid out in alphabetical order -- which I had previously learned at school, and which held no terrors for me. I knew where I was with the good old alphabet. Name me a letter, any one, and I could tell you what came after it -- and what came after that one, as well. This was in complete contrast to the extraordinary way in which the keys on the typewriter at home were arranged -- with the letters "QWERTYUIOP" on the top row, and the remainder arranged in no less puzzling an order. I mean, what on earth was a Qwerty? Most of my time at the keyboard was spent hunting for the right letter. I would even have to run into the room next door to ask an adult where to find the letter "K" (always a tricky one, that K) -- so it was no wonder that the older machine was better, as far as I was concerned -- I could actually type faster with it!

Another Diversion

Another diversion -- this time to evolutionary biology. (Yes, I know we're straying from the subject, but please indulge me. I'm sure it's all leading somewhere in the end!) In his book The Selfish Gene, zoologist Richard Dawkins first outlined the concept of "memes," which are, for the mind, what genes are for the body. A gene relies on the reproductive system of the body within which it is carried to effect its transmission to another generation; the meme relies on cultural transmission via the various sensory organs. A meme is, basically, an idea that can be passed from one human mind (or meme-nest) to another; the more frequently it makes the jump, the more widespread the meme becomes. Successful memes survive for many reasons: some are essential if their human hosts are to thrive (the use of fire, wearing clothes in a cold climate, the wheel) and hence get repeated as valuable lessons; some are simply attractive in their own right (phrases of music, jokes); and some have no discernible purpose but get duplicated anyway (wearing a baseball hat the wrong way round, body piercing).

The layout (or, if you like, meme) of the "QWERTY" keyboard has a historical reason for its existence and success, which people are fond of repeating at every opportunity -- and which I shall mention briefly (thus aiding the reproductive success of the "Qwerty history" meme) for those who have not heard it. When the first multi-keyed typewriters were produced, the mechanism was fairly primitive, and the hammers that typed the characters took an appreciable time to return after a printing stroke. If another key was struck immediately after the first, the new hammer on its way up to the platen would catch on the old one coming down, and the two would lock together. Unlocking them was a messy nuisance. In order to avoid this happening so often, an early manufacturer (was it Mr. Remington or Mr. Underwood? I don't know) hit upon the idea of measuring the frequency of letter pairs in the English language, and using that information to arrange the typewriter keys in such a way as to make it as difficult as possible to type them quickly. In other words, the keyboard layout is designed to be as inefficient as possible, in order to compensate for the shortcomings of the technology.
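As an aside, the letter-pair survey itself is easy to reproduce today. The sketch below is my own illustration in Python -- nothing of the sort existed when the tally would have been compiled by hand, and the sample sentence is merely a placeholder -- but it shows the kind of count involved: how often each adjacent pair of letters occurs in a piece of text.

    from collections import Counter

    def bigram_frequencies(text):
        """Tally how often each ordered pair of adjacent letters occurs."""
        counts = Counter()
        for word in text.upper().split():
            letters = [c for c in word if c.isalpha()]
            # Count adjacent pairs within each word only.
            counts.update(zip(letters, letters[1:]))
        return counts

    sample = "the quick brown fox jumps over the lazy dog"
    for pair, count in bigram_frequencies(sample).most_common(5):
        print("".join(pair), count)

Run over any sizeable sample of English, pairs such as "TH," "HE" and "ER" come out on top -- exactly the combinations that, on this account, the QWERTY layout was arranged to keep apart.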

Of course that technology has been superseded, and so we need not be burdened by that clumsy keyboard layout -- except that, of course, we are -- the QWERTY meme is now locked into our culture. The QWERTY machines were mass-produced, and many people learned to use them -- so well, in fact, as to negate the manufacturer's efforts to slow them down -- and so many people learned it that any aspiring typewriter manufacturer had to adopt the QWERTY layout if they wanted to sell machines. (As it happened, there was an early competitor to the QWERTY keyboard -- the SHRDLU layout. It survived by carving itself an evolutionary niche in the newspaper industry, where it became the standard layout for some typesetting machines. Often, journalists would file stories that were incomplete -- there might be names to verify, for instance. The typesetters would simply fill in the missing names by running their finger across the first few keys, and the offending line would later be replaced by a correct version. Of course, on many occasions the unchanged line made it into print, so that newspaper stories of the first half of this century were filled with reports of "Mr. Shrdlu, president of the Union Bank," or "Governor Smith's beautiful wife Shrdlu." The SHRDLU keyboard became extinct when computer-based typesetting, from the rival QWERTY camp, took over the industry. However, if you were to come across an old clipping with a SHRDLU typo in it, you would be holding in your hand the "fossil" of an extinct meme!)

Evolutionary theorists like Mr. Dawkins have adopted the QWERTY story as an analogy for the way in which biological anomalies get incorporated into living things -- like the fact that the "wiring" in our eyes appears to be back-to-front: the nerves come out of the front of the rods and cones, and loop round to the back -- a less than perfect arrangement. This sort of thing is known as a QWERTY phenomenon, or as Francis Crick (of DNA fame) calls it, a "frozen accident" -- and frozen accidents persist for the same reason that the QWERTY keyboard persists: sheer weight of numbers. The cost of replacing all those keyboards and retraining all those typists is now too great compared to the modest benefits that might result from a new keyboard layout.

What are our QWERTYs?

My concern is not with the biological analogies of QWERTY phenomena, however; I am more interested in social (or "memetic") QWERTY phenomena -- including the eponymous keyboard itself, since it impinges directly on our everyday lives in the computer graphics business. Are there other examples of QWERTY phenomena, either already established or perhaps starting to take root, in our industry? And if there are, is there anything we can do to nip them in the bud, and replace them with more useful or elegant designs? I suppose the first thing we have to do is to learn to recognise them; after all, these rogue memes all start by masquerading as improvements -- indeed, some are, in reality, improvements -- at least in the beginning.

My prime suspect for a QWERTY-in-the-bud is a by-product of cathode ray tube (CRT) technology. In our business, the CRT replaced the earlier teletype -- effectively a giant electric typewriter, with the keys wired via the computer -- and was undoubtedly superior to it in almost every way you could think of. So why, if it is a genuine improvement, do I think of it as undesirable? (Apart, that is, from the bulk, weight and general impracticality of the hardware itself.) Well, I think that the widespread use of the CRT has given birth to an unpleasant meme -- which I'll call the "Blinkered View" meme (blinkers were leather sidepieces attached to a horse's bridle to prevent sideways vision, so that the horse wasn't alarmed by other street traffic).

The limitations in resolution and size of CRTs meant that, while working at a computer, your vision was confined to a small area of the visual field -- somewhere around a tenth of your normal field of view. In the early days of computing, this was not a serious limitation, since most work involved editing short lines of text and watching streams of numbers. It's a different story now -- we use computers for all sorts of different applications -- but although the interface has undergone a few changes, essentially we're still limited to that one-tenth view. "Ah yes, but we have Windows now," you might object. Aye, there's the rub -- if the CRT gave birth to the Blinkered View meme, then the windows paradigm (itself an insidious meme) has been responsible for nurturing it to adulthood.

The idea of text windows that could be stacked and closed was an obvious improvement on the original, single-screen display. It seemed to open up the computer more, and allow us to have more virtual space to work in. Unfortunately, it also allowed another notion to slip in, unnoticed: the notion that this is a normal way to work.

The real-world analogues of closing and stacking windows might be, say, closing books, putting papers or drawings away in boxes or placing them in neat (or not-so-neat) piles. I maintain that this is not what you do while you're working; this is what you do when you've finished work. When you're working, you want everything visible at once. If you were writing in the bad old pre-computer days, you might be sitting at a desk with a typewriter on it. Next to it you might have a dictionary (open at the last place you looked, just in case you need to look again), a notebook (also open), an old shopping list with a couple of good ideas scrawled on it, a first draft with pencil lines all over it, maybe a magazine or two (open to interesting articles) and the previous pages of your text. If you had to keep all of those things in a heap, and only slide them out (one at a time) to look at them, then put them back before continuing work, you would consider that a major interruption to your normal working method. And to those who would argue that the windows method allows you to have many things visible on the screen at once, I would say no, it doesn't. It allows you to have one thing clearly visible, and a lot of other things half-hidden.


Similarly, an artist at an easel, or a graphic designer at a drawing board, traditionally works with all sorts of other material visible -- rough sketches, references, a photo of the family dog -- or whatever. Basically, we have a sophisticated visual system that allows us to take in a wide field of view and, with (extremely rapid) movements of the eyeballs, to focus quickly on any area of interest. Movements of the head, accomplished with an equally sophisticated arrangement of vertebrae, allow us to extend that visual field even further, with little conscious effort. Meanwhile, our beautifully engineered arms and hands can achieve extremely precise movements, anywhere in about 24 cubic feet of space. If we need to see a little more detail, we lean a little closer -- without thinking about it, or noticing that we're doing it.

Now contrast that with the current method of working for an artist on a computer. The head doesn't move, and the eyes move only slightly -- 15 or 20 degrees at the most. One arm and hand are motionless, while the other arm and hand move slightly (again, 15 or 20 degrees) to manipulate the mouse. If the artist wants to see detail, one portion of the screen is zoomed up, but the rest of the picture is hidden. Other references, if they are to remain on screen while working, are visible only as reduced images (and it's no good leaning closer -- there's no detail there).

Now to me, this working method is not an improvement; it's a restriction -- which is why I have called it the "Blinkered View." Quite literally, it seeks to limit our field of view, and eliminate lateral vision -- just like the original leather blinkers. Metaphorically, it is more dangerous -- because it has introduced the small-screen, small-hand-movement concept to many other applications.

It's not that I don't appreciate the advantages that come with the computerisation of the industry, and the massive possibilities that are on offer. I'm terrifically excited about the whole thing, and since I spend most of my waking hours in front of a computer (and sometimes some of my sleeping ones too), I guess that's a good thing. I'm simply pointing out that just because something is widespread, it doesn't mean that it's the best -- or even that it's desirable -- and that we should be on the lookout for any QWERTY-style memes that are burrowing into our cultural soft tissues, before it's too late to do anything about them.


Mike Milne is Director of Computer Animation at FrameStore, which, together with its sister company CFC, forms one of Europe's largest digital effects teams. Mike started out as an artist and beachcomber in the '60s, moved into graphic design in the '70s and finally to computer graphics in 1982. Sometimes he regards his career as one long, downhill slide.


Mike Milne
FrameStore
9 Noel Street
London W1V 4AL
United Kingdom

Tel: +44-171-208-2600
Fax: +44-171-208-2626


The copyright of articles and images printed remains with the author unless otherwise indicated.