COMPUTER GAMING

Vol. 32 No. 2, May 1998
ACM SIGGRAPH



The Attack of the Autistic Peripherals



Dave Taylor
Crack dot Com

In the computer games industry, where the competition is arguably even fiercer than in the hardware industry, you need every ounce of innovative gee-wizardry you can muster to differentiate your product and get it noticed in the marketplace.

Your eyes dominate your impression of a new game, followed by your ears. So in order to create a first-order difference from the competition, you need to provide a radically innovative graphics presentation. Blockbuster games like Myst and Doom showed precisely what innovative, gorgeous graphics can do for a title.

While salivating over the display of any 3D game running on a Voodoo II graphics card in March of 1998, it is easy to forget why the computer was such a marvelous invention. The real innovation of the computer was that it was a general-purpose processing machine -- as opposed to a multiplier or a machine that only computes logarithms. But with millions of lovely colorful triangles flitting about the glowing monitor, this $250 graphics card can quickly convince the weak-minded that they are seeing a deity instead of a monster hack.

Unfortunately, it's these religious moments, largely communicated to us through our manipulable eyeballs -- one of our most sensitive high-resolution input organs -- that are bringing technological innovation in today's games to a crawl, even as the frame rates go through the roof.

Manufacturers lose sight of this, and instead of creating powerful new general-purpose computers and peripherals, they create hardware specifically targeted to solve one task, such as rendering triangles. Then, when the competition forces them to differentiate their product and they realize they can only make the silicon go so fast, creeping "featurism" turns into a bloated sac of oozing interconnected computational units, otherwise known as a "graphics pipeline."

We all talk about a graphics pipeline as if it's a lovely, elegant thing that can be abstracted to software or hardware at any layer. But if you close your textbooks for a moment and take a look at real software, you see a graphics pipeline for what it is: a graphics pipe-dream!

The graphics pipeline represents the most compute-intensive stage of any game. The 3D video card rationale is, "Get that rendering off my CPU!" This is generally seen as a noble rationale because it's easy to justify with Amdahl's Law, but I've learned through painful experience that this answer is wrong.
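
For reference, Amdahl's Law says the payoff of speeding up one stage is capped by the fraction of time you spend in it. Here is a minimal sketch in C of why offloading rendering looks so good on paper; the fractions are made up for illustration:

    #include <stdio.h>

    /* Amdahl's Law: speed up a fraction p of the work by a factor s,
     * and the overall speedup is 1 / ((1 - p) + p / s).
     * Both numbers below are invented for illustration. */
    int main(void)
    {
        double p = 0.5;   /* assume half the frame time is rasterization */
        double s = 10.0;  /* assume the card rasterizes 10x faster */
        printf("overall speedup: %.2fx\n", 1.0 / ((1.0 - p) + p / s));
        return 0;         /* prints about 1.82x -- great, on paper */
    }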

Useful gameplay code is constantly interrupting the graphics pipeline, forcing you either to duplicate work done in the pipeline or to complicate the pure pipeline with game code. Worst of all is the lack of any consistency in implementations. The differences are so drastic that you are forced to implement a software rasterizer, just in case someone owns a 3D video card your game doesn't run acceptably on.
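
To make the duplication concrete, here is a hypothetical sketch in C: the card applies a transform when it draws, but because its results can't be read back, a line-of-sight or hit test ends up repeating the same math on the CPU:

    #include <stdio.h>

    typedef struct { float x, y, z; } vec3;

    /* Hypothetical: the card applies this same matrix when it draws,
     * but since we can't read its results back, game code such as hit
     * testing repeats the transform on the CPU.
     * Column-major 4x4 matrix times a point, w = 1. */
    static vec3 transform_point(const float m[16], vec3 v)
    {
        vec3 r;
        r.x = m[0] * v.x + m[4] * v.y + m[8]  * v.z + m[12];
        r.y = m[1] * v.x + m[5] * v.y + m[9]  * v.z + m[13];
        r.z = m[2] * v.x + m[6] * v.y + m[10] * v.z + m[14];
        return r;
    }

    int main(void)
    {
        /* identity matrix with a translation of (1, 2, 3) */
        float m[16] = { 1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  1, 2, 3, 1 };
        vec3 p = transform_point(m, (vec3){ 0, 0, 0 });
        printf("%g %g %g\n", p.x, p.y, p.z);   /* prints 1 2 3 */
        return 0;
    }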

They say history repeats itself, but until I started approaching the dreaded age of 30, I never believed in this "goofy" grown-up philosophy. After all, how does the Internet repeat itself? Could there ever really be another Apple Macintosh? Isn't it just about old folks trying to scare up analogies between something they don't understand and tin cans connected with string?

Iím a believer in repeating history now.

It happened in the early 1990s right under my nose, and it's happening now in the late 1990s. I was on the front lines of both and can tell you that it's a horrifying experience. This historical loop, for lack of a better B-title, was "The Attack of the Autistic Peripherals."

Once upon a time, about a decade before the coming of the third millennium AD, there was an ancient operating system named "DOS." It had no sound driver, and because of this, all who developed in DOS heard a mighty sucking sound that was their development money getting blown on supporting various sound cards, from the Sound Blaster to the Gravis UltraSound to the AWE32 to the Media Vision to the Ensoniq Sound System.

Fortunately, there was no such sucking sound in the realm of hard drive controllers or network cards. They just worked. Today, drive controllers and network cards are mostly boring. There aren't any new features that blow your mind. The only big advances left are faster and faster controllers that up the bandwidth through parallelism and latency-hiding, along with improvements in ease of installation.

But do users dislike their hard drive controllers and network cards because they're boring? Not at all. In fact, users can depend on those devices more than they can depend on any 3D video accelerator card or sound card. Users like boring when it comes to peripherals. Once it's boring, it's comfortable. You trust it because you know the hardware is trying to toast a piece of bread faster and more evenly instead of shredding your bread, toasting the crumbs and coating them with a thin lacquer in a plaid pattern.

Witness the attack of the autistic sound cards. One claimed to be the standard because it was backwards compatible with old code and other cards were compatible with it. Only it wasn't and they weren't. Another claimed to have cleaner sound, only it required some special-case code and not as many people had one. Another would mix the sounds in its own RAM for you, and its users were loud and obnoxious, and the card was a mighty hack and not worth supporting, despite the collective fanaticism of the users. And another tried to emulate this card, only it was special because it trademarked a new term for the sounds it mixed for you. And another was made by a staggeringly influential company but didn't even have an interrupt to signal when the card was done reading a DMA buffer.

Oh, but that was just the start. These sound cards could all play MIDI music! To explain -- you could not play a high-quality MOD audio file in real time because it cost too many CPU cycles, and CD-ROM drives were still expensive, so you couldn't just stream recorded music off floppy disks or hard drives either. So you wrote MIDI music, and MIDI music sounded terrible on almost everything except this one card which no one could afford, and even it had annoying limitations.

Is this ringing any bells yet?

So what did game developers do about this? We all got sick of the multiple standards, basically asked that you have a plain old sound card to output sound with, and then we did the mixing ourselves. With the advent of CD-ROMs, we dumped MIDI music in favor of that same sound card, which we now stream our music out to. We simplified. We wanted a dumb peripheral and better general-purpose computing power, so that we would have exact, precise control over our code and our games.
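
Doing the mixing ourselves was no rocket science, which was exactly the point. Here is a minimal sketch in C, assuming a plain old card that plays back one 16-bit mono buffer; the voice bookkeeping and all names are made up for illustration:

    #include <limits.h>

    #define VOICES  8
    #define FRAME   1024              /* samples per mixing frame */

    short  mixbuf[FRAME];             /* what we hand to the card's DMA */
    short *voice[VOICES];             /* per-voice sample streams */
    int    voice_len[VOICES];         /* samples remaining per voice */

    /* Sum the active voices into the output buffer, clipping instead
     * of wrapping when the sum overflows 16 bits. */
    void mix_frame(void)
    {
        for (int i = 0; i < FRAME; i++) {
            long acc = 0;
            for (int v = 0; v < VOICES; v++)
                if (voice[v] && i < voice_len[v])
                    acc += voice[v][i];
            if (acc > SHRT_MAX) acc = SHRT_MAX;
            if (acc < SHRT_MIN) acc = SHRT_MIN;
            mixbuf[i] = (short)acc;
        }
    }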

So if history repeats itself, what is going to happen with this 3D card silliness? Some have alpha-blending. Some don't. Some have Z-buffers. Some don't. Some have ultra-modifiable rasterization pipelines. Some don't. Some support edge setup. Some don't. Some support transformations. Most don't. Most don't allow you to manage texture memory. They all seem to have different texture formats. Some don't like the CPU touching the frame buffer at the wrong time and also aren't too great at reading frame buffer memory.
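
Here is roughly what that feature roulette does to game code -- a hypothetical capability structure in C, with a fallback path for every hole in the matrix (the real ones are worse):

    /* Hypothetical capability bits, queried once per card. Every hole
     * in the feature matrix grows another fallback path in the
     * renderer. */
    struct caps {
        int has_alpha_blend;
        int has_zbuffer;
        int has_hw_transform;
        int cpu_can_read_framebuffer;
    };

    void draw_scene(const struct caps *c)
    {
        if (!c->has_hw_transform)
            ;   /* transform every vertex on the CPU first */
        if (!c->has_zbuffer)
            ;   /* sort all polygons back to front instead */
        if (!c->has_alpha_blend)
            ;   /* fake transparency with stipple, or cut the effect */
        if (!c->cpu_can_read_framebuffer)
            ;   /* keep a system-memory copy for reads and screenshots */
    }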

Even though individual features of these cards are impressive, they all effectively distort the presentation, because we have been handed the choice between a poor 3D API and a good 3D API that only a handful of cards support to an acceptable degree. We, the game developers, get to sort out what to do with this monstrosity that is the 3D video card market.

What we have is an official mess. It blows away the last mess we had with sound cards and their MPU-401 MIDI, wavetable music and FM synthesis. Sound cards were child's play. 3D cards are proving to be the great technological nightmare of the 1990s, and they represent today's autistic peripheral.

Perhaps because our sense of awe is so overwhelmingly influenced by our eyeballs, it's not hard to understand how we got into this mess. Someone realized that you could bring home the gorgeous graphics available on SGI and Evans & Sutherland 3D hardware, and Pandora's box was officially opened. Never mind that this hardware was designed for very specific purposes and very narrow, wealthy markets. It was time to cram that design into our home computers, and riches and power to him who is handiest with a shoehorn.

As a result, today's bleeding-edge gaming scene is getting bland fast. When you purchase a PC game today, that bleeding edge consists of 3D games using triangles as primitives. There is innovation happening via multipass techniques on the triangles, but there are only a handful of operations we can do in those passes. If these triangles were consistent on every machine, we wouldn't have a problem, but they aren't.

Many of us aren't aware of how stifling the triangle primitive can be. Although it is probably impossible to accelerate on 3D cards, AnimaTek's Caviar, a real-time voxel volume rendering program, impressed anyone who looked into it in 1997 far more than the latest and greatest in accelerated, conventional triangle-based modeling programs, because Caviar is a new approach. But this is just the beginning. Hardware vendors have already discovered that 2D and 3D rendering techniques don't mix well in hardware, and game developers are pulling their hair out as a result. We have learned how painful it can be to abstract your texture memory management. There are a lot of factors like these that we all have to put up with just so that, in the end, our games can look like everyone else's.

Well, I'm tired of this state of affairs, and I think there is an excellent solution with only two problems. The solution is symmetric multiprocessing. The problems are our two favorite monolithic computer companies. They helped bring personal computers to many tens of millions of people, but ironically, the problems are Intel and Microsoft.

Electrons don't even go the speed of light, so to make them do more work, we need more electrons doing work in parallel. Unfortunately, we're all writing code for a bloated, expensive, slow processor architecture. If this architecture's creator took their money and invested it in a simple, fast general-purpose processing unit -- no out-of-order execution, fixed-width instructions, unified floating-point/integer hardware and a simpler cache -- then they could probably cram four of these screaming entities into the same die space as their last product and clock them at twice the speed. We would all be much happier.


In addition, however, there's our operating system buddy, Microsoft in Seattle, Washington: the most popular operating system in the world will most likely never effectively support multiple processors.

The incredible inertia and stock value of these two companies are probably going to keep this amazing technology off our desktops for years. No doubt, at the insistence of their shareholders, they will pace things out so they can squeeze every last gigabuck out of our favorite operating system and CPU architecture.

However, assuming these systems do eventually end up on our desktops, game developers will basically have to relearn how to code. I for one am very excited about the prospect. Writing threaded real-time code is no easy job, but it's what we should be learning how to do, because electrons want to work in parallel, and we shouldn't be fighting that. What we need is a "stupid" frame buffer that displays pixels the way we tell it to, and not the way some API thinks it ought to happen on the canonical hardware model.
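
That threaded code might start out as simply as this minimal sketch in C using POSIX threads; the split of work between simulation and drawing is, of course, made up:

    #include <pthread.h>
    #include <stdio.h>

    /* Run game logic on one processor while the main thread draws on
     * another, meeting at the frame boundary. */
    static void *simulate(void *arg)
    {
        (void)arg;
        puts("simulating world");     /* stand-in for real game logic */
        return NULL;
    }

    int main(void)
    {
        pthread_t sim;
        pthread_create(&sim, NULL, simulate, NULL);
        puts("rendering frame");      /* stand-in for real drawing */
        pthread_join(sim, NULL);      /* meet at the frame boundary */
        return 0;
    }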

But I rant. This is what I think is wrong with game development today. I want to write games, and I'm particularly fond of writing portable games that behave roughly the same on everything, the way videotapes behave in different VCRs. Clever hardware is in my way, and stupid hardware ought to be sitting on my desktop. I hope the powers that be relearn what it's like to make a technological breakthrough so that we can get to the next level.

Dave Taylor, one of the programmers who worked on Doom at id Software, is the owner of a small, up-and-coming game development house, Crack dot Com. Their second title, Golgotha, is slated for release in 1998 for both Windows 95 and Linux.

Dave Taylor
Crack dot Com
1101 South Capital of Texas Highway
Austin, TX 78746

The copyright of articles and images printed remains with the author unless otherwise indicated.