COMPUTER GAMING

Vol. 32 No. 2, May 1998
ACM SIGGRAPH



Portrait of the Artists in a Young Industry



Noah Falstein
The Inspiracy

Today's big computer games are often graphics-intensive behemoths: big-budget productions filled with 3D figures and SGI-rendered imagery. But games and graphics haven't always been so closely linked. I was pleased to be invited to write this retrospective: even though my own background in games began in programming, far from art, I've always enjoyed the special synergy that can develop between artists and programmers working together on a computer game. The programmers are amazed as their crude stick-figure sketches turn into glorious images in the hands of a talented artist. Then it is the artists' turn to smile, as their series of still frames comes to life and responds to their control, enabled by the magic of the programmer. How has the role of graphics evolved in games, and what lies ahead?

At the Beginning

I became hooked on computer games in 1975 when I entered Hampshire College in Amherst, MA. Our computer lab had no actual computers, but rather Teletype terminals, simple input/output devices connected to a remote computer via 110 baud acoustic couplers (modems you actually pushed a phone handset into -- see the movie WarGames). The terminals had little cylindrical typeheads that chattered up and down and often created "typos" with unintended or malformed letters displaced randomly above and below the line of text. I'm not sure of the overall capabilities of the Control Data Cyber 74 mainframe they connected to, but I feel sure it was less powerful than the laptop I'm using to write this article.

The game that hooked me was a Star Trek game written in Fortran, where you played the part of Captain Kirk and fought against a Klingon warship. It accepted only a dozen or so commands entered as numbers, and then typed out responses like "Sulu: The Klingon ship is closing to 472.3 kilometers. Chekov: He is firing phasers! Shield number 3 hit, 57.5% remaining." Even by the standards of those days it left much to be desired, but it served as inspiration. A friend and I set out to recreate it and improve it by programming a version together. We did so in part to learn APL, a math-oriented computer language that still has its loyal adherents today, distinguished by being one of the most compact and obscure programming languages ever invented. You can write programs that would take dozens of lines in another language in one line of APL, using lots of Greek letters and made-up symbols for even simple tasks. It was very handy for impressing people with our cleverness, a weakness to which modern C++ programmers are, of course, immune.
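For readers who never used a teletype game, here is a rough sketch in Python of the kind of numbered-command loop at the heart of that Star Trek game. The commands, numbers and combat arithmetic are my illustrative inventions, not the original Fortran (or our APL) logic:

    import random

    # Illustrative state for a Star Trek-style duel; all values invented.
    klingon_range_km = 1000.0
    shields = [100.0] * 4            # four shields, percent remaining

    COMMANDS = {1: "fire phasers", 2: "raise shields", 3: "close range"}

    def klingon_turn():
        """The enemy closes and fires; report in the teletype style of the era."""
        global klingon_range_km
        klingon_range_km = max(100.0, klingon_range_km - random.uniform(100, 400))
        print(f"Sulu: The Klingon ship is closing to {klingon_range_km:.1f} kilometers.")
        hit = random.randrange(4)
        shields[hit] = max(0.0, shields[hit] - random.uniform(20, 50))
        print(f"Chekov: He is firing phasers! "
              f"Shield number {hit + 1} hit, {shields[hit]:.1f}% remaining.")

    while min(shields) > 0:
        for number, name in COMMANDS.items():
            print(f"{number}: {name}")
        choice = int(input("Command? "))    # commands were entered as numbers
        print(f"Kirk: {COMMANDS.get(choice, 'Belay that!')}")
        klingon_turn()

Everything beyond the printed text was up to the player's imagination.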

Our game began with the capabilities of its Fortran predecessor, and added on from there. Our first innovation was to allow you to rename your ship, and enter a name for yourself. Next were additional commands to add more combat options and hence a greater range of strategy. We even added the first graphics we'd seen in a computer game. When you finally defeated the enemy vessel, you were treated to a still picture of it exploding. This was painstakingly constructed out of character graphics (complete with the name of the ship on one fragment), an art form still alive today in the signature files on the Internet.

The fact that even in those early days we craved graphics is a telling one. Much of the human brain is given over to visual processing, and the quest to add ever more elaborate graphics to our computer games has existed since the real pioneer, Spacewar!, invented at MIT in the early 60s.

Computer Graphics Enters the Mass Market

It wasn't until the early 70s and Nolan Bushnell that computer graphics finally entered the mass market. He first created a rather sophisticated arcade game called Computer Space that never really took off, although it helped inspire me when I first saw it in 1974. Bushnell hit it big with the much simpler bar game, Pong, which he then parlayed into a home game, and from that start he built Atari. My own experience with that round of home console graphics came with my first full-time job in the games business, designing and programming games for the Atari VCS (also called the 2600) for Milton Bradley.

The Atari VCS system was a far cry from today's home computers. It had 128 bytes of RAM. Most programmers thought of it as 1024 bits, since you had to use every 1 and 0 even in simple games. The mathematically and technically astute among you will realize that 128 bytes is only enough to define a monochromatic and blocky screen -- picture a 32x32 grid of white and black blocks. And yet the VCS ended up with some very sophisticated and surprisingly detailed graphics, in 128 colors.

This was accomplished with two basic tricks. The first was specialized graphics hardware, allowing for something called sprites, independently movable graphic objects, and other special purpose registers. The second was a clever but maddening architecture that forced the programmer to rebuild the screen one scan line at a time, on the fly, each sixtieth of a second. As the beam of the TV swept across the picture tube, the little 1 MHz 6502 chip was valiantly trying to keep up, filling special graphic registers in a race against time. Some early games gave up, and the color or the shape of the sprites developed glitches on the sides of the screen showing graphically the precise point on the scan line that a register changed value. Astute programmers learned to count the cycles of each machine language instruction, and became masters of obscure tricks in order to save a cycle or two here and there, or add one when a change happened too soon. Typical program sizes were 2K or 4K 8-bit bytes long, smaller than the file size Windows uses to save a single sentence of text these days. Even if we could have displayed a high-resolution picture, there was no room to store it in the cartridge.
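The arithmetic behind that race against the beam is easy to check. Here is a back-of-the-envelope calculation in Python, using the round numbers above (the real 2600 figures differ slightly):

    # Back-of-the-envelope VCS arithmetic, using round numbers.
    ram_bits = 128 * 8                 # 128 bytes of RAM
    print(ram_bits)                    # 1024 bits -- exactly a 32x32 monochrome grid

    cpu_hz = 1_000_000                 # "little 1 MHz 6502" (the real part ran ~1.19 MHz)
    frames_per_second = 60             # NTSC field rate
    scanlines_per_frame = 262          # NTSC scan lines, visible plus blanking

    cycles_per_frame = cpu_hz // frames_per_second
    cycles_per_line = cycles_per_frame // scanlines_per_frame
    print(cycles_per_frame)            # ~16,666 cycles to rebuild the whole screen
    print(cycles_per_line)             # ~63 cycles per line -- a handful of instructions

A handful of multi-cycle instructions per scan line is all you got; hence the obsessive cycle counting.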

One result of these draconian restrictions was that it was pointless to employ classically trained artists to create the images. There were certainly many artists who would look at the crude images on the screen and chuckle, but even the few who mastered 6502 assembly code found themselves boxed in by hardware limitations. Programmers did the best they could, balancing aesthetic needs with considerations of game play and frequently overwhelming technical constraints. Often the aesthetics came last.

Amidst these intimidating beginnings, there was a ray of hope for budding computer graphics artists. The very territory Computer Space had failed to conquer in the early 70s was exploding with coin-operated videogames by the first years of the 80s. Sales were mounting, with games like Defender and Robotron selling over 50,000 units, and peaked with Ms. Pac-Man at 110,000. Given that each of these games was sent to arcades, bars and supermarkets around the world to be played by hundreds or even thousands of people, videogames were mass-market entertainment.

Computer graphics were a big part of that success. The meager $200 worth of hardware that could be crammed into a television-connected home game system was far outshone by arcade games. Their electronics cost 10 times as much as their home counterparts, and they boasted specially tuned monitors for their display, with custom-designed control panels.

The Arcade Boom

I had my chance to create for these state of the art machines when I went to work for Williams Electronics in 1982, at the height of the arcade boom. Their games could take up 64K of ROM and more; the processor was a 6809, both faster and more capable than the 6502 of the Atari VCS; and they had even created a custom "blitter chip" that did high-speed DMA transfers of graphics from ROM to video RAM. I was the first programmer/designer they had hired with previous computer game experience, and so was assigned to lead a project that had been struggling for some months. It had a working title of Juggernaut, but it turned out that only about half the people informally polled knew what a juggernaut was (look it up if you're one of them!). We decided to rename it Sinistar.
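I don't recall the exact register interface of the Williams blitter, but in spirit it did something like the following Python sketch: copy a rectangle of pixels from one buffer to another, skipping a designated transparent value so irregular shapes don't punch rectangular holes in the background. The buffer layout and transparent-color convention here are illustrative:

    def blit(src, src_w, dst, dst_w, x, y, w, h, transparent=0):
        """Copy a w-by-h rectangle of 8-bit pixels from src into dst at (x, y).

        src and dst are flat bytearrays; src_w and dst_w are their row
        widths. Pixels equal to `transparent` are skipped. The hardware
        did this via DMA, far faster than the CPU could."""
        for row in range(h):
            for col in range(w):
                pixel = src[row * src_w + col]
                if pixel != transparent:
                    dst[(y + row) * dst_w + (x + col)] = pixel

    # A tiny demonstration: a 2x2 sprite onto an 8x8 "screen".
    screen = bytearray(8 * 8)
    sprite = bytearray([5, 0,
                        5, 5])         # 0 is the transparent color
    blit(sprite, 2, screen, 8, x=3, y=3, w=2, h=2)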

Sinistar was the first game I worked on where we had the luxury of art created by an actual artist. This was due in large part to the efforts of the chief game designer at Williams, John Newcomer. John had come from the toy design world, and knew the value of professional artists. He had come up with the original Juggernaut concept, and even drew the initial image, a giant angry head, floating in space. To my untrained eyes it looked pretty good. But John knew his limitations, and helped us hire Jack Haeger, a very genial and talented illustrator with fine art aspirations, who became one of the first such artists to be seduced by the dark art of pixel pushing.

Jack had a distinct personal style, often wearing a robe decorated with Japanese characters while he worked, and brought those sensibilities to his creations. He had to work with a palette of 16 colors for the entire game, including the essential black and white. By carefully pairing light and dark hues of some critical colors like red and purple, and adding several shades of gray, he showed us the value of light sourcing and shading. The Sinistar Jack created popped out of the screen, clearly a sphere and not just a flat face. It gleamed metallically in the harsh light of space, and its eyes had an evil glint. Later, when we added speech (a first for a Williams videogame), Jack animated its mouth, and its cries of "Run, Coward!" and "Beware, I hunger!" brought a touch of shock and fear even to its creators when encountered unexpectedly during a 3 a.m. debugging session.
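Jack's pairing trick generalizes. Here is a toy Python sketch of the idea; the RGB values and entries below are invented for illustration, not Sinistar's actual 16 colors:

    # A few entries of a toy 16-color palette built around paired hues:
    # for each key color, reserve a dark and a light entry so that shapes
    # can be shaded. All RGB values are invented for illustration.
    PALETTE = {
        "black":       (0, 0, 0),
        "white":       (255, 255, 255),
        "red_dark":    (96, 0, 0),    "red_light":    (224, 48, 48),
        "purple_dark": (64, 0, 96),   "purple_light": (176, 96, 224),
        "gray_1":      (64, 64, 64),  "gray_2":       (128, 128, 128),
    }

    def shade(hue, lit):
        """Pick the light or dark member of a hue pair, by whether the
        surface faces the light source -- crude, but enough to make a
        flat disc read as a shaded sphere."""
        return PALETTE[f"{hue}_light" if lit else f"{hue}_dark"]

With only 16 entries, every pair spent on shading is a color given up elsewhere; that trade-off was the artist's real craft.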

Incidentally, Sinistar was probably the first videogame to employ motion capture for graphics -- of a sort. Jack provided us with three mouth positions: closed, half-open and open. It was up to Sam Dicker, the lead programmer, and me to figure out which positions to use for which phrases. After a few unsuccessful attempts to synchronize it by hand, we hit on a scheme. We wrote each of the short phrases Sinistar spoke on a whiteboard. Then Sam held a marker to his chin with its tip touching the board and moved his head along the phrase, reading it aloud. This gave us a sort of graph showing how his chin dropped as he spoke. Then we "digitized" it, eyeballing the curve, reducing it to three different states and noting durations. Finally, Sam entered the numbers.

He called me over as he ran it for the first time. Disaster! The timing seemed right, but something was very wrong about the mouth movements. I began to mutter something about trying to find some other way when Sam slapped his head, went back to the computer and began feverishly typing. Then he reassembled that section of the program, downloaded it into our emulator, and started it up again. It was perfect! He had entered the numbers designating the three mouth states in a different order than they were stored in memory. The Sinistar glared at us. "Beware, I live!" it intoned. We got shivers.
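In modern terms, our whiteboard trick was a crude form of sampling and quantization. Here is a Python sketch of the idea; the samples, thresholds and state names are illustrative, not our original values. Note the lookup table at the top: list its entries in the wrong order, as we briefly did, and the mouth moves eerily out of step:

    # Quantize a sampled chin-drop curve into three mouth states with
    # durations -- roughly what we did by eye from the whiteboard graph.
    CLOSED, HALF_OPEN, OPEN = 0, 1, 2
    MOUTH_FRAMES = ["closed.spr", "half.spr", "open.spr"]   # order must match!

    def quantize(samples, low=0.33, high=0.66):
        """Map each 0..1 chin-drop sample to one of three mouth states."""
        return [OPEN if s > high else HALF_OPEN if s > low else CLOSED
                for s in samples]

    def run_lengths(states):
        """Collapse the state sequence into (state, duration) pairs."""
        runs = []
        for state in states:
            if runs and runs[-1][0] == state:
                runs[-1][1] += 1
            else:
                runs.append([state, 1])
        return [(s, d) for s, d in runs]

    chin_drop = [0.1, 0.2, 0.7, 0.8, 0.5, 0.4, 0.1]   # one short phrase
    for state, duration in run_lengths(quantize(chin_drop)):
        print(MOUTH_FRAMES[state], "for", duration, "frame(s)")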

Sinistar debuted at the Amusement and Music Operators Association (AMOA) show in the fall of 1983, and received record advance orders for Williams. The cabinets, both upright and sit-down, were covered in stencils and artwork designed by Jack, and they fit the game art beautifully. Unfortunately, within months of the show the arcade market collapsed, and arcade attendance dropped by 90 percent. It was a disaster for the coin-operated arcade world, and most of Williams' software staff quit, or were laid off. The exodus from the arcade world spawned many new graphics-oriented companies. Sam went on to become one of the first employees at a new computer company called Amiga, eventually taking Jack Haeger and R.J. Mical -- Sinistar's special effects programmer -- with him. But that's another story.

On To Research

My next career move transported me to the Mecca of practical computer graphics research -- Lucasfilm's computer division. I joined the fledgling games group when it was too young to have released any games yet, or even chosen an official name. We worked side by side with many of the luminaries who went on years later to form Pixar, and we were in awe of their skill and envious of their equipment. I attended my first SIGGRAPH conference in 1984 as a Lucasfilm employee, and enjoyed the reflected glory of our computer scientist co-workers. One of the first games, Rescue on Fractalus by David Fox, featured real-time fractal generation code written by Loren Carpenter. Loren had shared an office with David and undertook to implement a 6502 version of the sophisticated fractal algorithms he was creating for Industrial Light and Magic's celebrated feature film graphics.
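Carpenter's celebrated technique was fractal subdivision. A minimal one-dimensional midpoint-displacement sketch in Python conveys the flavor of what his 6502 code had to achieve in real time; the parameters here are illustrative:

    import random

    def midpoint_displacement(left, right, depth, roughness=0.5):
        """Recursively subdivide a ridge line: displace each midpoint by
        a random amount whose spread shrinks (by `roughness`) at every
        level. This 1D form is a simplification, for illustration, of the
        fractal-terrain idea Loren Carpenter popularized."""
        heights = [left, right]
        spread = 1.0
        for _ in range(depth):
            next_heights = []
            for a, b in zip(heights, heights[1:]):
                mid = (a + b) / 2 + random.uniform(-spread, spread)
                next_heights += [a, mid]
            next_heights.append(heights[-1])
            heights = next_heights
            spread *= roughness
        return heights

    # 2**6 + 1 = 65 height samples for one jagged mountain silhouette.
    ridge = midpoint_displacement(0.0, 0.0, depth=6)
    print(len(ridge), min(ridge), max(ridge))

Every subdivision doubles the detail, which is why a few levels of recursion yield convincingly craggy mountains from almost no stored data -- exactly what a tiny home computer needed.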

Our own lot was somewhat more humble. We were programming games for the Atari home computer and later the Commodore 64, a step up from the Atari VCS but not quite as capable as the arcade machines I had become used to. But our first artist, later head of the art department, was a talented ex-Atari employee named Gary Winnick. Gary had his start in the comics business, and took a wry pride in his ability to create despite the hardware limitations. Years later I heard Gary talk about those days, "when the pixels were the size of your head." He often suggested that if he needed another job he could always set up shop at Disneyland and make a living creating caricatures of people by stacking up colored bricks.

I was at Lucasfilm Games, later LucasArts, from 1984 to 1992. In that time we saw a remarkable change happen, foreshadowed by Loren Carpenter's early contributions. In the early years our group had something of an inferiority complex, growing up in the shadow of our ILM older brothers.

Try to visualize our annual company meetings. Held in a palatial 300-seat theater at Skywalker Ranch, each division of the company presented their year's achievements on the huge screen. The licensing division showed endless arrays of Star Wars and Indiana Jones items, bringing record profits to the company. Skywalker Sound (or its forerunner, Sprockets) would show clips from movies they had done post-production for, in full THX splendor. Then ILM would show 10 minutes of their best highlights from three or four films they had worked on over the year. Finally it was our turn, and the $47 million worth of state of the art special effects moviemaking faded from the 60-foot wide screen, to be replaced by a tiny projection of 16 color 320x200 pixel output from a Commodore 64.

It was a humbling experience, but also an inspiring one. Certainly the foundations of the current success of LucasArts Entertainment were formed in those days by our aspirations to live up to the example set by ILM. And over the years some curious things began to happen. Computer games went from a niche for kids to a popular entertainment form. The interactive industry took flight, and suddenly the releases of new computer games were noted in the Wall Street Journal. Typical budgets went from under $100,000 for a game in 1984 to over $1 million in 1992. We started hiring artists that had worked at ILM, and the graphics gap at the company meetings began shrinking.

Chaos Island

My most recently completed design was for a game called Chaos Island for Dreamworks Interactive, based on last year's hit movie The Lost World. I came to Dreamworks as an executive producer, the third employee of their new interactive division. It was thrilling to be part of the first new movie studio in 50 years and to help incorporate an interactive group into the studio from the start. All the established movie studios have had to awkwardly tack an interactive division onto an existing company, which has always created rivalries and internal competition, often stifling potential cooperation.

Dreamworks was not immune to those pressures. By the time Chaos Island started, internal politics had driven me to become a freelance designer instead of a full-time employee. But the project itself was a model of cooperation among three cultures. Dreamworks Interactive was built with people from the game development community, from the application software teams at Microsoft and from the Hollywood culture. This sometimes created confusion. The term "development phase" meant story preproduction to the Hollywood types, full production to most of the game developers and the coding process to the Microsoft people. It took a little translation at times, and fostered some misunderstandings.

But on Chaos Island, we figured it out. The producer, Denise Brown, and the lead developer, Steve Herndon, had come from Microsoft, two of the hand-picked few -- out of hundreds of hopeful Microsoft employees caught up in the excitement of the Studio's formation -- who were allowed to make the transition. They applied impressive expertise in organizing and coding a major software project. Our assistant producers, Barbra Isenberg and Carol Romo, both came from the film and television industry, and used that expertise to secure great voice performances from stars like Jeff Goldblum and Sir Richard Attenborough without cratering our budget. Our lead artist, Nick DeSomov, brought both studio experience from his work at Disney and great game credentials as the lead artist for Westwood's Command and Conquer.

With these skills the team was able to create a real-time strategy game engine from scratch and complete the title in under a year on a modest budget by today's standards. It shipped precisely on schedule and under the budget planned a year earlier, and even included an extra level and additional features.

How was this possible? We avoided expensive custom solutions. The team used an array of off-the-shelf graphics programs, including Fractal Design Painter and Adobe Photoshop, Premiere and Illustrator. The aforementioned mix of skills was a big factor. Certainly luck had something to do with it. But the team had another quality that was quite unusual. Everyone got along splendidly; there were no ego clashes or prima donna fits. Everyone ate lunch together nearly every day of the project, not through fiat but by choice. When there wasn't enough room to go around, people squeezed together. We went on group outings, for instance to a taping of the TV show Friends and a comedy revue. Above all, there was mutual respect for the different backgrounds and skills brought together.

In this, the example of Steven Spielberg certainly had an influence. Steven commanded enough respect that he could have decreed any changes he liked and everyone would have rushed to comply. But he's a computer game fan, and wisely appreciated the abilities of the people working for him, offering suggestions but declining to dictate in a field in which he was more of a consumer than a creator. Ultimately, this project was one of the significant successes in blending game expertise with sound software management techniques and Hollywood glitz and glamour.

What Makes a Successful Game

There's still considerable debate within the game development community regarding whether this blending is really such a good thing. Some make the valid point that ultimately the most important feature of a game is not its surface appearance, but rather the elusive quality known as "gameplay." But it's impossible to argue with the success of games like Myst and Doom that delivered both entertaining gameplay and state of the art graphics. As I write this article, the movie Titanic is becoming the top-grossing movie of all time, a year after it made headlines as the most expensive movie of all time. In entertainment there is frequently a strong link between first-rate, often expensive visuals and financial success.

On the other hand, big budgets don't guarantee commercial success, and certainly don't dictate underlying quality. A couple of years ago, Wing Commander IV set records as the most expensive computer game of all time. Its budget ran, by most accounts, to $11 million -- give or take a million -- due largely to the incorporation of movie-style graphics with name talent like Malcolm McDowell. The game received a fairly good critical reception and did well in the marketplace, but barely well enough to break even. That same year Nicolas Cage won the Oscar for Leaving Las Vegas, a film that cost around $3 million. The computer game industry has finally closed the budget gap, with all the advantages and dangers that implies. But just like the movie industry, there's no sure-fire formula for success, either artistic or financial, and that's actually comforting. We'll have to find our own path.

In many ways computer games are growing out of an awkward youth into the first glimmerings of adulthood. The industry is currently an adolescent with two well-meaning but very different parents trying to influence its career choices; success will come from combining the best of both legacies into something fresh and unique. The maturity of the industry is still some years in the future, but from my perspective, this offspring will do both its software and Hollywood parents proud.

Noah Falstein is a freelance interactive designer living in Marin County, CA. His current clients include Dreamworks Interactive, JVC Digital Studios and Shell Oil.

Noah Falstein
The Inspiracy
391 North Almenar Drive
Greenbrae, CA 94904
Tel: +1-415-461-0157