written by Melanie A. Farmer

When Paul Debevec was a post-doc at UC Berkeley in the 1990s, achieving realistic digital human faces was considered a “holy grail” goal in the field of visual effects. This challenge became an area of computer graphics that Debevec and others would work in for the next two decades. Lucky for many moviegoers, they stuck it out: through their revolutionary facial capture methods, memorable characters in such high-grossing films as Avatar and Spider-Man 2 were brought to life on the big screen.

This year, the Academy of Motion Picture Arts and Sciences presented Debevec, Tim Hawkins, Wan-Chun Ma, and Xueming Yu with a Technical Achievement Award (a Sci-Tech Oscar) for the invention of the Polarized Spherical Gradient Illumination facial appearance capture method and the design and engineering of the Light Stage X capture system. The innovative system has been used in over 40 feature films, including Maleficent, Furious 7, Oblivion and Blade Runner 2049. It will also contribute to five new movies slated for release this year, including Captain Marvel and Shazam.

Indeed, the team’s facial scanning technique, coupled with the light stage production system, has become an industry standard in Hollywood.

“This technology has helped achieve some of the most realistic photoreal digital actors in movies to date,” says Debevec, senior staff engineer at Google VR. “It allows visual effects artists to build a digital character based on the face of the real actor that has all of the same facial shape, appearance, and expressive lines and wrinkles as the animated character. Without this level of detail, the digital actor may not look believable, which would take the audience out of the story.”

Debevec and his team first introduced the new facial capture technique in a paper at the 2007 Eurographics Symposium on Rendering. The team had been working on how to digitize the shape and appearance of a human face down to the level of the person’s skin pores and fine facial creases. Before the team’s technology, creating digital faces required substantial time and effort and yielded low-resolution results that lacked realistic detail. Debevec and colleagues developed a method to capture natural human faces digitally in a few seconds with their spherical light stage device, shooting a series of photos under special polarized lighting conditions.

“We first put polarizers on all the light stage lights in 2005 to see if we could cancel out the shine of the skin from every lighting direction at the same time,” notes Debevec. “Then, we figured out the gradient illumination patterns to compute surface orientations from the specular reflections the following year. By August 2006, we showed some initial results of the high-res facial scanning to friends at Weta Digital, and within a few weeks they were sending over Sam Worthington and Zoe Saldana, the stars of Avatar, to have their faces scanned with us.”
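The gradient-pattern idea Debevec describes can be sketched in a few lines. The sketch below is a simplified illustration under an idealized diffuse-reflection assumption, not the team’s actual pipeline: under a linear gradient pattern whose brightness along an axis is (1 + ω)/2, the ratio of the gradient-lit image to a full-on (uniformly lit) image is roughly (1 + n)/2 per pixel, so each normal component falls out of a single division. The function name and array layout here are hypothetical.

```python
import numpy as np

def normals_from_gradients(i_x, i_y, i_z, i_full, eps=1e-8):
    """Estimate per-pixel surface normals from spherical gradient images.

    i_x, i_y, i_z: images captured under linear gradient illumination
                   along the x, y, and z axes.
    i_full:        image captured under full-on (constant) illumination.

    With a gradient pattern P(w) = (1 + w)/2, the reflected intensity
    divided by the full-on intensity is approximately (1 + n)/2 for each
    axis, so each normal component is recovered as 2*(I / I_full) - 1.
    (Simplified diffuse model; the real method separates diffuse and
    specular reflection with polarizers first.)
    """
    nx = 2.0 * i_x / (i_full + eps) - 1.0
    ny = 2.0 * i_y / (i_full + eps) - 1.0
    nz = 2.0 * i_z / (i_full + eps) - 1.0
    n = np.stack([nx, ny, nz], axis=-1)
    # Normalize each per-pixel normal to unit length.
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.maximum(norm, eps)
```

For a surface patch facing straight toward the z axis, the x and y gradient images each read half the full-on intensity and the z gradient image reads the full intensity, so the recovered normal comes out as (0, 0, 1).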

In addition to feature films, the team’s light stage technology has been used to create digital characters in a variety of video games.  At SIGGRAPH 2013’s Real-Time Live, Debevec’s team worked with Activision to demonstrate “Digital Ira”, one of the first photoreal digital characters achieved with real-time rendering.  Today, copies of the current USC ICT light stage system have been purchased and installed at both the Activision and Electronic Arts game studios in Los Angeles.  And in the team’s highest-profile scanning project, they were invited to build and operate a special light stage scanning system to digitize President Barack Obama in the State Dining Room of The White House in the summer of 2014.

The team’s scanning time is down to just a fraction of a second now.

Debevec, who has been an active ACM SIGGRAPH member for more than 20 years and the recipient of SIGGRAPH’s first Significant New Researcher Award, attended his first SIGGRAPH in 1994 while a summer intern at Paul Allen’s Interval Research Corporation. Debevec published his first SIGGRAPH paper in 1996 on the topic of modeling and rendering architecture from photographs, and went on to use these techniques to direct the Electronic Theater film The Campanile Movie at SIGGRAPH 1997, which helped inspire the “Bullet-Time” shots in The Matrix. “And, I haven’t missed a SIGGRAPH since,” he says.

Debevec and team were presented with their Academy Technical Achievement Award at the Feb. 9, 2019 Sci-Tech awards ceremony held at the Beverly Wilshire Hotel. This marked Debevec’s second Academy Award; in 2010, the Academy presented him and his colleagues with a Scientific and Engineering award for the first generation of the light stage capture system and image-based relighting process used on films such as Spider-Man 2 and The Curious Case of Benjamin Button. “Receiving a second Academy Award is just as exciting as one would hope. The Academy Sci-Tech committee puts an enormous amount of effort into researching and validating which technologies and people should receive awards,” notes Debevec, “and they set a very high bar, so it is very gratifying.”