SIGGRAPH 2002 Papers Fact Sheet

Conference: 21-26 July 2002
Exhibition: 23-25 July 2002

Henry B. Gonzalez Convention Center
San Antonio, Texas USA

www.siggraph.org/s2002

The SIGGRAPH Papers program has established itself as the world's premier forum for the presentation of new and exciting results in computer graphics and interactive techniques, and SIGGRAPH 2002 continues that tradition with groundbreaking papers in many areas. From a record 358 submissions, 67 papers were accepted for presentation at the conference. John F. Hughes, Brown University, is the SIGGRAPH 2002 Papers Chair.

PAPERS PROGRAM HIGHLIGHTS:

Motion Synthesis from Motion Capture Data
Five SIGGRAPH 2002 papers approach a single topic, motion synthesis from motion capture data, from multiple perspectives. With recent advances in motion-capture devices, large collections of motion-capture clips are now available. Just as a single image can be created by piecing together bits of other images, new character motion can be created by splicing together pieces of recorded motion. The hard work lies in determining how to break a recorded motion into clips and then making the transitions between clips appear seamless.
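
All five papers build on the same primitive: deciding where one recorded clip can be spliced into another without a visible jump. The sketch below is a hypothetical Python/NumPy illustration of that primitive, not code from any of the papers; it scores candidate splice points by comparing short windows of joint angles, whereas the papers themselves use richer pose metrics, blending, and constraints such as foot contact.

    import numpy as np

    def transition_costs(clip_a, clip_b, window=5):
        """Score every (frame of A, frame of B) pair as a potential splice point.

        clip_a, clip_b: arrays of shape (frames, joints) holding joint angles.
        A low cost means the tail of clip A blends smoothly into the head of clip B.
        """
        costs = np.full((len(clip_a), len(clip_b)), np.inf)
        for i in range(window, len(clip_a)):
            for j in range(len(clip_b) - window):
                tail = clip_a[i - window:i]   # poses leading into the splice
                head = clip_b[j:j + window]   # poses coming out of the splice
                costs[i, j] = np.mean((tail - head) ** 2)
        return costs

    def good_transitions(costs, threshold=0.01):
        """Keep only splice points whose pose discontinuity is below the threshold."""
        return np.argwhere(costs < threshold)

A graph whose nodes are frames and whose edges are these low-cost transitions is, in essence, what lets new motion be assembled on demand from recorded clips.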

Each paper addresses this issue in different ways, and each presents a different application of the general idea.

Motion Graphs shows how to apply this idea to creating natural walking motions along arbitrary paths under user control.

Interactive Control of Avatars Animated with Human Motion Data uses it to control synthetic characters that represent the user's persona in a virtual world. The user can control the avatar by making desired motions in front of a single video camera, for instance, and the system converts these into plausible 3D motions.

Motion Textures: A Two-Level Statistical Model for Character Motion Synthesis uses captured motion to create new but similar motion, so that a virtual character can dance in a non-repeating way after only a modest amount of motion-capture data has been recorded.

Motion Capture Assisted Animation: Texturing and Synthesis lets animators roughly sketch out an animation with traditional keyframe tools; the system then fills in the details of the motion from the captured data.

Interactive Motion Generation From Examples concentrates on generating new motions from old in real time, and even on generating motion for multiple characters at once.

New Hardware
Two papers concentrate on new hardware. SAGE Graphics Architecture describes in detail the design of a new graphics system capable of rendering 80 million high-quality triangles per second. Of particular interest is its support for high-quality antialiasing of triangles, removing the jagged-edge artifacts that are especially apparent on thin polygons and lines. By contrast, Ray Tracing on Programmable Graphics Hardware shows how ray tracing, once thought too expensive for any real-time application, may become competitive with traditional triangle-based rendering as graphics hardware evolves.

Ideas for Future Use in Commercial Software
Four image-based papers describe ideas that are likely to rapidly become commonplace in commercial software. In Self-Similarity Based Texture Editing, the authors describe how a user can mark up a portion of an image (coloring one roof shingle brown, for instance); the system then finds all similar portions of the image (all the other shingles) and modifies them the same way, so the whole house gets a brown roof. The same idea applies to geometry as well as color: the system can enlarge all parts of an image that are similar to the one beneath the cursor and shrink the others, making the mortar seams of a brick wall wider or narrower, for example.
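
A minimal sketch of the idea, assuming a grayscale image and a per-pixel edit function (the function names, the simple mean-squared-error similarity test, and the parameters are all assumptions for illustration; the paper's similarity measure and editing operations are considerably more sophisticated):

    import numpy as np

    def propagate_edit(image, edit, seed, patch=5, threshold=0.05):
        """Repeat a user's edit wherever the local neighborhood resembles
        the neighborhood around the marked `seed` pixel.

        image: float array (H, W) with values in [0, 1]
        edit:  function applied to a pixel value, e.g. lambda v: v * 0.6
        seed:  (row, col) of the pixel the user marked; assumed to lie at
               least patch // 2 pixels away from the image border.
        """
        h = patch // 2
        sr, sc = seed
        reference = image[sr - h:sr + h + 1, sc - h:sc + h + 1]
        out = image.copy()
        for r in range(h, image.shape[0] - h):
            for c in range(h, image.shape[1] - h):
                neighborhood = image[r - h:r + h + 1, c - h:c + h + 1]
                # A similar neighborhood is taken to mean "the same kind of
                # thing" (another shingle, another brick), so the edit repeats.
                if np.mean((neighborhood - reference) ** 2) < threshold:
                    out[r, c] = edit(image[r, c])
        return out

Calling propagate_edit(img, lambda v: v * 0.6, seed=(40, 60)), for example, would darken every patch that looks like the marked shingle.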

Transferring Color to Greyscale Images does just what its title promises. Given a black-and-white image and a somewhat similar color image, the system produces a colored version of the black-and-white image. For instance, given a black-and-white forest scene and a color image of some trees, the system can color in the forest and make the sky blue, all without user intervention.
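
The paper works in a decorrelated color space and also matches neighborhood texture statistics when choosing colors; the sketch below is a deliberately simplified, hypothetical version of the core transfer step in Python/NumPy: remap the reference image's luminance statistics to match the target, then copy chromatic channels from the best-matching reference samples while the grayscale image keeps its own luminance.

    import numpy as np

    def colorize(gray, color_luma, color_chroma, samples=200, rng=None):
        """Transfer chroma from a similar color image onto a grayscale image.

        gray:         (H, W) target luminance in [0, 1]
        color_luma:   (h, w) luminance of the reference color image
        color_chroma: (h, w, 2) the two chromatic channels of the reference
        Returns an (H, W, 3) array of [luminance, chroma_1, chroma_2].
        """
        rng = np.random.default_rng(rng)

        # Remap the reference luminance to the target's mean and spread so
        # the two images are statistically comparable.
        luma = (color_luma - color_luma.mean()) / (color_luma.std() + 1e-8)
        luma = luma * gray.std() + gray.mean()

        # Draw a small set of reference samples; a full per-pixel search
        # over the reference image would be needlessly expensive.
        idx = rng.integers(0, luma.size, size=samples)
        sample_luma = luma.reshape(-1)[idx]
        sample_chroma = color_chroma.reshape(-1, 2)[idx]

        # Each grayscale pixel copies the chroma of the sample whose
        # luminance matches best, while keeping its own luminance.
        nearest = np.abs(gray.reshape(-1, 1) - sample_luma[None, :]).argmin(axis=1)
        chroma = sample_chroma[nearest].reshape(*gray.shape, 2)
        return np.concatenate([gray[..., None], chroma], axis=-1)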

In Object-Based Image Editing, the authors describe how to edit a photograph in which distinct parts are reasonably evident and distinguished by color, such as a photo of a national flag or a traffic light. In such an image, the user can manipulate regions of fairly constant color as if they were objects, moving them around, distorting them, and so on; one could, for instance, turn all the round lights of a traffic signal into square ones, or shuffle their positions.
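
The paper's segmentation and hole filling go well beyond this, but the basic notion of treating a patch of roughly constant color as a movable object can be sketched with a flood fill (a hypothetical illustration; the function names and the flat-color hole fill are assumptions):

    from collections import deque
    import numpy as np

    def extract_region(image, seed, tolerance=0.05):
        """Flood-fill outward from `seed`, collecting the connected region of
        roughly constant color so it can be treated as a single object."""
        h, w, _ = image.shape
        target = image[seed]
        mask = np.zeros((h, w), dtype=bool)
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if not (0 <= r < h and 0 <= c < w) or mask[r, c]:
                continue
            if np.abs(image[r, c] - target).max() > tolerance:
                continue
            mask[r, c] = True
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
        return mask

    def move_region(image, mask, offset, fill):
        """Shift the masked 'object' by `offset` pixels, painting the hole it
        leaves with a flat fill color (assumes the offset keeps the whole
        region inside the frame)."""
        out = image.copy()
        out[mask] = fill
        rows, cols = np.nonzero(mask)
        out[rows + offset[0], cols + offset[1]] = image[rows, cols]
        return out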

Finally, in Stylization and Abstraction of Photographs, the authors extend previous non-photorealistic rendering work in a particularly innovative way. There are many algorithms for turning images into digital paintings, but the results often don't resemble the painting an artist would make of the same scene, because artists concentrate their strokes in the important parts of an image, and it's difficult for an algorithm to know which parts those are. In this work, the original image is shown to a human viewer whose eye motions are tracked; paint strokes are then concentrated in the regions where the eye spent the most time, giving a much better result.
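
The paper drives a painterly, stroke-based rendering with the eye-tracking data; as a stand-in, the sketch below (hypothetical function names, SciPy assumed) uses the same signal, fixation duration, to decide where fine detail survives and where the image is smoothed away.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fixation_weighted_detail(image, fixations, sigma_fine=1.0, sigma_coarse=8.0):
        """Keep fine detail where the viewer's eyes lingered, abstract elsewhere.

        image:     (H, W) grayscale photograph with float values
        fixations: iterable of (row, col, duration) eye-tracker samples,
                   with row and col inside the image bounds
        """
        h, w = image.shape

        # Build a smooth importance map from fixation durations.
        importance = np.zeros((h, w))
        for r, c, duration in fixations:
            importance[int(r), int(c)] += duration
        importance = gaussian_filter(importance, sigma=20.0)
        importance /= importance.max() + 1e-8

        # Blend a detailed and an abstracted version of the image per pixel.
        fine = gaussian_filter(image, sigma_fine)
        coarse = gaussian_filter(image, sigma_coarse)
        return importance * fine + (1.0 - importance) * coarse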

Paper Titles and Authors

Interactive Control of Avatars Animated With Human Motion Data
Jehee Lee
Jinxiang Chai
Carnegie Mellon University

Paul S. A. Reitsma
Brown University

Jessica K. Hodgins
Carnegie Mellon University

Nancy S. Pollard
Brown University


Interactive Motion Generation From Examples
Okan Arikan
D.A. Forsyth
University of California, Berkeley

Motion Capture Assisted Animation: Texturing and Synthesis
Katherine Pullen
Christoph Bregler
Stanford University

Motion Graphs
Lucas Kovar
Michael Gleicher
University of Wisconsin-Madison

Fred Pighin
USC Institute for Creative Technologies

Motion Textures: A Two-Level Statistical Model for Character Motion Synthesis
Yan Li
Tianshu Wang
Heung-Yeung Shum
Microsoft Research Asia

Object-Based Image Editing
William Barrett
Alan Cheney
Brigham Young University

Ray Tracing on Programmable Graphics Hardware
Timothy J. Purcell
Ian Buck
Stanford University

William R. Mark
Stanford University (now at NVIDIA Corporation)

Pat Hanrahan
Stanford University

SAGE Graphics Architecture
Michael F. Deering
David Naegle
Sun Microsystems, Inc.

Self-Similarity Based Texture Editing
Stephen Brooks
Neil Dodgson
University of Cambridge

Stylization and Abstraction of Photographs
Doug DeCarlo
Anthony Santella
Rutgers University

Transferring Color to Greyscale Images
Tomihisa Welsh
Michael Ashikhmin
Klaus Mueller
Stony Brook University

