Emerging Technologies Fact Sheet

Co-Chairs:
Kathy Ryall, Principal Technical Staff, MERL Technology Lab
John Sibert, Professor, Department of Computer Science, The George Washington University

Conference: Sunday 5 August - Thursday 9 August
Exhibition: Tuesday 7 August - Thursday 9 August
Art Gallery/Emerging Technologies Reception: Sunday 5 August, 3 - 5:30 pm

The Facts

  • The SIGGRAPH 2007 Emerging Technologies exhibition features a broad range of installations from research labs, universities, independents, and industry giants that explore the dynamics between humans and digital systems.
  • Of 75 submissions, 23 installations were selected to be showcased at SIGGRAPH 2007.
  • Submissions came from six countries, including France, Hungary, Japan, Korea, and Sweden.
  • All Emerging Technologies contributors will give presentations detailing their installations and research at SIGGRAPH 2007.
  • High-resolution images are available for media usage in print and web publications. Contact the Media Office for details.

A Quote from the SIGGRAPH 2007 Emerging Technologies Co-Chair:

"The SIGGRAPH 2007 Emerging Technologies program provides a unique look into the future capabilities of computer animation technologies in very practical, everyday environments," stated John Sibert, Emerging Technologies Co-Chair from The George Washington University. "This year's selection of technologies explores how advance computer technology significantly impacts human interaction."

SIGGRAPH 2007 Emerging Technologies highlights include:

String Walker

Contact: Hiroo Iwata, University of Tsukuba

Most virtual environments provide no proprioceptive feedback for walking. String Walker is a locomotion interface that addresses this with eight strings actuated by motor-pulley mechanisms mounted on a turntable, enabling users to maintain their positions while walking in various directions through virtual environments.

Potential Future Use:
Research on locomotion interfaces is still in a preliminary state, but some virtual-environment applications, such as training or visual simulation, require good locomotion sensation. Over the next decade, effective locomotion devices will be developed for these applications.

The Sound of Touch

Contact: David Merrill, Massachusetts Institute of Technology Media Lab

The Sound of Touch enables people to manipulate sound samples in a way that is much more immediate and intuitive than current digital tools. The system's technology and interface designs adopt characteristics of acoustic instruments, making samples that are recorded on-the-spot malleable and flexible through continuous gestural interaction with physical textures and resonant objects.

Potential Future Use:
Because it makes sonic exploration so intuitive, a generation of musicians could adopt this system as their preferred synthesis technique in the next 10 years. Ultimately, the system could become a commercial product that would enable people to paint with sound wherever and whenever they want, either for professional sound-design projects or just for play.

Gravity Grabber: Wearable Haptic Display to Present Virtual Mass Sensation

Contact: Kouta Minamizawa, The University of Tokyo

Gravity Grabber is a new form of ubiquitous haptic interaction that delivers the weight sensation of virtual objects. It is built on the novel insight that fingerpad deformation provides a reliable sensation of weight even when proprioceptive sensation is absent. The goal of the project is to meet the increasing demand for realistic haptic feedback with a simple haptic display. The project team focused on the mass of a virtual object, which contributes to both weight and inertia in haptic interaction; when that mass is conveyed by the haptic device, the user perceives the virtual object more realistically.

Potential Future Use:
As motors and batteries evolve, this device could be made smaller and wireless for use in daily life, for example as a grasping controller in entertainment systems or as a force-feedback device for interacting with a virtual reality environment. Because it is small enough to be worn on a finger, it can be used in combination with conventional mouse-based interfaces. It also enables ubiquitous teleoperation, since the wearable, wireless device can be used to manipulate a robot from any location.

CoGAME: Manipulation by Projection

Contact: Kazuhiro Hosoi, The University of Tokyo

CoGAME is an example of an application enhanced by the "manipulation-by-projection" technique. This cooperative game allows players to visually and intuitively control a robot with projectors. Players interchangeably move and connect their projected images to create a path that leads the robot to its goal.

Potential Future Use:
The CoGAME interface could be used in a wide range of applications in a robot-rich future. For example, a user could project a furniture layout onto a real office environment, and CoGAME-controlled robots could move and place the furniture by following the projected image.

TransPen & MimeoPad: A Playful Interface For Transferring a Graphic Image to Paper by Digital Rubbing

Contact: Woohun Lee, Korea Advanced Institute of Science and Technology

With these novel drawing tools, children and adults can use rubbing motions to transfer a digital image directly to paper and produce a drawing with a personal touch and natural texture, just as in traditional rubbing.

Potential Future Use:
In approximately three years, people will be able to copy graphic images from a tablet PC onto paper with TransPen's digital rubbing technique, making the advantages of pen-based computing more intuitive to use. It will also be possible to commercialize TransPen & MimeoPad as children's drawing tools: children will enjoy using TransPen to copy drawings (for example, popular cartoon characters) from RFID-embedded boards.

An Interactive 360-Degree Light Field Display

Contact: Andrew Jones, University of Southern California Institute for Creative Technologies

This display renders the light field of an object, with correct geometric, accommodation, and vergence cues in the horizontal plane, by rendering and projecting imagery at 5,000 frames per second onto a spinning anisotropic reflector. Motion-tracked vertical parallax then allows unrestricted 3D movement with correct geometric cues.

Potential Future Use:
As this technology matures, it will be capable of full-color, high-refresh-rate imagery in another three years and will be commercially available for high-end visualization at about that time. If prices for graphics and display technologies continue to fall as they have for the past 10 years, this approach will be viable at the consumer level within this decade.

Complete Emerging Technologies information