Fact Sheet SIGGRAPH 98
Enhanced Realities Fact Sheet

SIGGRAPH 98
19-24 July 1998
Orange County Convention Center
Orlando, Florida

Remember when you first discovered coloring books? Creative energy flowed freely because crayons were easy to use. Then you encountered computers and perhaps could not even find the lines you were told to stay inside! Enhanced Realities, an inspired technology showcase at SIGGRAPH 98, rekindled attendees' creative energy with innovations that facilitate human/idea/machine interaction and strive to recapture the reality of creative excitement.

Nestled among SIGGRAPH 98's Exhibition of today's commercially available technology, Enhanced Realities and the Art Gallery burst with kinetic energy barely contained within a central-hall oasis of exploration and discovery. The researchers, technologists, and artists who comprise the unique SIGGRAPH community worked side by side, pointing the way to the most creative emerging achievements on the horizon.

Of the more than 50 proposals submitted to Enhanced Realities, only the top tier was selected: the 17 most impressive and ground-breaking innovations for presentation at SIGGRAPH 98. Contributors were chosen based on their creativity, technical innovation, presentation, and the potential cultural significance of their work. Selected projects envision our augmented future with clever multi-modal interfaces that challenge our ideas about computing in the physical world.

"The goal of the Enhanced Realities program is to make us return to that child-like wonder of discovery and to inspire us with technological innovations that immerse us in a new, enhanced reality. We want to minimize the path from brain to experience," said Janet McAndless, Sony Pictures Imageworks, Enhanced Realities Chair. "Enhanced Realities takes SIGGRAPH 98 attendees as close as possible to hard-wiring our brains and will prompt us to keep questioning, wondering, and discovering."

Highlights

PingPongPlus

PingPongPlus, a digitally enhanced version of the classic ping-pong game, makes possible a new gaming experience with changing rules as well as user-designed game play. A specially developed ball-tracking system adds a dimension of audio/visual feedback to a conventional ping-pong table. Depending on user-selected options, the "reactive table" displays patterns of light and shadow and modulates sound according to the rhythm and style of game play. This project has led researchers in the MIT Media Lab's Tangible Media Group to explore a design space along the axes of competition-collaboration and augmentation-transformation.

HoloWall: Interactive Digital Surfaces

The HoloWall interactive wall environment lets visitors interact directly with digital objects or information displayed on the rear-projected wall surface without any special pointing devices. The combination of infrared cameras and infrared lights installed behind the wall enables recognition of a human body, hands, or any other physical objects that are close enough to the wall surface, and visitors can use both hands simultaneously. Simple active devices, such as an infrared remote controller or a flashlight beam, and user characteristics, such as the shape of the body, can also serve as inputs to the system. The HoloWall recognizes a rich set of interaction vocabularies that our bodies naturally possess.

Foot Interface: Fantastic Phantom Slipper

Just as people move on foot in the real world, they should be able to move freely with their own feet in virtual environments. With this in mind, researchers and students at the Tokyo Institute of Polytechnics have developed a slipper-like, wearable, multi-modal interface for interacting with cyber worlds. The movement of the feet is measured in real time with an optical motion capture system using a semiconductor position sensor, and feedback signals are transmitted through skin sensations of the feet. Complicated feedback information, such as object movement around or under the feet, is transmitted by phantom sensation elicited by multiple tactile stimuli. Optical markers for motion capture and vibrators for tactile stimulation are set in the slipper the user wears. At SIGGRAPH 98, these techniques allowed players in the arcade-like game Fantastic Phantom Slipper to interact with virtual objects projected onto a floor screen using only their feet.

Mass Hallucination

Mass Hallucination presents a mirror-like imaging display that changes according to the number of people watching it, their behavior, and whether they've watched the device before. While this reflexive device encourages crowds of people to collectively manipulate the display with their bodies or faces, it is also personal in its ability to recognize the appearance of a particular user for short to medium periods of time and to respond by tailoring the display accordingly. A progression of Interval Research's previous work, Magic Morphin' Mirror (SIGGRAPH 97 Electric Garden), this display captures video along the same optical axis as video is displayed, so images of people observing can be directly manipulated, composited, or distorted on the display. In contrast to the previous work, which considered only a single user at a time and had no persistence after that user left, Mass Hallucination is designed to visually track a crowd of people and provide a shared graphical experience. In addition, the display tracks users over time through multiple sessions to show that continuity and consistency of the experience are possible across a group of simultaneous users, or with a single user at a time.

Media and Mythology

In ancient times, mythology was the high-tech method for storing data on a society's history, rituals, and ethical systems. The paradigm in use for these early information systems was storytelling. Media and Mythology explores the link between traditional cultural mythologies and new media technology. The VR multi-user game Man and Minotaur within the Media and Mythology installation allows players to portray the two ancient combatants and the gods who taunt them within a fully immersive, synthetic version of Daedalus' Labyrinth in ancient Crete. Several aspects of the Media and Mythology installation resided online in conjunction with the SIGGRAPH 98 Digital Pavilions program, including Video Totem, a facility for collaboratively expressing newly created mythologies on a large digital totem pole, and Hits and Mythos, a multidimensional web catalog of mythical systems from around the world and across time.

Stretchable Music with Laser Range Finder

Stretchable Music with Laser Range Finder combines an innovative, graphical interactive music system with a state-of-the-art laser tracking device. An abstract graphical representation of a musical piece is projected onto a large vertical display surface. Users are invited to shape musical layers by pulling and stretching animated graphical objects with natural, unencumbered hand movements. Each graphical object is designed to represent and control a particular piece of musical content. Objects incorporate simple behaviors and simulated physical properties to generate unique sonic personalities that contribute to the overall musical aesthetic. A scanning laser range finder can track multiple hands in front of the projection surface with high accuracy, following up to four independent points in a plane. Bare hands can be tracked to within a four-meter radius, without sensitivity to background light or complexion.

The Object Oriented Displays

This project demonstrates significant advances in the MEDIA-3 (Media Cube) technology shown in the Electric Garden at SIGGRAPH 97. Each of the three developed prototypes is a liquid crystal display device that produces the opposite effect of well-known immersive virtual-world display systems: it places the user in a viewing position outside an inner virtual world rather than within the virtual environment. With Object Oriented Displays, a user can actually operate a virtual object as though it were a physical object in the real world. Two new display designs, the MEDIA-A and the MEDIA X'tal (Media Crystal), feature the ability to display a seamless, flexibly shaped virtual object -- an improvement over the cubic constraints of the MEDIA-3, whose display frame occluded portions of the internal image. In addition, the weight of the new display devices has been significantly reduced from that of the MEDIA-3, allowing much easier and more flexible handling. All three Object Oriented Display prototypes were available for use and comparison in Enhanced Realities.

Swamped!

Swamped! is a multi-user interactive environment in which instrumented plush toys are used as an intuitive and tangible interface to influence autonomous animated characters. Each character has a distinct personality and decides in real time what it should do based on its perception of its environment, its motivational and emotional state, and input from its "conscience," the guest. A guest can influence what a given character does and how it feels by manipulating its physical body. For example, users can direct a character's attention by moving the stuffed animal's head, comfort it by stroking the stuffed animal's belly, or have it wave at another character by waving the stuffed animal's arm. Automatic camera control is used to help reveal emotional content. By combining research in autonomous character design, automatic camera control, tangible interfaces, and action interpretation, Swamped! seeks to create a rich, novel, and evocative experience.

AR2 Hockey

AR2 Hockey (Augmented Reality AiR Hockey) presents a collaborative augmented reality (AR) system in which players share a physical game field, mallets, and a virtual puck to play an air-hockey game. AR enables us to enhance physical space with computer-generated virtual space. In addition, collaborative AR allows multiple participants to simultaneously share the physical space surrounding them as well as a common virtual space, visually registered with the physical one. Since real-time, accurate registration between the two spaces and the players is crucial for playing the game, a video-rate registration algorithm is implemented with commercial head trackers and video cameras attached to optical see-through HMDs. This collaborative AR system achieves higher interactivity than a totally immersive collaborative VR system.
