Emerging Technologies Projects
 

4+4 Fingers Direct Manipulation With Force Feedback
A new interface device for direct manipulation environments with force feedback. In this demonstration application, the user grasps, rotates, and moves a virtual Rubik's cube.
 
Somsak Walairacht
Precision and Intelligence Laboratory
Tokyo Institute of Technology
4259 Nagatsuta, Midori-ku
Yokohama 226-8503 JAPAN
somsak@pi.titech.ac.jp


10,000 Year Clock Dial and Rosetta Disk
The 10,000 Year Clock Dial is from a nine-foot-tall early prototype of the 10,000 Year Clock designed by Danny Hillis. Reading from the outside in, it shows the Gregorian year in five digits, the sun position, the moon phase, and, in the center, the current night sky.
 
Alexander Rose
The Long Now Foundation
P.O. Box 29462
San Francisco, California 94129 USA
+1.415.561.6582
zander@longnow.org


ActiveCube
A tangible user interface for constructing and interacting with virtual 3D objects using actual physical cubes. Each cube is a bi-directional interface device that contains both input and output sensors, and provides a flexible direct manipulation interface that maximizes the user's intuition, sensitivity, and proprioception.
 
Yoshifumi Kitamura
Osaka University
2-1 Yamadaoka
Suita 565-0871 JAPAN
kitamura@eie.eng.osaka-u.ac.jp


Augmented Groove: Collaborative Jamming in Augmented Reality
Augmented Groove explores the use of augmented reality and tangible interfaces for conducting multimedia musical performances. Users modulate musical elements by manipulating simple physical cards, and the cards' motions control elements of the music. At the same time, the system presents an augmented real scene through a visor.
 
Ivan Poupyrev
MIC Labs, ATR
2-2 Hikaridai
Soraku-gun
Seika
Kyoto, JAPAN
poup@mic.atr.co.jp


Autostereoscopic 3D Workbench
A reality-enhanced, autostereoscopic 3D workbench that immerses users in a 3D workspace as they manipulate virtual 3D objects. The display presents 3D images in the foreground with appropriate parallax and focus.
 
Hideki Kakeya
Communications Research Laboratory
4-2-1 Nukui-Kitamachi
Koganei 184-8795 JAPAN
kake@crl.go.jp


Autostereoscopic Display for an Unconstrained Observer
A system for autostereoscopic display that solves a long-standing problem: how to deliver a true stereoscopic image to unencumbered observers, while allowing them to freely move and rotate their heads. This technology may have a significant effect on CAD applications, CHI applications, and entertainment graphics.
 
Ken Perlin
New York University
Media Research Lab/Center for Advanced Technology
719 Broadway, 12th Floor
New York, New York 10003 USA
perlin@mrl.nyu.edu


CYPHER: Cyber Photographer in Wonder Space
This three-component demonstration allows users to build a virtual world by locating wooden blocks on a table. A robot then photographs the users from a desirable perspective for placement within the virtual world. Finally, the system uses its knowledge of art masterpieces to compose a virtual photo from within the virtual world.
 
Shoji Tanaka
ATR Media Integration & Communications Research Laboratories
2-2 Hikaridai Seika-cho
Soraku-gun
Kyoto 619-0288 JAPAN
gon@mic.atr.co.jp


Danger Hamster 2000
A hamster-powered hamster ball rolls randomly around an enclosed area as a video-computer system tracks its position and projects a computer-generated face onto it. Attendees use a character-design system to create the projected faces and control their appearance, speech, moods, and attitudes toward objects; the faces react visually and verbally to events.
 
Kim Binsted
Interaction Lab
Sony CSL
3-14-13 Higashi-Gotanda
Shinagawa-ku
Tokyo 141 JAPAN
kimb@csl.sony.co.jp


Gait Master
Advanced applications of virtual reality often require a good sense of locomotion. The Gait Master is a locomotion interface that uses a two-degree-of-freedom motion platform for each foot to deliver a sense of walking on uneven surfaces. The user experiences stairs and sloped surfaces in a virtual space.
 
Hiroo Iwata
University of Tsukuba
Institute of Engineering Mechanics and Systems
Tsukuba 305-8573 JAPAN
iwata@kz.tsukuba.ac.jp


HoloSpace
Research examples of dazzling digital holographic stereograms, including collaborative design and engineering examples, advertising and commercial applications, and 3D portraits. These holograms are full-color and allow full vertical and horizontal parallax.
 
Emilio Camahort
Department of Computer Sciences
The University of Texas at Austin
Taylor 2.124 - Mail Code C0500
Austin, Texas 78712-1188 USA
ecamahor@cs.utexas.edu


InTheMix
An interactive, aural environment of musical pieces that react to movement and attention. The sound space immerses the participant via position-tracked headphones and positional 3D audio technology. Sound effects, voices, and virtual musicians respond to participants' movement. InTheMix supports multiple and remote participants in a shared aural environment.
 
William L. Chapin
AuSIM, Incorporated
4962 El Camino Real, Suite 101
Los Altos, California 94022 USA
wchapin@ausim3d.com


Jamodrum Interactive Music System
Jamodrum is a multi-user interactive music system that combines drumming and real-time computer graphics in a collaborative environment. Up to three players collaborate in a musical improvisation via velocity-sensitive input devices, while their performance is augmented with computer graphics imagery on a tabletop surface.
 
Tina Blaine
Interval Research Corporation
1801 Page Mill Road, Building C
Palo Alto, California 94304 USA
sabean@sirius.com


LaserWho
A large-scale, gesture-based system that allows users to explore the complex patterns of affiliation within a community. LaserWho is a tool for interaction with and visualization of conceptual data structures. It recreates the vitality of the real-world cityscape and provides social insights that are obtainable only in the information-based virtual world.
 
Judith Donath
MIT Media Lab
20 Ames Street, e15-449
Cambridge, Massachusetts 02139 USA
judith@media.mit.edu


Magic Book: Exploring Transitions in Collaborative AR Interfaces
Magic Book explores transitions between physical reality, augmented reality, and immersive virtual reality in a collaborative setting. Though it looks like a physical storybook, it uses video-based recognition and augmented-reality technologies to generate displays that appear to rise off the page as virtual scenes. Together, readers explore storybook scenes in physical, augmented, and immersive VR settings.
 
Lily Shirvanee
University of Washington
Human Interface Technology Laboratory
Fluke Hall, Room 215
Box 352142
Seattle, Washington 98195-2142 USA
lilys@hitl.washington.edu

Medieval Chamber
In the Medieval Chamber, real-time video processing tracks multiple stick-like objects held by viewers. As viewers move these objects, they move corresponding objects in the virtual world (a sword, a ball-and-chain, and a torch) that are rendered in real time in a medieval chamber with shadows, reflections, transparency, and point lighting.
 
Richard Marks
Sony Computer Entertainment
919 East Hillsdale Boulevard, 2nd Floor
Foster City, California 94404-2175 USA
richard_marks@playstation.sony.com


Microtelepresence
Microtelepresence extends telepresence to the microscopic scale. It immerses users in a microscopic world inhabited by live insects. Users view the display with a stereoscopic, head-mounted display with magnetic tracking, which drives a stereoscopic video microscope coupled to a robotically controlled motion platform.
 
Tom Malzbender
Hewlett-Packard Laboratories
1501 Page Mill Road 3U-4
Palo Alto, California 94304 USA
malzbend@hpl.hp.com


Musical Trinkets: New Pieces to Play
In Musical Trinkets, users interact with 15 tagged objects to perform music that is smoothly varied through tag manipulation (proximity, orientation, pressure). The physically diverse objects (some can be worn as rings; others sit on the table or spin) are continuous controllers, each varying a different set of musical parameters or rules.
 
Joseph Paradiso
MIT Media Laboratory
20 Ames Street, E15-351
Cambridge, Massachusetts 02139 USA
joep@media.mit.edu


In the Muu: Artificial Creatures as an Embodied Interface
Muu is an artificial creature with a simple, humorous physical body that creates a sense of social bonding with humans, as opposed to merely exchanging information. The creature also works as an embodied interface that mediates previously established human bonding.
 
Michio Okada
ATR Media Integration & Communications Research Laboratories
2-2 Hikaridai Seika-cho Soraku-gun
Kyoto 619-0288 JAPAN
okada@mic.atr.co.jp


Networked Theater: A Movie Production System Based on a Networked Environment
A new virtual movie or entertainment production system that uses a networked virtual environment and scene-direction and authoring software. Performers control CG characters with a motion-capture system connected to a client. Producers create movies by scripting actions, events, and scene changes based on networked information about the characters.
 
Kazuhiko Takahashi
ATR Media Integration and Communications Research Laboratories
2-2 Hikaridai Seika-cho Soraku-gun
Kyoto 619-0288 JAPAN
kylyn@mic.atr.co.jp


Panoscope 360°
A personal panoramic viewer that displays a full 360-degree cylindrical image from a single video or data channel.
 
Luc Courchesne
Université de Montréal
Design Industriel
2940 Cote-Ste-Catherine, Suite 1006
CP 6128 Succursale Centre-ville
Montréal, Québec H3C3J7 CANADA
luc.courchesne@umontreal.ca


Plasm: In the Breeze

Remember that old tire swing from your youth? Plasm: In the Breeze captures the full-body experience of swinging in that tire over a synthesized "creek". Of course, because it's synthetic, the creek has a life of its own.
 
Peter Broadwell
Plasmatic Arts
2325 Cornell Street
Palo Alto, California 94306-1314 USA
peter@plasm.com


Retinal Direct Imaging
A retinal direct-projection display that uses laser illumination and a holographic optical element to create a Maxwellian view. The system converges coherent parallel rays at the center of the human pupil and projects them directly onto the retina.
 
Takahisa Ando
Laboratories of Image Information Science and Technology
WTC Bldg. 21F, Mail Box 82
1-14-16, Nanko-Kita, Suminoe-ku
Osaka 559-0034 JAPAN
ando@image-lab.or.jp


RV-Border Guards: A Multi-Player Entertainment in Mixed Reality Space
A collaborative target shooting game that uses state-of-the-art mixed reality technology (RV = reality/virtuality). The real scene is presented through video-based scenes augmented by computer-generated characters and effects. Players try to shoot game objects with special hand and arm gestures.
 
Toshikazu Ohshima
Mixed Reality Systems Laboratory, Inc.
6-145 Hanasaki-cho
Nishi-ku
Yokohama 220-0022 JAPAN
ohshima@mr-system.co.jp


Ultra-High Resolution Reality Center
An entertaining and highly informative demonstration of the state of the art in high-resolution, large-format stereoscopic immersive display technologies. The imagery follows a central theme: immersive visualization of subject matter found in nature, from perspectives that, for various reasons, are not available to the naked eye.
 
Andrew Joel
Barco Projection Systems America
3240 Town Point Drive
Kennesaw, Georgia 30144 USA
Andrew.Joel@barco.com


V-TOYS: Visually Interactive Toys
A visually interactive robot that understands and reacts to human presence and visual communication. V-TOY detects people, performs facial gestures, recognizes people, greets them verbally and visually, and invites users to test how well it can understand and mimic human facial expressions.
 
Yaser Yacoob
University of Maryland, College Park
A.V. Williams UMIACS
College Park, Maryland 20742 USA
yaser@umiacs.umd.edu
 
Ismail Haritaoglu, Alex Cozzi, David Koons, Myron Flickner
IBM Almaden Research Center

VaRionettes

A vision-based interface to actions and gestures of a virtual actor (avatar). Gestures and motion parameters of the user's hands are acquired by a computer-vision system that controls the movements of the avatar's head and hands, and its walking and running behaviors.
 
Jakub Segen
Lucent Bell Labs
101 Crawfords Corner Road
Room 4E-632
Holmdel, New Jersey 07733-3032 USA
segen@lucent.com


Virtual Hockey
This game uses an overhead camera to track user movements. The goal of the game is simple: have fun. There is no need to hold a physical object. The user's hand becomes a paddle that is used to hit the puck and defend the goals. When defender objects are placed on the table, the puck bounces off the defenders just as it bounces off the wall or the user's hand. A reset button is provided to set the score back to zero.
 
Richard May
Battelle Memorial Institute
University of Washington
Human Interface Technology Laboratory
215 Fluke Hall, Mason Road
Seattle, Washington 98195 USA
rmay@televar.com


X'tal Head: Face-to-Face Communication by Robot
X'tal vision uses retro-reflective materials and head-mounted projectors to apply images to objects. X'tal Head is a novel "talking head" system that presents a realistic, stereoscopic head image.
 
Masahiko Inami
The University of Tokyo
Tachilab. MEIP
7-3-1 Hongo, Bunkyo-ku
Tokyo 113-8656 JAPAN
media3@star.t.u-tokyo.ac.jp


You Were There
Essentially a "people tracker", this demonstration of automatic identification and data capture (AIDC) technology tracks location information on individuals as they move through a space. It hints at useful applications for traffic-flow planning and real-time space adjustments, and raises individual privacy issues.
 
Dino Schweitzer
Capstone Management Group, Inc.
5475 Mark Dabling Boulevard, Suite 108
Colorado Springs, Colorado 80918 USA
dino@siggraph.org

 

SIGGRAPH 2000