Perceptually Inspired Methods for Naturally Navigating Virtual Worlds
Tuesday 13 December | 14:15-18:00 | Room S226
In recent years, many advances have enabled users to navigate large-scale graphical worlds naturally. The entertainment industry increasingly provides visual and body-based cues to give users a more natural navigational experience. So far, however, none of the existing solutions fully supports natural locomotion through virtual worlds. Techniques and technologies that take advantage of insights into human perceptual sensitivity therefore have to be considered. By far the most natural way to move through the real world is via a full-body experience in which all of our senses receive stimulation, for example when walking, running, biking or driving. With some exciting technological advances, people are now beginning to get this same full-body sensory experience when navigating computer-generated, three-dimensional environments. Enabling an active and dynamic ability to navigate large-scale virtual scenes is of great interest for many 3D applications that demand locomotion, such as video games, edutainment, simulation, rehabilitation, military training, tourism or architecture. Today it is still largely impossible to move freely through computer-generated environments in exactly the same way as through the real world; instead, unnatural and artificial approaches are applied that provide only the visual sensation of self-motion. Computer graphics environments were initially restricted to visual displays combined with interaction devices such as the joystick or mouse, which often provide unnatural inputs for generating self-motion. Today, more and more interaction devices like the Nintendo Wii, Microsoft Kinect or Sony EyeToy enable intuitive and natural interaction. In this context, many research groups are investigating natural, multimodal methods of generating self-motion in virtual worlds based on such consumer hardware.
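To make the idea of generating self-motion from body-based input more concrete, the short Python sketch below maps the vertical head bobbing reported by a consumer tracker (for example a depth camera) to a virtual forward speed, in the spirit of walking-in-place techniques. It is only an illustrative sketch: the class name, thresholds and the simulated tracker loop are assumptions made for this example, not material from the course.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class WalkingInPlace:
    """Maps vertical head-bob oscillations to a virtual forward speed."""
    step_threshold: float = 0.02   # head drop in metres that counts as a step (assumed value)
    speed_per_hz: float = 0.7      # virtual speed in m/s gained per step per second (assumed value)
    window: float = 1.0            # seconds over which the step cadence is estimated
    _rest_height: Optional[float] = None
    _stepping: bool = False
    _step_times: List[float] = field(default_factory=list)

    def update(self, head_height: float, t: float) -> float:
        """Feed one tracker sample (head height in metres, time in seconds); returns forward speed in m/s."""
        if self._rest_height is None:
            self._rest_height = head_height        # calibrate standing height on the first sample
        drop = self._rest_height - head_height     # how far the head has dipped below rest

        # A dip deeper than the threshold counts as one step; re-arm once the head rises again.
        if drop > self.step_threshold and not self._stepping:
            self._stepping = True
            self._step_times.append(t)
        elif drop < self.step_threshold * 0.5:
            self._stepping = False

        # Cadence = steps inside the recent window; the virtual speed scales with cadence.
        self._step_times = [s for s in self._step_times if t - s <= self.window]
        return (len(self._step_times) / self.window) * self.speed_per_hz

if __name__ == "__main__":
    import math
    wip = WalkingInPlace()
    # Simulate three seconds of stepping in place at 2 Hz, sampled at 30 Hz.
    for i in range(90):
        t = i / 30.0
        head_height = 1.70 - 0.03 * max(0.0, math.sin(2.0 * math.pi * 2.0 * t))
        print(f"t={t:4.2f} s  speed={wip.update(head_height, t):.2f} m/s")

In a real system the head height would come from the tracking device's API rather than the simulated sine wave used here for demonstration.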
Level
Beginner
Intended Audience
The course is aimed at scientific researchers as well as developers of interactive applications who would benefit from intuitive and natural exploration concepts.
Prerequisites
The course is for participants with backgrounds in at least one of the following disciplines: computer graphics, entertainment, psychology, augmented reality and virtual reality.
Course Schedule
Session 1:
14:15-14:45 Frank Steinicke: Introduction and Basics
14:45-15:30 Betty Mohler: Hardware-Based Walking Simulators
15:30-15:45 Discussion
Presenter(s)
The course will be organized by Frank Steinicke, Professor and Head of the Immersive Media Group in the Department of Computer Science and Human-Computer Media at the University of Würzburg.
He has given several lectures and courses on computer graphics, human-computer interaction (HCI), virtual reality and locomotion, and he is co-editor of the Springer book Human Walking in Virtual Environments, which will be published next year.
Frank Steinicke is a member of several international program committees and is the principal investigator of international research grants, including the interdisciplinary project LOCUI (Virtual Locomotion User Interfaces), funded by the German Research Foundation.
Anatole Lecuyer has been a research scientist at the French National Research Institute for Computer Science and Control (INRIA) in Rennes since 2002. He received his PhD in computer science from the University of Paris XI in 2001. His main research interests include virtual reality, 3D interaction, haptic interaction and brain-computer interfaces. He serves as an expert in virtual reality for the European Commission and the French National Research Agency. Lecuyer has been a member of the international program committees of major conferences (World Haptics, Eurohaptics, IEEE 3DUI, ACM Symposium on Virtual Reality Software and Technology, IPT-Eurographics Symposium on Virtual Environments) and an organizer of courses and workshops at major virtual reality (VR) events such as IEEE VR and Eurohaptics. He is an associate editor of ACM Transactions on Applied Perception and the principal investigator of several national and international grants, including the European project Natural Interactive Walking and the French projects OpenViBE1 and OpenViBE2.
Betty Mohler is currently a post-doctoral fellow and leader of the Perception and Action in Virtual Environments group at the Max Planck Institute for Biological Cybernetics (Tuebingen, Germany). She received her PhD in Computer Science from the University of Utah in January 2007, where she worked in the Perception and Computer Graphics research group under her advisors Dr. Bill Thompson and Dr. Sarah H. Creem-Regehr. Her research interests include computer graphics, space perception, and locomotion in immersive virtual environments. She received her BSc in Computer Science in May 2001 from Millersville University, under the supervision of Dr. Roger Webster. Betty Mohler is involved in several national and international grants.