"Rebuilding Notre Dame" is a 17 minutes long cinematic VR documentary tribute to the Notre Dame Cathedral in Paris. Featuring stunning footage filmed weeks before the fire and incredible access at the heart of the destroyed cathedral after the fire, this film allows everyone to dive at the heart of the iconic tragedy of 2019 that shook the entire world.
Through the personal stories of the leaders who will decide the future of this iconic cathedral, "Rebuilding Notre Dame" delivers an emotional story celebrating this gem of human history and architecture.
This experience aims to let everyone keep visiting Notre Dame through virtual reality, keeping the monument alive during the restoration work.
“SHE” is an interactive VR short film whose story revolves around a boy who is struggling with gender identity and the difficulties he faces. This work uses multiple strategies for narrative guidance, such as audio, visual, lighting, eye-contact, and interactive cues, in a 360° environment to fully immerse viewers in the plot. We also built a haptic device to enhance the physical sensations of the VR experience. In sum, this work offers viewers the opportunity to explore and understand the indecision and moral conflicts experienced by the protagonist.
Sandbox play has an important role in today’s early childhood education. Modeling various objects with sand has a positive effect on the development of children’s imagination and creativity. In this study, to improve the realism of a sand sculpture, we propose a system that augments the sculpture with appropriate auditory information. Since sand sculptures cannot contain internal electrical devices, we adopt parametric speakers, which emit sharply directional sound. The emitted sound wave reflects diffusely off the surface of the sand sculpture, as if the sculpture itself were emitting the sound.
In this exhibit, we present a playful virtual reality (VR) experience for two co-located users who are redirected simultaneously. The redirection is seamlessly integrated with the gameplay and interactions to match the given environment. This allows users to explore, on foot, a virtual space five times as large as the 5 × 3 m² real-world play space shared by both users.
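Redirected walking of this kind is typically implemented by subtly scaling the user's real head rotation and translation before applying it to the virtual camera. The following TypeScript sketch is a minimal illustration of such rotation and translation gains; the gain values and function names are illustrative assumptions, not the exhibit's actual implementation, which additionally steers the redirection through gameplay.

```typescript
// Minimal redirected-walking sketch (hypothetical gains, not the exhibit's values).
// Each frame, the real head-pose delta is scaled before it is applied to the
// virtual camera, so the walkable virtual area can exceed the physical play space.

interface Pose {
  yawRad: number;               // head yaw in radians
  position: [number, number];   // x/z position on the floor plane, in metres
}

const ROTATION_GAIN = 1.3;      // virtual turns slightly faster than real turns
const TRANSLATION_GAIN = 1.15;  // virtual steps slightly longer than real steps

function applyRedirection(prevReal: Pose, currReal: Pose, virtual: Pose): Pose {
  const dYaw = currReal.yawRad - prevReal.yawRad;
  const dx = currReal.position[0] - prevReal.position[0];
  const dz = currReal.position[1] - prevReal.position[1];

  return {
    yawRad: virtual.yawRad + dYaw * ROTATION_GAIN,
    position: [
      virtual.position[0] + dx * TRANSLATION_GAIN,
      virtual.position[1] + dz * TRANSLATION_GAIN,
    ],
  };
}
```

In practice such gains are kept below perceptual detection thresholds so the manipulation remains unnoticed by the users.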
Current head-mounted displays provide rich visual and auditory feedback, and users can interact with objects in the virtual environment via controllers. Still, current controllers only provide vibration feedback in addition to tracking local motion. To improve haptic feedback during manipulation, previous research has focused on props, wearable devices, or retrofitted VR controllers. Nevertheless, props only provide shape and texture; wearable devices cannot simulate multiple tactile sensations on a texture; and VR controllers cannot simulate the shape of an object. However, combining a wearable device with a VR controller can provide additional tactile sensations such as vibration, wind, and heat. We therefore present FoodBender, an immersive VR game with multisensory feedback that uses an attachable haptic device holding a kitchen utensil as the in-game tool. With the utensil mounted on the device, we can simulate the shape and texture of virtual objects, and users receive additional vibration, wind, and thermal feedback from the attachable device. In our game, physical kitchenware is used to simulate virtual weapons with various haptic feedback in the virtual environment.
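As a rough illustration of how such multisensory feedback might be dispatched, the hypothetical sketch below maps a virtual object's material to vibration, wind, and thermal channels of an attachable device. None of these identifiers, channels, or values come from the FoodBender implementation; they only show the idea of per-material feedback mapping.

```typescript
// Hypothetical feedback dispatch for an attachable multisensory device.
// Channel names and intensity scales are illustrative only.

type FeedbackChannel = "vibration" | "wind" | "thermal";

interface FeedbackCommand {
  channel: FeedbackChannel;
  intensity: number;   // 0..1
  durationMs: number;
}

// Map a virtual material hit by the utensil to device commands.
function feedbackForMaterial(material: "ice" | "fire" | "jelly"): FeedbackCommand[] {
  switch (material) {
    case "ice":
      return [
        { channel: "thermal", intensity: 0.2, durationMs: 500 },   // cool the grip
        { channel: "vibration", intensity: 0.4, durationMs: 80 },  // brittle impact
      ];
    case "fire":
      return [
        { channel: "thermal", intensity: 0.9, durationMs: 300 },
        { channel: "wind", intensity: 0.6, durationMs: 300 },
      ];
    case "jelly":
      return [{ channel: "vibration", intensity: 0.2, durationMs: 200 }];
  }
}
```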
This demo presents a mixed reality (MR) application that enables free-viewpoint rendering of interactive, high-quality volumetric video (VV) content on Nreal Light MR glasses, in web browsers via WebXR, and on Android devices via ARCore. The application uses a novel technique for animating VV content of humans and a split-rendering framework for real-time streaming of volumetric content over 5G edge-cloud servers. The presented interactive XR experience showcases photorealistic volumetric representations of two humans. As the user moves through the scene, one of the virtual humans follows the user with his head, conveying the impression of a real conversation.
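On the web client, a free-viewpoint experience of this kind would typically start by requesting an immersive WebXR session and drawing the streamed volumetric frames inside its render loop. The sketch below shows only this standard WebXR entry point as an assumption about the web path; the demo's actual split-rendering and volumetric decoding code are not shown.

```typescript
// Minimal WebXR entry point (standard API calls; the demo's split rendering and
// volumetric-video decoding are not part of this sketch).

async function startARSession(canvas: HTMLCanvasElement) {
  if (!navigator.xr) throw new Error("WebXR not available");

  const session = await navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["local-floor"],
  });

  const gl = canvas.getContext("webgl2", { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace("local-floor");

  session.requestAnimationFrame(function onFrame(time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // For each view, the client would draw the latest decoded volumetric
      // frame received from the edge-cloud server here.
    }
    session.requestAnimationFrame(onFrame);
  });
}
```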
We present a novel mixed reality (MR) telepresence system that enables a local user to interact with a remote user through full-body avatars, each in their own room. If the two rooms differ in size and furniture arrangement, directly applying a user’s motion to the avatar leads to mismatches in placement and deictic gestures. To overcome this problem, we retarget the placement, arm gestures, and head movement of the local user to the avatar in the remote room so as to preserve the environment and interaction context of the local user. This allows avatars to use real furniture and interact with the local user and shared objects as if they were in the same room.
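One common way to preserve placement across dissimilar rooms is to express the user's position relative to semantically matched anchors (e.g. a sofa or a table) in the local room and rebuild it relative to the corresponding anchors in the remote room. The sketch below illustrates this idea with hypothetical names and an inverse-distance weighting; it is an assumption for illustration, not the paper's actual retargeting algorithm.

```typescript
// Hypothetical placement retargeting between two rooms with matched anchors.
// Positions are 2D floor coordinates in metres; local and remote anchor arrays
// are assumed to correspond by index (same semantic label).

type Vec2 = [number, number];

interface Anchor {
  label: string;    // e.g. "sofa", "table"
  position: Vec2;
}

// Express the user's position as inverse-distance weights over local anchors,
// then rebuild the position from the matching remote anchors.
function retargetPlacement(user: Vec2, local: Anchor[], remote: Anchor[]): Vec2 {
  const weights = local.map((a) => {
    const d = Math.hypot(user[0] - a.position[0], user[1] - a.position[1]);
    return 1 / Math.max(d, 1e-3);
  });
  const sum = weights.reduce((s, w) => s + w, 0);

  return remote.reduce<Vec2>(
    (acc, a, i) => [
      acc[0] + (weights[i] / sum) * a.position[0],
      acc[1] + (weights[i] / sum) * a.position[1],
    ],
    [0, 0]
  );
}
```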
Virtual reality (VR) was originally developed for people with normal vision, and people with visual impairment (PVI) have therefore had difficulty accessing it. In this project, we developed Virtual Showdown, a VR sports game for PVI that supports multimodal interfaces. Showdown is a Paralympic sport in which PVI play using audio and haptic feedback. To implement showdown in VR, we used a Head-Related Transfer Function (HRTF), which allows PVI to recognize the location of a virtual ball through 3D spatial sound. Our showdown game supports two modes: (1) a player vs. artificial intelligence (AI) agent (PvA) mode, in which PVI play against an AI agent, and (2) a player vs. player (PvP) mode, in which two players can play against each other remotely. With remote Virtual Showdown, PVI not only experience virtual reality but also get physical exercise. Through Virtual Showdown, PVI can play the game together with sighted people.
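In a web-based setting, HRTF spatialization of the ball can be approximated with the standard Web Audio PannerNode. The sketch below uses only that standard API as an illustration of HRTF-based 3D sound; the asset URL and update loop are assumptions, and this is not the game's actual audio code.

```typescript
// Spatialize a ball sound with HRTF panning using the standard Web Audio API.

const ctx = new AudioContext();

const panner = new PannerNode(ctx, {
  panningModel: "HRTF",      // head-related transfer function panning
  distanceModel: "inverse",
});
panner.connect(ctx.destination);

// Load and loop the ball sound through the HRTF panner (URL is illustrative).
async function playBallSound(url: string): Promise<AudioBufferSourceNode> {
  const data = await (await fetch(url)).arrayBuffer();
  const buffer = await ctx.decodeAudioData(data);
  const source = new AudioBufferSourceNode(ctx, { buffer, loop: true });
  source.connect(panner);
  source.start();
  return source;
}

// Called every frame with the ball's position in the listener's coordinate frame,
// so the player can localize it by hearing alone.
function updateBallPosition(x: number, y: number, z: number) {
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;
}
```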
This paper presents a mixed reality system for remote interactive training in first-person-view drone flight control. A remote trainer designs and manages training scenarios in a virtual environment. A drone player practices first-person-view drone flight control in a mixed reality environment according to the training scenario designed by the trainer. Both users can interact and communicate with each other while sharing the same virtual environment, even when they are in separate locations.
Using a Virtual Reality (VR) headset significantly improves the immersive experience of a virtual environment. However, it separates the user from the physical world and from interacting with everyday objects such as smartphones. This demo shows a prototype system that enables VR users to operate a full-featured smartphone in VR in the same way they do in reality, without taking the VR headset off. We explain the system design and the strategies adopted to run the system at high resolution and with low latency. We also present a virtual hand rendering method to improve the phone-interaction experience.
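A key piece of such a system is translating an interaction on the virtual phone surface back into touch coordinates on the real device. The sketch below illustrates this mapping under simple assumptions (a planar phone quad and a normalized ray-cast hit point); the names are hypothetical and not taken from the demo.

```typescript
// Hypothetical mapping from a hit point on the virtual phone quad to
// pixel coordinates on the real phone screen.

interface PhoneScreen {
  widthPx: number;    // e.g. 1080
  heightPx: number;   // e.g. 2340
}

// `u` and `v` are the hit point on the virtual phone plane, normalized to [0, 1]
// with (0, 0) at the top-left corner, as produced by the VR ray cast.
function hitToTouch(u: number, v: number, screen: PhoneScreen) {
  const x = Math.round(Math.min(Math.max(u, 0), 1) * (screen.widthPx - 1));
  const y = Math.round(Math.min(Math.max(v, 0), 1) * (screen.heightPx - 1));
  // The resulting (x, y) would then be injected as a touch event on the phone,
  // e.g. over a local network link or USB bridge.
  return { x, y };
}
```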
Virtual Reality (VR) is increasingly used in the classroom, letting students explore experiences from outside the classroom without leaving it. However, the instructor cannot observe a large group of students inside their virtual experiences and is therefore unaware of the students’ activities and engagement, which hinders interaction between the instructor and students. To address this challenge, we present a visualization method that allows the instructor to observe VR users (i.e., students) at scale using Augmented Reality. Specifically, the virtual environment and the students’ gazes are visualized for the instructor and optimized to reduce visual clutter, so that the instructor maintains overall awareness of the entire VR classroom.
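One simple way to reduce visual clutter when many students' gazes are shown at once is to aggregate gaze targets into coarse grid cells and render only per-cell counts. The sketch below is a hypothetical illustration of such aggregation, not the paper's actual visualization method.

```typescript
// Hypothetical aggregation of many students' gaze points into grid cells,
// so the instructor sees per-region counts instead of one marker per student.

type GazePoint = { x: number; y: number }; // gaze hit in the virtual scene, metres

function aggregateGaze(points: GazePoint[], cellSize: number): Map<string, number> {
  const cells = new Map<string, number>();
  for (const p of points) {
    const key = `${Math.floor(p.x / cellSize)},${Math.floor(p.y / cellSize)}`;
    cells.set(key, (cells.get(key) ?? 0) + 1);
  }
  return cells; // each entry: grid cell -> number of students looking there
}
```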
Millions of people around the world suffer from chronic pain. Chronic pain can be managed with drugs or psychological techniques; however, drugs have side effects and psychological treatment is expensive. In our Virtual Reality experience, patients learn how to manage their pain by following voice-over instructions from a clinical psychologist and through interaction with a virtual pain character. The experience mimics one of the sessions of a psychological intervention tailored to chronic pain sufferers. The objective of our work is to develop a self-contained training program that would supplement the clinical psychologist and would be more engaging, scalable, and accessible than traditional therapy.