SIGGRAPH '21 Immersive Pavilion: Special Interest Group on Computer Graphics and Interactive Techniques Conference Immersive Pavilion

SESSION: Immersive Storytelling

Mementorium: Designing for playful and interactive learning about gender and sexuality-based marginalization

Mementorium is a heartfelt story about identity and belonging told through a virtual reality (VR) branching narrative. Mementorium’s design builds upon our previous designs and research on queer reorientations to computing and queer approaches to embodied learning in VR. The design is accomplished through the branching narrative, which supports listening to marginalized experiences and emotional co-construction of the story, and through the interaction design, which encourages play and embodied learning and centers authentic practices of belonging and becoming. This paper introduces Mementorium’s narrative and interaction design and research that informed the design. Mementorium is an approximately 30-minute single-player, room-scale interactive VR experience intended for a general audience to learn about gender and sexuality-based marginalization in science and technology.

Once Upon a Sea: A poetic, interactive XR documentary

ONCE UPON A SEA is an immersive virtual reality experience that transports participants to the Dead Sea and provides access to one of the wonders of the world, which has become inaccessible in the past 35 years. The Dead Sea carries a rich history, undeniable healing powers, and an indescribable magnetism that can now be experienced in a virtual format. The experience takes place in photoreal volumetric captures of some of the most significant and beautiful sites in the Dead Sea, highlighting its beauty, diverse inhabitants, and recent demise. In the past 30 years, the Dead Sea has receded dramatically due to human intervention and political neglect. The fresh water that fed the Sea was used for irrigation, and potash evaporation pools left behind a ravaged land riddled with sinkholes. Today, all beaches but one are inaccessible to the public due to these dangerous sinkholes, representing Israel's worst ecological crisis. The destruction is progressing quickly, causing many socio-political battles as well as financial and personal distress to local residents and individuals who have dedicated their lives to the Sea. Israelis, Jordanians, and Palestinians, whose countries all border this body of water, have been affected by the demise of the Dead Sea. If nothing is done, the Dead Sea as we know it will be gone for good. ONCE UPON A SEA is our call to action.

Bystanding: The Feingold Syndrome: Step outside your shoes: exploring the Bystander Effect through Virtual Reality

‘Bystanding: The Feingold Syndrome’ is an immersive interactive VR docufiction exploring the drowning and rescue of Israeli rowing champion Jasmine Feingold. In 2009, Feingold lost consciousness and capsized while rowing in Tel Aviv's Ha'Yarkon River. She stayed submerged for nearly five minutes. During that time, none of the dozens of bystanders on the riverbank took any action to help her, until finally one person did.

Using novel techniques of volumetric capture, photogrammetry, animations, and 360º videos, ‘Bystanding’ recreates the incident and allows participants to embody bystanders’ points of view. Each point of view is represented as a wholly different memory, providing a glimpse into the individual's stream of consciousness.

A New Normal: A story about warmth, hope, and family that explores themes of ‘home’ and ‘belonging’ by combining elements and processes of traditional narrative filmmaking with augmented reality.

As we learn to keep social and familial connections alive amidst the pandemic and rely heavily on digital media, I realize this has been my own ‘normal’ for most of the past decade. After immigrating to the US from India, my ability to travel home and be physically present with my family became limited, particularly after the 2016 elections. Thinking about home and belonging today, my thoughts go to my sister in Copenhagen and my parents in New Delhi. Using Augmented Reality as a storytelling tool for this film allows me to see the evolving definition of ‘home’ through the eyes of my family. My experiments in early 2020 as an artist in the Adobe Aero AR Residency helped me explore the possibilities and limitations of AR as experienced through the Adobe Aero application. Following this, I was interested in seeing how AR could fit into the scope of a narrative project and how it would interact with a traditional animation or video workflow when used by a filmmaker. I discovered many efficiencies during the production process, the most powerful being in-camera, real-time compositing. The process of creating ‘A New Normal’ combines the technical and conceptual workflows of both animation and documentary filmmaking. One scene has been made available to view in AR via the Adobe Aero app on iOS. Shot entirely in my apartment while sheltering in place, A New Normal is a product of remote collaboration between artists based in Copenhagen, Madrid, New Delhi, Mumbai, and Los Angeles. In this short, my everyday digital conversations with family are overlaid with AR versions of my parents and sister in my own environment to co-create a mixed reality experience that, for the time being, is the closest thing I have to the feeling of home.

SESSION: Immersive Interaction - New Modalities

Vermillion: Oil Painting Simulation In Virtual Reality: A new tool for digital artists offering the analog control of traditional painting with the benefits of a virtual environment.

Digital art has come a long way since the first release of MS Paint. And while artists can already work wonders with the tools at their disposal, there has always been a very distinct barrier between traditional mediums and digital ones. Instead of brushes on a canvas, a stylus on a screen is used. Instead of mixing paints of different colors and thicknesses, a color picker and transparency slider are presented. The instinctiveness and versatility of bristles on canvas have been replaced with a selection of discrete tools: draw, blur, smudge, and so on. It is also not possible to make strokes with large gestures, to work on a piece at arm's length the way the old masters did. The typical texture of physical paint is also lost in drawing programs. Vermillion offers a new tool for digital artists, be they novices who always wanted to try their hand at following along with Bob Ross, or veteran Photoshop users who long for more analog control. It simulates the full oil painting experience. A selection of specialist brushes is offered, each with unique characteristics and its own distinct brush pattern. The bristles bend as they are pushed against a surface, soaking up the paint on the palette. The paints mix as they would in the real world, not as RGB colors do on a monitor. Paint thinner can be used to control the thickness and flow of the paint. The paint can be pushed, pulled, blurred, and blended on the canvas, and shows the relief and pattern of your strokes. The resulting piece can be exported and presented as a new digital work of art.
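As a rough illustration of why pigment-style mixing behaves differently from mixing RGB values on a monitor, the minimal Python sketch below contrasts naive linear RGB averaging with a crude subtractive blend in absorbance space. The mixing model and names are illustrative assumptions only, not Vermillion's actual algorithm.

```python
# Illustrative only: contrasts additive RGB averaging with a crude
# subtractive, pigment-style mix. This is NOT Vermillion's published model;
# it just shows why the two approaches give different results.
import numpy as np

def mix_rgb_average(a, b, t=0.5):
    """Linear interpolation in RGB, as a naive color picker would do."""
    return (1 - t) * np.asarray(a, float) + t * np.asarray(b, float)

def mix_pigment_like(a, b, t=0.5):
    """Very rough subtractive mix: blend absorbance instead of reflectance."""
    eps = 1e-4
    absorb_a = -np.log(np.clip(np.asarray(a, float), eps, 1.0))
    absorb_b = -np.log(np.clip(np.asarray(b, float), eps, 1.0))
    return np.exp(-((1 - t) * absorb_a + t * absorb_b))

blue = (0.1, 0.2, 0.8)
yellow = (0.9, 0.8, 0.1)
print(mix_rgb_average(blue, yellow))   # washed-out grey, ~(0.5, 0.5, 0.45)
print(mix_pigment_like(blue, yellow))  # darker, green-leaning result
```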

SecondSight: Demonstrating Cross-Device Augmented Reality

SecondSight is a framework for rapid prototyping of cross-device interfaces that combine an optical see-through Augmented Reality (AR) head-mounted display (HMD) with a smartphone. SecondSight contains code for simulating AR HMDs with different Fields of View (FOV), providing gesture and head pointing input, and allowing virtual content to be placed in different coordinate frames. Overall, this gives AR researchers flexibility to explore different types of cross-device interfaces and interaction techniques with different types of display, input, and content placement.
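To make the kind of prototyping flexibility described above concrete, here is a hypothetical Python sketch of a study configuration such a framework could express: a simulated field of view, an input mode, and the coordinate frame in which content is placed. None of these names come from SecondSight's actual code, and the FOV figures are only rough stand-ins.

```python
# Hypothetical configuration sketch for a cross-device AR prototyping
# framework; names and values are assumptions, not SecondSight's API.
from dataclasses import dataclass
from enum import Enum, auto

class CoordinateFrame(Enum):
    WORLD = auto()   # anchored to the room
    HEAD = auto()    # follows the HMD
    PHONE = auto()   # attached to the handheld smartphone

class InputMode(Enum):
    HAND_GESTURE = auto()
    HEAD_POINTING = auto()
    PHONE_TOUCH = auto()

@dataclass
class SimulatedHMD:
    horizontal_fov_deg: float  # narrow to mimic current optical see-through HMDs
    vertical_fov_deg: float

@dataclass
class CrossDeviceCondition:
    hmd: SimulatedHMD
    input_mode: InputMode
    content_frame: CoordinateFrame

# Example condition: narrow-FOV HMD, head pointing, content pinned to the phone.
condition = CrossDeviceCondition(
    hmd=SimulatedHMD(horizontal_fov_deg=43.0, vertical_fov_deg=29.0),
    input_mode=InputMode.HEAD_POINTING,
    content_frame=CoordinateFrame.PHONE,
)
```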

Figmin XR: AR content creation platform

Figmin XR is a spatial computing application that allows non-technical users to easily create, collect & play with digital content.

Figmin XR is a multi-purpose app for the consumer AR market. It runs on Magic Leap 1, HoloLens 2, and Nreal Light.

See it in action:

https://youtu.be/0z3P21WLNiU

Home Studio: DIY Interior Design in Mixed Reality

Virtual staging of real estate listings increases the appeal of a property by letting prospective buyers envision a living space remotely. However, existing tools employed to stage homes limit the scale of the visualization to a set of fixed images provided by customers or require 3D artist expertise to reconstruct the space. The adoption of 3D Matterport scans has accelerated due to the Covid-19 pandemic as a means to enable virtual tours and adhere to social distancing guidelines. We present Home Studio, a virtual staging tool that empowers non-experts, letting them furnish any Matterport scene and create photo-realistic renders in a matter of minutes. Our tool lets customers dive into their designs using a virtual reality headset to assess the final product in an immersive experience.

SESSION: Immersive Medicine

BodyMap: Medical Virtual Reality Education and Simulation

BodyMap is a medically accurate representation of the human body that can be manipulated in 3D virtual reality. Users may interact with the virtual body in numerous ways, including walking into the virtual body for a detailed inspection of internal organs, grabbing out anatomy structures for a closer look, and simulating instrument insertion techniques with instant haptic feedback. This immersive learning approach enables users to transfer their knowledge gained from anatomy textbooks and cadaver dissection lessons into an immersive environment and enhance it by continuous repetition. This way of learning increases confidence in students’ own abilities and knowledge.

Walking Balance Assessment with Eye-tracking and Spatial Data Visualization

Virtual Reality (VR) based assessment systems can simulate diverse real-life scenarios and help clinicians assess participants’ performance under controlled functional contexts. Our previous work demonstrated an assessment paradigm that provides multi-sensory stimuli and cognitive load, and quantifies walking balance with obstacle negotiation using motion capture and pressure sensing. However, two gaps must be filled to make it more clinically relevant: (1) it required complex offline data processing with external statistical analysis software, and (2) it utilized motion tracking but overlooked eye movement. Therefore, we present a novel walking balance assessment system with eye tracking, to investigate the role of eye movement in walking balance, and spatial data visualization, to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay of the gaze, head, and feet, as well as data plots for the outcome measures. The system fills a need to provide eye tracking and intuitive feedback in VR to experimenters, clinicians, and participants in real time.
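As an illustration of what an in-situ replay of gaze, head, and foot trajectories requires, here is a minimal Python sketch of time-stamped sample logging with nearest-sample lookup for scrubbing a replay timeline. The field names are hypothetical and not taken from the authors' system.

```python
# Hypothetical logging structure for replaying gaze, head, and foot motion
# in VR; field names are illustrative assumptions, not the authors' code.
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class WalkingSample:
    t: float              # seconds since trial start
    head_position: Vec3
    head_forward: Vec3    # head pointing direction
    gaze_direction: Vec3  # from the eye tracker, in world space
    left_foot: Vec3
    right_foot: Vec3

@dataclass
class TrialLog:
    samples: List[WalkingSample] = field(default_factory=list)

    def add(self, sample: WalkingSample) -> None:
        self.samples.append(sample)

    def at_time(self, t: float) -> WalkingSample:
        """Nearest-sample lookup used when scrubbing the replay timeline."""
        return min(self.samples, key=lambda s: abs(s.t - t))
```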

Covid-19 - VR Strikes Back: innovative medical VR training

In this work, we present “Covid-19 VR Strikes Back” (CVRSB), a novel Virtual Reality (VR) medical training application focusing on a faster and more efficient teaching experience for medical personnel regarding the nasopharyngeal swab and the proper donning and doffing of Personal Protective Equipment (PPE). Our platform incorporates a diversity of innovations: a) techniques to avoid the uncanny valley in human representation and interactivity in VR simulations, b) exploitation of the capabilities of a Geometric Algebra interpolation engine, and c) a supervised machine learning analytics module for real-time recommendations. Our application is publicly available at no cost for most Head-Mounted Displays (HMDs) and Desktop VR. The impact and effectiveness of our application are demonstrated by recent clinical trials.

Panjam - Reimagining Music Learning to Support Healthy Aging and Wellbeing

In this paper, we present Panjam - a prototype VR application that reimagines music learning for seniors to support healthy aging and wellbeing. Panjam uses the intuitive interface of the steelpan paired with immersion, presence, multi-sensory feedback, and gamification of the VR system to induce active engagement, self-driven practice and learning.

SESSION: Real-Time Immersive

Garage: GPU particle-based AR content for futuristic experiences

In futuristic AR, we believe that both the real and virtual objects surrounding the player should be interactive and controllable. With this in mind, we introduce “Garage”, an AR project built on our particle-based system, which is capable of outputting a variety of interactive content. In our system, everything in the player’s surrounding environment is represented as particles and simulated in real time on a desktop PC equipped with a discrete GPU, then rendered to an in-house HMD, while color and depth information is captured and sent by an RGB camera and a LiDAR sensor. LiDAR data is converted to particles using an algorithm that calculates world-coordinate positions from depth values. Our system’s content and the experiences it provides have time-flexibility and spatial-control features that cannot be achieved in conventional polygon-mesh-based systems. Additionally, the AR information output is presented in an appealing visual style that conserves GPU resources. Because of these features, Garage has a profound impact on AR immersion and can augment our perception of the world in a limitless and creative manner. To demonstrate the abilities of our system and build new interactive AR experiences, we designed a game and created several demos that give players interesting glimpses into the future world of augmented reality.
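A common way to convert a LiDAR depth image into world-space particle positions, though not necessarily Garage's exact GPU implementation, is to back-project each pixel through the camera intrinsics and then apply the camera pose. The plain NumPy sketch below assumes pinhole intrinsics and a 4x4 pose matrix; all parameter names are illustrative.

```python
# Standard depth-to-world unprojection shown as a CPU/NumPy sketch.
# Garage performs this kind of step on the GPU; intrinsics and pose
# parameters here are assumptions for illustration.
import numpy as np

def depth_to_world_points(depth, fx, fy, cx, cy, cam_to_world):
    """depth: HxW metric depth image; cam_to_world: 4x4 camera pose matrix."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(float)
    x = (u - cx) * z / fx            # back-project through the pinhole model
    y = (v - cy) * z / fy
    points_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1)   # HxWx4
    points_world = points_cam.reshape(-1, 4) @ cam_to_world.T    # apply pose
    return points_world[:, :3]       # one candidate particle per pixel
```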

Welcome to the Other Side: How a social VR concert production broke the world record for audience size across all media

Designing for Storyliving Experiences: Tinker, A Case Study

Tinker defines a new genre of storytelling called storyliving. Through interactive and embodied immersive experiences, participants can connect with a narrative in a more meaningful way. In this live, bespoke virtual reality theatre experience, the participant becomes the main character of the story. Based upon the true story of Director Lou Ward and his relationship with his grandfather, who developed Alzheimer's disease, Tinker places the participant in the role of the grandchild so that Ward's story, in turn, becomes their own. The participant creates new memories alongside their grandfather in his workshop, and their connection to the grandfather and the story grows stronger over time. Tinker is a cornerstone of compelling storyliving experiences, and it is because of how it is designed that participants feel connected to the story.

SESSION: Immersive Gaming

VR SuperGun: Interfacing 1980s Arcade Hardware with Online Virtual Reality

VR SuperGun is a custom hardware and software prototype allowing play of original arcade platforms through a network connection, reconstituting the material form of the arcade cabinet in digital space. It extends the format of the SuperGun, a device that contains the wiring of an arcade cabinet in consolised form.

Agence, A Dynamic Film about (and with) Artificial Intelligence

Agence is a short “dynamic film” that uses AI to power a real-time story. It was co-produced by Transitional Forms and the National Film Board of Canada (NFB). It is available on VR, PC, and mobile, but for the purposes of this paper we will be talking about the VR version, since it most closely matches the director’s vision. The film is directed by Pietro Gagliano, whose work on interactive stories has spanned many years and technologies. A few years ago he started Transitional Forms to combine real-time storytelling with artificial intelligence. The intention behind that process is twofold: first, we believe that entertainment will soon be driven by AI; second, artificial intelligence is poised to be humanity’s greatest tool, and stories might be the best way to make sense of it. To this end, we believe that Agence is an innovative production with bold strides in immersion, interactivity, and technology. The approaches taken in this film are novel and unique in their propositions, and may open the door to many new projects that build upon them.

HoloFight: An Augmented Reality Fighting Game

Augmented Reality (AR) provides opportunities to create exciting new kinds of digital entertainment, such as watching movies on a large virtual screen or playing games that interact with a real physical room. While a number of AR games have been built, many do not build on the control innovations found in modern console, PC, and mobile gaming [Von Itzstein et al. 2019]. To explore the space of immersive multiplayer experiences with support for control innovations found in common non-immersive video games, we present HoloFight, a multiplayer fighting game using two or more Microsoft HoloLens 2s, two or more Xbox controllers, and the various natural user interfaces supported by the Microsoft HoloLens 2.

SESSION: Immersive Gaming

Sculpture Experience: VR discovery tour of 6 sculpture masterpieces, from prehistoric to modern times

This immersive and interactive experience for Oculus Quest 2 allows the visitor to observe, at real size, six iconic masterpieces of sculpture from five periods of history, following an imaginary and poetic journey enriched by numerous media.

Speak to Awaken EP.1 Diving into Siraya: An Endangered Language Speech Interactive VR Documentary

Speak to Awaken: Ep.1 Diving into Siraya is an experimental interactive VR documentary that aims to arouse interest in the endangerment of the Siraya language. Experiencers can speak the endangered Siraya language to engage with its reconstructed abstract world. They can also absorb facts about the revitalization process through the perspectives of key figures and through graphics. High-resolution VR360 videos, volumetric captures, and a voice donation website are adopted as a new approach to the preservation of an endangered language and its culture.

Exposition of Music: VR Exhibition

Nam June Paik's first solo exhibition in 1963, titled "Exposition of Music - Electronic Television", marked the beginning of his career, and he is often referred to as the father of video art. In this project, we attempted to virtually recreate some of his exhibits and exhibition spaces that are no longer available, in order to elucidate their meaning. Among the exhibits, the three most interactive and meaningful works, namely Prepared Pianos (4 units), Electronic Television (13 units), and Random Access, were recreated in virtual reality (VR) space. To overcome the limitations of the simple interactions found in existing VR exhibitions of artworks, this project was developed with the utmost focus on preserving the artistic value of the original works while maximizing both the exhibition space reproduced in VR and audience immersion in the work.

Orders of Magnitude: An immersive educational virtual reality application about the Universe

Orders of Magnitude is an immersive educational virtual reality application. Each player can explore all the known scales in our universe by zooming in or out. This scientifically accurate experience contains elements such as galaxies, stars, the solar system, Earth, the human brain, neurons, DNA, atoms, and subatomic particles. The majority of the content is taken from various scientific databases and is represented, wherever possible, as it exists in nature. Artistic impressions are used to depict elements for which little or no information about their visual appearance is available.
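One simple way to let a player zoom smoothly across the universe's scales, offered only as an illustrative sketch and not the authors' implementation, is to map a zoom parameter logarithmically so that equal input crosses equal numbers of orders of magnitude. The scale bounds below are assumed for the example.

```python
# Illustrative sketch (not the authors' code): a continuous "zoom" value
# mapped exponentially so that equal zoom motion crosses equal numbers of
# orders of magnitude, from subatomic to cosmic scales.
import math

SMALLEST_M = 1e-18  # assumed lower bound, roughly subatomic scales
LARGEST_M = 1e26    # assumed upper bound, roughly the observable universe

def zoom_to_scale(zoom: float) -> float:
    """Map zoom in [0, 1] to a viewing scale in metres, log-uniformly."""
    log_lo, log_hi = math.log10(SMALLEST_M), math.log10(LARGEST_M)
    return 10 ** (log_lo + zoom * (log_hi - log_lo))

# With these bounds, each 1/44th of the zoom range crosses one order of magnitude.
for z in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"zoom={z:.2f} -> {zoom_to_scale(z):.3e} m")
```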