SA '19: SIGGRAPH Asia 2019 Emerging Technologies

360Drops: Mixed Reality Remote Collaboration using 360 Panoramas within the 3D Scene

Mixed Reality (MR) remote guidance has become a practical solution for collaboration that includes nonverbal communication. This research focuses on integrating different types of MR remote collaboration systems to extend their features and user experience. In this demonstration, we present 360Drops, an MR remote collaboration system that uses 360 panorama images within 3D reconstructed scenes. We introduce a new technique for interacting with multiple 360 Panorama Spheres in an immersive 3D reconstructed scene. This allows a remote user to switch between multiple 360 scenes (live/static, past/present) placed in a 3D reconstructed scene to promote a better understanding of space and interactivity through verbal and nonverbal communication. We present the system features and user experience to the attendees of SIGGRAPH Asia 2019 through a live demonstration.

AlteredWind: Manipulating Perceived Direction of the Wind by Cross-Modal Presentation of Visual, Audio, and Wind Stimuli

We developed AlteredWind, a multisensory wind display system that manipulates users' perception of wind by integrating visual, audio, and wind stimuli. Wind displays are devices that present the sensation of wind and improve the presence and immersion of virtual reality (VR). However, existing wind displays require many wind sources to reproduce different wind directions. We therefore propose a method that manipulates the perceived direction of the wind through cross-modal effects, in order to realize wind displays with fewer wind sources. We present images of flowing particles and three-dimensional (3D) wind sounds together with the wind itself to induce visuo-haptic and audio-haptic cross-modal effects. We produced a VR application that integrates the flowing particles, the 3D sounds, and the wind. Demonstration participants will be able to experience various wind directions from only two wind sources and notice changes in the perceived direction of the wind.
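
As a rough illustration of the cross-modal rendering described above, the following sketch (hypothetical; the function names, fan angles, and the two-source blending rule are our own assumptions, not the authors' implementation) shows how a target wind direction could be decomposed into drive levels for two fixed fans while the visual particle flow and 3D sound are steered to the full target direction.

```python
import math

# Hypothetical fixed fan directions (radians, in the horizontal plane).
FAN_ANGLES = [math.radians(-45.0), math.radians(45.0)]

def render_wind(target_angle_deg, strength=1.0):
    """Blend two wind sources and steer visual/audio cues toward the target direction."""
    target = math.radians(target_angle_deg)
    # Weight each fan by how well it is aligned with the target direction
    # (cosine similarity clipped at zero), then normalize.
    weights = [max(0.0, math.cos(target - a)) for a in FAN_ANGLES]
    total = sum(weights) or 1.0
    fan_levels = [strength * w / total for w in weights]

    # Visual and auditory cues always point exactly at the target direction;
    # the cross-modal effect is expected to pull the perceived wind toward it.
    particle_flow_dir = target_angle_deg   # direction of flowing particles in the HMD
    sound_source_dir = target_angle_deg    # azimuth of the 3D wind sound
    return fan_levels, particle_flow_dir, sound_source_dir

if __name__ == "__main__":
    levels, vis, aud = render_wind(20.0)
    print(f"fan levels: {levels}, particles: {vis} deg, sound: {aud} deg")
```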

Biofeedback Interactive VR System Using Biological Information Measurement HMD

By quantifying a user's emotions during a virtual reality (VR) experience, we can interactively change the content of the VR based on the user's state. Quantifying these emotions requires measuring certain biological indices. Existing technologies for gathering such biological information face challenges such as the burden of the sensors the user must wear and noisy output arising from the user's body movements. In this paper, we present a biological information measuring device that can be easily attached to a head-mounted display (HMD). Because the proposed device is pressed firmly against the face by the HMD, it is expected to be robust to body movements, and the mounting load of the sensors becomes negligible. To attach the sensing device easily to an existing HMD, a pulse wave sensor and a respiration sensor were integrated at the nose: the pulse wave is measured by an optical pulse wave sensor and the respiration by a thermopile. Experiments verified that these measurements are sufficiently accurate to estimate the user's tension and excitement. This research therefore highlights the possibility that emotions can be quantified using only an HMD during a VR experience. Using the proposed device, we also present a biofeedback interactive VR system.
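
To give a concrete sense of how such signals might be turned into simple physiological indices, here is a minimal sketch (our own illustration, not the authors' code; the sampling rates, thresholds, and function name are assumptions) that detects peaks in the optical pulse-wave trace and, with different parameters, in the thermopile respiration trace, and converts the peak intervals into beats or breaths per minute.

```python
import numpy as np

def rate_from_peaks(signal, fs, min_interval_s):
    """Estimate events per minute from local maxima above the signal mean."""
    signal = np.asarray(signal, dtype=float)
    thresh = signal.mean()
    min_gap = int(min_interval_s * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > thresh and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return 0.0
    mean_interval = np.mean(np.diff(peaks)) / fs   # seconds per beat / per breath
    return 60.0 / mean_interval

# Example usage (hypothetical sample streams and rates):
# hr_bpm  = rate_from_peaks(ppg_samples, fs=100, min_interval_s=0.4)        # pulse wave
# resp_pm = rate_from_peaks(thermopile_samples, fs=10, min_interval_s=1.5)  # respiration
```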

Brobdingnagian Glass: A Micro-Stereoscopic Telexistence System

Co-Limbs: An Intuitive Collaborative Control for Wearable Robotic Arms

The promising possibilities offered by supernumerary wearable robotic arms are limited by the lack of an intuitive and robust user interface to control them. Here, utilizing admittance control, we propose a 'Collaborative Limbs' (Co-Limbs) user interface for wearable robot arms. The key feature of this interface is its intuitiveness: even first-time users can immediately move the normally stiff robot arms, use them for assistive tasks, and even teach the robot simple, useful movements. We demonstrate the diverse range of applications enabled by this simple but powerful user interface through example demonstrations in the Passive Assist, Power Assist, and Playback modes.
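
The abstract states that the interface builds on admittance control, i.e. the robot yields to a measured interaction force as if it were a virtual mass-damper. The sketch below (our simplified single-axis illustration; the class name, parameter values, and loop are assumptions, not the authors' code) shows the core update: velocity is integrated from the measured external force, so a normally stiff arm becomes easy to push around.

```python
class AdmittanceController:
    """1-DoF admittance control: M * dv/dt + D * v = F_ext."""

    def __init__(self, virtual_mass=2.0, virtual_damping=8.0):
        self.m = virtual_mass      # kg, how "heavy" the arm feels
        self.d = virtual_damping   # N*s/m, how quickly motion dies out
        self.v = 0.0               # commanded velocity (m/s)

    def update(self, f_ext, dt):
        """f_ext: external force measured at the arm (N); returns a velocity command."""
        dv = (f_ext - self.d * self.v) / self.m
        self.v += dv * dt
        return self.v

# Passive Assist mode: the user pushes the arm and it follows compliantly.
ctrl = AdmittanceController()
for force in [5.0, 5.0, 0.0, -3.0]:   # a short sequence of measured forces
    print(round(ctrl.update(force, dt=0.01), 4))
```

Recording the resulting velocity commands while the user guides the arm, and replaying them later, would correspond to teaching simple movements for the Playback mode mentioned above.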

Demonstration of ThermAirGlove: A Pneumatic Glove for Material Perception in Virtual Reality through Thermal and Force Feedback

Enhancing Suspension Activities in Virtual Reality with Body-Scale Kinesthetic Force Feedbacks

Hanger Drive: Driver Manipulation System for Self-balancing Transporter Using the Hanger Reflex Haptic Illusion

In recent years, self-balancing transporters have become popular for medium-distance transportation such as police patrols and sightseeing tours, and they are expected to gain further prevalence. It is therefore important to develop danger-avoidance and automated driving systems for self-balancing transporters. However, automated control of a self-balancing transporter is challenging because the user balances on the vehicle while riding. In this study, we control the driving direction of a self-balancing transporter indirectly by controlling the motion of the user riding the vehicle. We developed a system that controls the turning direction of a self-balancing transporter using the hanger reflex haptic illusion, and we demonstrate its application.
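
As a rough illustration of the indirect control loop described above (hypothetical; the function name, the deadband, and the simple proportional rule are our assumptions, not the authors' controller), the sketch below chooses which side of the head-worn hanger-reflex device to pressurize, and how strongly, so that the induced rotation steers the rider, and hence the transporter, toward a desired heading.

```python
def hanger_command(desired_heading_deg, current_heading_deg, deadband_deg=5.0):
    """Return which side of the head-worn device to pressurize, and how strongly (0-1)."""
    error = desired_heading_deg - current_heading_deg
    # Wrap the heading error to (-180, 180].
    error = (error + 180.0) % 360.0 - 180.0
    if abs(error) < deadband_deg:
        return "none", 0.0
    side = "left" if error > 0 else "right"   # side assumed to induce the needed rotation
    level = min(1.0, abs(error) / 45.0)       # simple proportional pressure level
    return side, level

print(hanger_command(30.0, 0.0))   # e.g. ('left', 0.67): induce a left turn
```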

Hapballoon: Wearable Haptic Balloon-Based Feedback Device

We developed the "Hapballoon", a novel device worn on the fingertips. The device can present three types of sensations: force, warmth, and vibration. Force sensations are presented when the inflated balloons on individual devices contact one another. The device is easy to attach to the finger and does not obstruct common optical finger tracking methods that track the back of the hand in virtual reality (VR) applications. Each module weighs approximately 6 grams, and the balloons are inflated via an air tube connected to a device on the user's arm. This wearable haptic presentation device may improve the realism of various VR applications.

Haptiple: A Wearable, Modular and Multiple Haptic Feedback System for Embodied Interaction

Humans feel tactile sensations over their whole body. However, many haptic devices can be worn on only a limited set of body parts and hence cannot provide haptic feedback across the whole body. Multiple haptic feedback can enhance the user experience in various applications. We therefore propose Haptiple, a wearable and modular multiple-haptic-feedback system for embodied interaction built on a wireless platform. The system consists of three types of modular haptic devices (vibro-tactile, pressure, and thermal/wind) that can be placed on multiple body parts such as the hand, wrist, ankle, and chest. We demonstrate interactive experiences to show the feasibility of our concept.

IlluminatedFocus: Vision Augmentation using Spatial Defocusing

We propose IlluminatedFocus, augmented reality (AR) glasses that enable depth-independent spatial defocusing of human vision. Our technique spatially manipulates the depth of field by synchronizing a periodic focal sweep of head-worn electrically tunable lenses with fast illumination control by a high-speed projector. In this demonstration, we show a system that switches between focused and defocused views independently for each area of a real 3D scene. We realize various vision augmentation applications based on our method to show its potential to expand the application field of optical see-through AR.
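
A minimal sketch of the synchronization idea follows (our own illustration; the sinusoidal sweep waveform, parameter values, and thin-lens approximation are assumptions, not the authors' implementation): for a region that should appear sharp, the projector lights it only at the instants in the focal sweep when the tunable lens focuses at that region's depth, while regions to be blurred are lit at other instants.

```python
import math

SWEEP_HZ = 60.0           # assumed focal-sweep frequency of the tunable lens
P_MIN, P_MAX = 0.0, 3.0   # assumed optical-power range of the sweep (diopters)

def lens_power(t):
    """Sinusoidal focal sweep: optical power of the tunable lens at time t (s)."""
    mid, amp = (P_MAX + P_MIN) / 2.0, (P_MAX - P_MIN) / 2.0
    return mid + amp * math.sin(2.0 * math.pi * SWEEP_HZ * t)

def illuminate_region(t, region_depth_m, want_sharp, tolerance=0.15):
    """Decide whether the high-speed projector should light this region at time t."""
    required_power = 1.0 / region_depth_m          # thin-lens approximation (diopters)
    in_focus = abs(lens_power(t) - required_power) < tolerance
    return in_focus if want_sharp else not in_focus

# A region at 0.5 m that should appear sharp is lit only near the matching sweep phase.
ts = [k / (SWEEP_HZ * 100.0) for k in range(100)]   # sample one sweep period
lit = [illuminate_region(t, 0.5, want_sharp=True) for t in ts]
print(f"lit for {sum(lit)} of {len(lit)} projector frames in one sweep")
```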

Inclination Manipulator

This study uses the redirected walking technique to make users of virtual reality systems feel as if they are walking on an inclined surface while actually walking on a flat one in real space. However, when the virtual inclination becomes too steep, the absence of the rotational force components that would normally be applied to the body causes a feeling of incongruity. In this paper, we propose a novel technique that effectively manipulates the perceived inclination of a virtual walking surface using haptic cues to complement the rotational force components. Based on this approach, we developed a system that generates the rotational force components by loading a weight on the user's back that shifts when the virtual inclination changes. Experiments conducted with the proposed system verified the feasibility of our approach.

Levitar: Real Space Interaction through Mid-Air CG Avatar

In this paper, we propose a system that uses computer graphics (CG) avatars to re-design human-to-human interaction in real space. Virtual reality (VR) social networks enable users to interact with each other through CG avatars and to choose their appearances freely; however, this is only possible in VR space. We propose a system that brings the avatar from VR space into real space with the help of mid-air imaging technology. The video captured from the mid-air image's position is presented to the user via a head-mounted display. Our technical contribution is the design of a mid-air stereo camera whose gaze direction is synchronized with the user's head movements. A simple mechanism that rotates the mirror and the camera independently enables complex horizontal and vertical gaze control of the mid-air stereo camera.

Licker: A Tongue Robot for Representing Realistic Tongue Motions

We present Licker, a flexible tongue robot capable of mimicking human tongue motion. The aim of this robot is to foster social bonding through licking, regardless of species. We first analyzed human tongue motion and identified four basic motions. Based on this result, we developed an originally designed tongue-motion robot. We also carefully designed the tactile feel of the tongue, such as the softness of the tongue itself and the slimy feel of saliva. Through demonstrations, we confirmed that this robot can present the realistic tactile sensation of being licked.

Light’em: A Multiplexed Lighting System

The main contribution of this study is a multiplexed lighting environment realized with time-division active shutter glasses. Improving indoor environments to achieve desirable conditions is actively studied. Visual stimulation, especially in peripheral vision, affects many factors such as mood, work efficiency, mental stress, and relaxation. However, the influence of peripheral visual stimulation varies among individuals, and it is difficult to control peripheral visual stimuli individually within the same space. In most cases, the visual environments suitable for different people conflict with one another, and desirable visual environments for all occupants cannot be realized at the same time. Therefore, in this study, we propose a lighting environment multiplexing system. The proposed system creates independently controllable lighting environments for multiple occupants in one space, using active shutter technology for visual multiplexing. With this system, completely different lighting environments can be realized simultaneously in the same space, such as "an environment where the whole space is brightly lit with white light" versus "an environment where an incandescent lamp shines softly on only part of the space." This study presents the implementation and evaluation of the proposed system, which can control intensity, color, and distribution in the multiplexed lighting environment.
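
The core of the multiplexing idea can be illustrated with a short time-slot scheduler (a hypothetical sketch; the slot rate, environment settings, and names are our assumptions): the room lights cycle through one lighting environment per slot, and each occupant's shutter glasses open only during the slots carrying that occupant's environment.

```python
# Two desired lighting environments shared in one room (hypothetical settings).
ENVIRONMENTS = [
    {"name": "bright white", "color_temp_k": 6500, "intensity": 1.0},
    {"name": "soft incandescent", "color_temp_k": 2700, "intensity": 0.3},
]
SLOT_HZ = 120  # slots per second; fast enough that each viewer perceives continuous light

def schedule(frame_index):
    """Return the lighting state for this slot and the shutter state per occupant."""
    active = frame_index % len(ENVIRONMENTS)
    lighting = ENVIRONMENTS[active]
    # Occupant i sees only environment i: their glasses are open in that slot.
    shutters_open = [i == active for i in range(len(ENVIRONMENTS))]
    return lighting, shutters_open

for frame in range(4):
    light, shutters = schedule(frame)
    print(frame, light["name"], shutters)
```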

NavigaTorch: Projection-based Robot Control Interface using High-speed Handheld Projector

We propose "NavigaTorch", a projection-based robot control interface that enables the user to operate a robot quickly and intuitively. The user can control and navigate a robot from a third-person viewpoint using the projected video as visual feedback. We achieved flicker-free image feedback and quick response in robot operation by developing a handheld pixel-level visible light communication (PVLC) projector. Our contributions are a robot control interface based on high-speed projection technology and an exploration of the design methodology of projection-based robot control using a handheld projector.

PinpointFly: An Egocentric Position-pointing Drone Interface using Mobile AR

We propose PinpointFly, an egocentric drone interface that allows users to arbitrarily position and rotate a flying drone using position-control interactions in a see-through mobile AR view, in which the position and direction of the drone are visually enhanced with a virtual cast shadow. Unlike traditional speed-control methods, users hold a smartphone and precisely edit the drone's motion and direction by dragging the cast shadow or a slider bar on the touchscreen.
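
A rough sketch of the position-control mapping follows (our illustration; the ray-ground intersection is standard geometry, but the function names, camera model, and values are assumptions, not the authors' implementation): a touch point dragged on the cast shadow is ray-cast onto the ground plane to obtain the drone's new horizontal target, while the slider sets the target altitude directly.

```python
import numpy as np

def touch_to_waypoint(touch_px, cam_pos, cam_intrinsics, cam_rotation, altitude_m):
    """Map a touch on the AR view to a drone waypoint via a ray-ground intersection."""
    u, v = touch_px
    fx, fy, cx, cy = cam_intrinsics
    # Ray direction in camera coordinates, then rotated into world coordinates.
    ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray_world = cam_rotation @ ray_cam
    # Intersect with the ground plane z = 0 (where the cast shadow is drawn).
    t = -cam_pos[2] / ray_world[2]
    ground_hit = cam_pos + t * ray_world
    # Horizontal target from the shadow position, altitude from the slider.
    return np.array([ground_hit[0], ground_hit[1], altitude_m])

# Example: camera 1.5 m above the ground, looking straight down (hypothetical values).
cam_pos = np.array([0.0, 0.0, 1.5])
R_down = np.array([[1.0, 0.0, 0.0],
                   [0.0, -1.0, 0.0],
                   [0.0, 0.0, -1.0]])
print(touch_to_waypoint((400, 300), cam_pos, (500, 500, 320, 240), R_down, altitude_m=1.0))
```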

Polyvision: 4D Space Manipulation through Multiple Projections

Seeing is believing. Our novel virtual reality system, Polyvision, applies this old saying to the fourth dimension. Various shadows of an object in four-dimensional (4D) space are simultaneously projected onto multiple three-dimensional (3D) screens created in a virtual environment to reveal its intricate shape. Understanding of high-dimensional shapes and data can be substantially enhanced when good visualization is complemented by interactive functionality. However, an interface for handling complex 4D transformations in a user-friendly manner has yet to be developed. Using our Polyvision system, the user can manipulate each shadow as if it were a 3D object in their hand. The user's action on each projection is reflected on the original 4D object, and in turn on its projections, in real time. While controlling the object's orientation minutely on one shadow, the user can grasp its global structure from the multiple changing projections. Our system has a wide variety of applications in visualization, education, mathematical research, and entertainment, as we demonstrate with a variety of 4D objects that appear in mathematics and data science.
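
The projection pipeline can be illustrated compactly (a minimal sketch under the assumption of simple orthographic projections; Polyvision's actual projections and interaction mapping may differ): a 4D rotation in one coordinate plane is applied to the object's vertices, and each 3D "shadow" is obtained by dropping one of the four coordinates.

```python
import numpy as np

def rotation_4d(axis_a, axis_b, angle):
    """4x4 rotation acting in the plane spanned by coordinate axes a and b."""
    r = np.eye(4)
    c, s = np.cos(angle), np.sin(angle)
    r[axis_a, axis_a] = c
    r[axis_b, axis_b] = c
    r[axis_a, axis_b] = -s
    r[axis_b, axis_a] = s
    return r

def shadows(vertices_4d, dropped_axes=(0, 1, 2, 3)):
    """Orthographic 3D shadows of a 4D vertex array, one per dropped coordinate."""
    keep = lambda d: [i for i in range(4) if i != d]
    return {d: vertices_4d[:, keep(d)] for d in dropped_axes}

# A tesseract (4D hypercube): all 16 sign combinations of (+-1, +-1, +-1, +-1).
verts = np.array([[(i >> k & 1) * 2 - 1 for k in range(4)] for i in range(16)], float)
rotated = verts @ rotation_4d(0, 3, np.deg2rad(30)).T   # rotate in the x-w plane
for axis, shadow in shadows(rotated).items():
    print(f"shadow dropping axis {axis}: {shadow.shape[0]} vertices in 3D")
```

Rotating the object and recomputing every shadow each frame is what lets a manipulation performed on one projection appear immediately in all the others.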

PortOn: Portable mid-air imaging optical system on glossy materials

PortOn is a portable optical system that forms mid-air images standing on a glossy surface such as a table or the floor. PortOn is composed of micro-mirror array plates (MMAPs), an image light source, a mirror, and polarizing elements. When placed on a flat surface, PortOn projects light so that the mid-air image forms at a position that is easy for a person to see.

Our contribution is a practical optical design that can be easily installed. We designed the arrangement of the MMAPs, mirror, and light source so that a mid-air image is formed simply by placing the device on a flat, glossy surface. The advantage of our method is that it suppresses unwanted stray light and shows the mid-air image clearly by applying viewing-angle control and polarization control to the mid-air imaging system. With this method, computer graphics can easily be displayed in the real world, realizing mixed reality interaction.

ReFriction: Remote friction control on polystyrene foam by ultrasound phased array

We propose a system that can remotely change the perceived friction of a polystyrene foam surface using an Airborne Ultrasound Tactile Display (AUTD). In previous studies, surface friction has mainly been controlled either by ultrasonically vibrating the object itself to reduce friction through the squeeze-film effect, or by static electricity to increase friction. In contrast, our approach changes the friction remotely, so no device is needed on the target surface. This allows the tactile display to be a large screen or disposable, which is useful for intuitive touch panels in applications where hygiene is important, such as during surgery. Changing the surface friction of three-dimensional objects of arbitrary shape is another possible application. In this system, the position of the fingertip is detected by a device (Xperia Touch) equipped with a projector and a fingertip detection sensor, and the ultrasound focus is updated in real time to follow the fingertip position using the AUTD. As a result, the friction felt on the polystyrene foam is altered by the focused ultrasound, and various friction patterns can be presented according to the projected image.
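
The real-time loop can be sketched as follows (a hypothetical illustration; the callbacks are placeholders, not actual sensor or AUTD library calls): each frame, the detected fingertip position is looked up in the projected friction map, and the ultrasound focus position and amplitude are updated accordingly.

```python
def friction_loop(get_fingertip_xy, friction_map, set_ultrasound_focus):
    """One step of the remote-friction loop.

    get_fingertip_xy     -- placeholder callback: fingertip position on the surface (m), or None
    friction_map(x, y)   -- desired friction reduction at that point, 0.0-1.0
                            (e.g. sampled from the projected image)
    set_ultrasound_focus -- placeholder callback: (x, y, amplitude) sent to the AUTD
    """
    pos = get_fingertip_xy()
    if pos is None:                    # no finger on the surface this frame
        set_ultrasound_focus(0.0, 0.0, 0.0)
        return
    x, y = pos
    amplitude = friction_map(x, y)     # stronger focused ultrasound -> lower perceived friction
    set_ultrasound_focus(x, y, amplitude)

# Example with stub callbacks: low friction on the left half of the surface only.
friction_loop(lambda: (0.05, 0.10),
              lambda x, y: 1.0 if x < 0.15 else 0.0,
              lambda x, y, a: print(f"focus at ({x:.2f}, {y:.2f}) m, amplitude {a}"))
```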

Simple is Vest: High-Density Tactile Vest that Realizes Tactile Transfer of Fingers

We developed a high-density tactile vest that presents the haptic sensation of the five fingertips on the back rather than on the fingertips themselves, as a new haptic presentation method for objects in a virtual reality (VR) environment. The device adopts 144 individually actuated eccentric-mass vibration motors and five Peltier elements. These actuators present the contact points on the fingers to the back and abdomen. When the user holds an object with their fingers in a VR scene, the sense of touch is presented over a wide area of the abdomen and back, so the user can accurately comprehend which part of each finger is in contact. Compared with a fingertip-mounted display, this approach avoids the weight and size that hinder free finger movement, while still presenting high-density distributed information. Furthermore, the abdomen and back are usually unused areas in typical VR scenarios.
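
As an illustration of the finger-to-torso mapping (hypothetical; the 12 x 12 grid layout, per-finger bands, and function names are our assumptions, not the device's actual arrangement), the sketch below maps a contact point on a given fingertip to one motor in a 144-element array and sets its vibration intensity from the contact force.

```python
import numpy as np

GRID = (12, 12)                  # assumed layout of the 144 vibration motors
FINGERS = ["thumb", "index", "middle", "ring", "little"]

def motor_for_contact(finger, u, v):
    """Map a contact at normalized fingertip coordinates (u, v in [0, 1]) to a motor index.

    Each finger owns a vertical band of the grid; (u, v) selects a cell inside that band.
    """
    band_w = GRID[1] // len(FINGERS)                 # columns per finger band
    col = FINGERS.index(finger) * band_w + int(u * (band_w - 1))
    row = int(v * (GRID[0] - 1))
    return row * GRID[1] + col                       # motor index, 0..143

def render_contacts(contacts):
    """contacts: list of (finger, u, v, force 0-1) -> per-motor intensity array."""
    intensity = np.zeros(GRID[0] * GRID[1])
    for finger, u, v, force in contacts:
        intensity[motor_for_contact(finger, u, v)] = force
    return intensity

# Holding a virtual object: thumb and index pads in contact.
levels = render_contacts([("thumb", 0.5, 0.8, 0.9), ("index", 0.5, 0.7, 0.6)])
print(np.count_nonzero(levels), "motors active")
```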

StickyTouch: An Adhesion Changeable Surface

We propose StickyTouch, a novel tactile display that represents adhesion information on a surface. Adhesion control is achieved with a temperature-sensitive adhesive sheet whose temperature is locally controlled by Peltier devices arranged in a grid. We implemented a proof-of-concept prototype with a 150 × 150 mm adhesive surface and propose example applications of the display.
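
A minimal per-cell control sketch follows (our illustration; the switching temperature, which side of it is sticky, the gain, and the names are placeholders, not properties of the actual sheet): each Peltier element is driven proportionally toward a setpoint on the sticky or slippery side of the adhesive's switching temperature.

```python
import numpy as np

T_SWITCH = 30.0    # assumed switching temperature of the adhesive sheet (deg C)
T_STICKY, T_SLIPPERY = T_SWITCH - 8.0, T_SWITCH + 8.0   # assumed setpoints on each side
K_P = 0.5          # proportional gain mapping temperature error to Peltier drive

def peltier_drive(target_sticky, cell_temps):
    """Per-cell proportional drive (-1..1) toward the sticky or slippery setpoint.

    target_sticky: boolean grid of cells that should adhere.
    cell_temps:    measured temperature grid of the surface (deg C).
    """
    setpoint = np.where(target_sticky, T_STICKY, T_SLIPPERY)
    drive = K_P * (setpoint - cell_temps)   # sign selects the heating vs. cooling direction
    return np.clip(drive, -1.0, 1.0)

# 4 x 4 proof-of-concept grid: make only the centre cells adhesive.
target = np.zeros((4, 4), dtype=bool)
target[1:3, 1:3] = True
print(peltier_drive(target, cell_temps=np.full((4, 4), 30.0)))
```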

SwarmCloak: Landing of a Swarm of Nano-Quadrotors on Human Arms

We propose SwarmCloak, a novel system for landing a fleet of four flying robots on a person's arms using light-sensitive landing pads with vibrotactile feedback. We developed two types of wearable tactile displays with vibromotors that are activated by the light emitted from the LED array at the bottom of each quadcopter. In a user study, participants were asked to adjust the position of their arms to land up to two drones, with only visual feedback, only tactile feedback, or combined visual-tactile feedback. The experiment revealed that as the number of drones increases, tactile feedback plays a more important role in accurate landing and operator convenience. An important finding is that the best landing performance is achieved with the combination of tactile and visual feedback. The proposed technology could have a strong impact on human-swarm interaction, providing a new level of intuitiveness and engagement in swarm deployment right from the skin surface.
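
The pad-side feedback can be sketched in a few lines (hypothetical; the thresholds, scaling, and function name are our assumptions): each landing pad converts the brightness it senses from the quadcopter's LED array into a vibration intensity, so the arm feels the drone approaching before touchdown.

```python
def vibration_level(sensor_reading, ambient=0.05, saturation=0.8):
    """Map a light-sensor reading (0-1) from the drone's LED array to a motor intensity (0-1)."""
    if sensor_reading <= ambient:          # only ambient light: no drone overhead
        return 0.0
    level = (sensor_reading - ambient) / (saturation - ambient)
    return min(1.0, level)

# As a drone descends toward the pad, the sensed brightness -- and the vibration -- rise.
for reading in (0.02, 0.2, 0.5, 0.9):
    print(reading, "->", round(vibration_level(reading), 2))
```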

Synesthesia Wear: Full-body haptic clothing interface based on two-dimensional signal transmission

In this paper, we present Synesthesia Wear, a full-body, customizable haptic interface, and demonstrate its capabilities with an untethered spatial computing experience. The garment not only looks and feels as flexible as ordinary cloth; its attached modules are also powered and controlled through the conductive textile via two-dimensional signal transmission (2DST) technology. Each haptic module has a pin-badge-like connector and can be freely attached anywhere on the conductive textile (Fig. 1(b)), enabling users to customize the experience to their own liking. The modules can render a variety of haptic feedback to the torso and all limbs of the body, and can visualize their state using LED light patterns. We also designed a spatial computing experience (Fig. 1(c)) inspired by the perceptual phenomenon of synesthesia. The system provides a sensory-blended experience in which the user walks around freely in real space. By wearing Synesthesia Wear, the user can interact with real and virtual environments in a reality-overlapping way.

Three-dimensional Interaction Technique Using an Acoustically Manipulated Balloon

We propose a system that uses an acoustically manipulated balloon as a visual and tangible interface for the representation of a mid-air virtual object in a full-body augmented reality environment. In this system, airborne ultrasound phased-array transducers on the ceiling actuate a spherical balloon inflated with a mixture of helium and air. This configuration permits (1) a full-body workspace with lateral scalability, (2) a long flight time, (3) good visibility, and (4) easy, tangible access to the balloon. A projector-camera system projects a 2D or 3D perspective-correct image onto the balloon. The user can manipulate the corresponding virtual object by physically manipulating the balloon.

TwinCam Go: Proposal of Vehicle-Ride Sensation Sharing with Stereoscopic 3D Visual Perception and Vibro-Vestibular Feedback for Immersive Remote Collaboration

Personal vehicles such as the Segway are actively used for security patrols and supervision of construction sites because of their mobility. In this study, we propose a vehicle-ride sensation sharing system that enables a rider to collaborate remotely with a driver while receiving both 3D visual perception and vibro-vestibular sensation. We developed a prototype vehicle system with two 360° cameras attached to the Segway via a stabilizer to capture stereoscopic 3D images and send them to each eye of a head-mounted display worn by the remotely collaborating rider. We also developed a prototype vibro-vestibular display by modifying a conventional wheelchair with a simple, lightweight mechanism in which two DC motors provide actuation and vibration. In our presentation algorithm, each wheel of the wheelchair is accelerated or decelerated in proportion to the acceleration of the corresponding wheel of the Segway. When the velocity of each Segway wheel is nearly constant and its acceleration is close to zero, the wheelchair slowly returns to its initial position, with motion the rider cannot perceive, so that it can continue to render acceleration and deceleration within a limited space.
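
The presentation algorithm described above is essentially a washout scheme; the sketch below (our simplified single-wheel illustration; the class name, gains, and thresholds are assumptions, not the authors' values) drives the wheelchair wheel in proportion to the Segway wheel's acceleration and, once that acceleration is nearly zero, creeps back toward the start position at a speed assumed to be below the rider's perception threshold.

```python
K_ACCEL = 0.5          # wheelchair velocity per unit Segway wheel acceleration (assumed gain)
ACCEL_EPS = 0.05       # m/s^2: below this, treat the Segway wheel as cruising
WASHOUT_SPEED = 0.02   # m/s: assumed sub-perceptual return speed

class WheelCue:
    """Motion cue for one wheelchair wheel mirroring one Segway wheel."""

    def __init__(self):
        self.position = 0.0    # wheelchair wheel displacement from its initial position (m)

    def update(self, segway_accel, dt):
        if abs(segway_accel) > ACCEL_EPS:
            v = K_ACCEL * segway_accel                 # render acceleration or deceleration
        else:
            # Washout: drift back to the initial position imperceptibly slowly.
            v = -WASHOUT_SPEED if self.position > 0 else WASHOUT_SPEED
            if abs(self.position) < WASHOUT_SPEED * dt:
                v = 0.0
        self.position += v * dt
        return v                                       # velocity command for this wheel

left = WheelCue()
for a in (1.0, 1.0, 0.0, 0.0, 0.0):                    # Segway accelerates, then cruises
    print(round(left.update(a, dt=0.1), 3))
```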