SIGGRAPH '20: ACM SIGGRAPH 2020 Emerging Technologies


SECTION: Session: Beyond the Hands: Advances in Haptics

Super Haptoclone: Upper-Body Mutual Telexistence System with Haptic Feedback

We demonstrate a mutual telepresence system with a workspace that can cover the whole upper body. A pair of micromirror arrays produces a high-fidelity light field of the other party, and the clothing worn by the users provides haptic feedback during the interaction. In the demonstration system, we use a technique that broadens the viewing angle of the 3D optical images by introducing a symmetric mirror structure, so that users can see both their own body and the partner’s mid-air image simultaneously. The tactile sensation is presented by a jacket-type haptic device in synchronization with contact with the optical image.

Feel it: Using Proprioceptive and Haptic Feedback for Interaction with Virtual Embodiment

Virtual embodiment has become popular for enhancing virtual interaction by conveying object information. A user can control a character or object in a virtual environment to create an immersive interactive experience. However, one limitation of such virtual interactions has been the inability to receive feedback beyond visual hints. In this demonstration, we present the use of a servo motor and Galvanic Vestibular Stimulation to provide feedback from a virtual interaction. Our technique transforms information about the virtual objects (e.g., weight) into haptic and proprioceptive feedback that evokes different sensations in the user. We present the user experience to the attendees of SIGGRAPH 2020 through a live demonstration in a virtual environment controlled using a virtual robotic arm.

TorsionCrowds: Multi-Points Twist Stimulation Display for Large Part of the Body

Humans perceive mechanical phenomena on the skin from the distribution of stimuli. In this paper, we propose TorsionCrowds, a novel tactile display that presents a force-stimulus distribution over a large area of the body. The system consists of multi-channel twisting skin-deformation modules and passive mechanisms that fit the curve of the human body. By arranging the stimulating elements based on perceptual experiments, the user can sharply perceive a continuous force distribution over a large area of the body with a high dynamic range of intensity. In the demonstration, the user can experience force distributions from the system suited to each scenario, such as visual haptization, self-motion presentation, and proprioception substitution.

HARVEST: High-Density Tactile Vest that Represents Fingers to Back

Human tactile perception deteriorates with age; in particular, the two-point discrimination threshold of the fingertip degrades significantly. We developed a device that transfers the tactile sensation of the fingertip to the back using a tactile glove and a tactile vest, restoring the fingertip’s original tactile perception ability at the back. In contrast to our previous report, in which 100 eccentric-type vibrators were used for a VR environment, we employ voice-coil-type vibrators, which have a much faster response, and a sensing glove for use in a real environment. The pressure distribution and force applied to the thumb and index finger are measured at 2 mm intervals and presented on the back using 144 independently driven vibrators.
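
As an illustration of the sensing-to-display mapping described above, the following sketch resamples a fingertip pressure map sampled at 2 mm intervals onto a 12 x 12 vibrator grid (144 actuators). The grid sizes, array layout, and normalization are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def pressure_to_vibrators(pressure, out_shape=(12, 12), max_amp=1.0):
        """Resample a fingertip pressure map onto a coarse vibrator grid.

        pressure  : 2-D array of pressure samples (e.g., taken at 2 mm pitch).
        out_shape : assumed 12 x 12 layout of the 144 back-mounted vibrators.
        Returns drive amplitudes in [0, max_amp].
        """
        rows = np.array_split(np.arange(pressure.shape[0]), out_shape[0])
        cols = np.array_split(np.arange(pressure.shape[1]), out_shape[1])
        amps = np.empty(out_shape)
        for i, r in enumerate(rows):
            for j, c in enumerate(cols):
                amps[i, j] = pressure[np.ix_(r, c)].mean()  # block average
        peak = amps.max()
        return max_amp * amps / peak if peak > 0 else amps

    # Example: a synthetic 24 x 24 pressure map with a single contact region.
    pressure = np.zeros((24, 24))
    pressure[10:14, 10:14] = 1.0
    print(pressure_to_vibrators(pressure).shape)  # (12, 12)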

SECTION: Session: Seeing is Believing: XR Displays, Holograms, and Dyes

ThinVR: VR displays with wide FOV in a compact form factor

We demonstrate ThinVR as a new approach that simultaneously addresses the bulk and limited FOV of today’s head-worn VR displays. ThinVR enables a VR display to provide a 180-degree horizontal FOV in a thin, compact form factor. Our approach is to replace traditional large optics with a curved microlens array of custom-designed heterogeneous lenslets placed in front of a curved display. Custom-designed heterogeneous optics were crucial to making this approach work, since over a wide FOV many lenslets are viewed off the central axis. We show the viability of the ThinVR approach through two demonstrations, using both dynamic and static displays.

TeslaMirror: Multistimulus Encounter-Type Haptic Display for Shape and Texture Rendering in VR

We propose a novel concept for a hybrid tactile display with multistimulus feedback, allowing a real-time experience of the position, shape, and texture of a virtual object. TeslaMirror consists of a gross shape display based on a 6-DOF robot, a fine shape display based on an array of inverted five-bar linkage mechanisms, and a texture display with electrotactile and vibrotactile devices. The proposed technology delivers high-fidelity shape and texture feedback to the user without any wearable haptic device, allowing intuitive and realistic haptic interaction over a large area without restricting or disturbing motion.

A preliminary user study was conducted to evaluate the ability of TeslaMirror to reproduce shape perception with the under-actuated multi-contact display. The results indicate that this approach can potentially be used in VR systems to render versatile virtual shapes with a high-fidelity haptic experience.

Photo-Chromeleon: Re-Programmable Multi-Color Textures Using Photochromic Dyes

In most real-world situations, once an object is fabricated, its color is permanent and cannot be changed. For the Emerging Technologies exhibit, we will present a technology for creating re-programmable multi-color textures made from a single material. This technology is based on photochromic dyes that switch their appearance from transparent to colored when exposed to light of a certain wavelength. By mixing cyan, magenta, and yellow (CMY) photochromic dyes into a single solution and leveraging the different absorption spectra of each dye, we can control each color channel in the solution separately. Our approach can transform single-material fabrication techniques, such as coating, into high-resolution multi-color processes. In addition, since the color-changing process is fully reversible, users can recolor objects multiple times.
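
To give a feel for how the different absorption spectra allow each channel to be addressed, here is a minimal sketch under a hypothetical linear bleaching model: each dye desaturates in proportion to how strongly it absorbs each light source, so the exposures needed to reach a target color can be estimated with a least-squares solve. The absorption matrix and rate constant are made-up illustrative numbers, not values from this work.

    import numpy as np

    # Rows: light sources (red, green, blue); columns: dyes (C, M, Y).
    # Hypothetical relative absorption values for illustration only.
    absorption = np.array([
        [0.9, 0.1, 0.0],   # red light is absorbed mostly by the cyan dye
        [0.1, 0.8, 0.1],   # green light mostly by the magenta dye
        [0.0, 0.1, 0.9],   # blue light mostly by the yellow dye
    ])

    def exposure_times(current, target, rate=0.05):
        """Estimate per-light exposure times (arbitrary units) that move the
        dye saturations from `current` toward `target`, assuming dye j loses
        rate * sum_i absorption[i, j] * t_i of saturation (linear model)."""
        delta = np.asarray(current) - np.asarray(target)   # saturation to remove
        t, *_ = np.linalg.lstsq(rate * absorption.T, delta, rcond=None)
        return np.clip(t, 0.0, None)  # negative exposure times are not physical

    # Example: start fully saturated, aim for a reddish color (little cyan).
    print(exposure_times(current=[1.0, 1.0, 1.0], target=[0.2, 0.9, 0.8]))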

Neural Holography

Holographic displays promise unprecedented capabilities for direct-view displays as well as virtual and augmented reality (VR/AR) applications. However, one of the biggest challenges for computer-generated holography (CGH) is the fundamental tradeoff between algorithm runtime and achieved image quality, which has prevented high-quality holographic image synthesis at fast speeds. Moreover, the image quality achieved by most holographic displays is low, owing to the mismatch between the physical light transport of the display and its simulated model. Here, we develop an algorithmic CGH framework that achieves unprecedented image fidelity and real-time framerates. Our framework comprises several parts, including a novel camera-in-the-loop optimization strategy that allows us to either optimize a hologram directly or train an interpretable model of the physical light transport, and a neural network architecture that represents the first CGH algorithm capable of generating full-color holographic images at 1080p resolution in real time.
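
As a rough illustration of what "optimizing a hologram directly" means, the sketch below runs gradient descent on an SLM phase pattern against a target image, using an idealized Fourier (far-field) propagation model as a stand-in for the display's real light transport; the camera-in-the-loop capture and the neural network are not reproduced here, and none of the parameters come from the paper.

    import torch

    def optimize_phase(target, steps=200, lr=0.1):
        """Gradient-descent phase retrieval: find a phase pattern whose
        far-field intensity (via FFT propagation) matches `target`."""
        phase = torch.zeros_like(target, requires_grad=True)
        opt = torch.optim.Adam([phase], lr=lr)
        for _ in range(steps):
            field = torch.exp(1j * phase)                      # unit-amplitude SLM field
            recon = torch.fft.fftshift(torch.fft.fft2(field))  # idealized propagation
            intensity = recon.abs() ** 2
            intensity = intensity / intensity.mean()           # remove the free scale factor
            loss = torch.mean((intensity - target) ** 2)
            opt.zero_grad()
            loss.backward()
            opt.step()
        return phase.detach()

    # Example: optimize a 128 x 128 phase hologram for a bright square target.
    target = torch.zeros(128, 128)
    target[48:80, 48:80] = 1.0
    phase = optimize_phase(target / target.mean())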

SECTION: Session: Enhancing Gameplay Through Haptic Feedback

ViBaR: VR Platform Using Kinesthetic Illusions to Enhance Movement Experience

We propose a virtual reality (VR) experience platform based on kinesthetic illusions. While presenting a sensation of movement is important for enhancing immersion and presence in VR experiences, doing so typically requires large devices and complicated systems. We found that a kinesthetic illusion can create the sense that a held bar is moving, and that a whole-body movement illusion is also induced together with the illusion of the bar. We implemented a VR experience platform using these phenomena that does not require physical movement. The platform is suitable for home use, since it does not require large-scale or complicated devices, and it can provide an experience without restricting the body.

The Tight Game: Implicit Force Intervention in Inter-personal Physical Interactions on Playing Tug of War

Physical assistance can compensate for individual differences in ability between players to create well-balanced interpersonal physical games. However, ‘explicit’ intervention can ruin the players’ sense of agency and cause a loss of engagement for both players and audience. We propose an implicit physical intervention system, "The Tight Game," for tug of war, a one-dimensional physical game. Our system includes four force sensors connected to the rope and two hidden high-torque motors, which provide real-time physical assistance. We designed the implicit physical assistance by leveraging how humans recognize external forces during physical actions. In The Tight Game, a pair of players engage in a tug of war and believe that they are participating in a well-balanced, tight game. In reality, however, an external system or person mediates the game, performing physical interventions without the players noticing.
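
As a minimal sketch of the kind of control loop such a system might run, the code below reads the rope tension on each side, estimates the imbalance, and commands a small assistive force capped below a perceptual-detection threshold. The gains, threshold, and sign conventions are hypothetical; the abstract does not specify this controller.

    def assist_command(force_left, force_right, gain=0.3, max_assist=50.0):
        """Return a motor force (N) that nudges the rope toward the weaker side.

        force_left / force_right : tensions from the rope force sensors (N).
        gain        : fraction of the imbalance to cancel (hypothetical).
        max_assist  : cap chosen so the intervention remains unnoticed
                      (hypothetical perceptual threshold).
        Positive output pulls toward the right player, negative toward the left.
        """
        imbalance = force_right - force_left      # > 0 means the right side is winning
        assist = -gain * imbalance                # pull back toward the weaker side
        return max(-max_assist, min(max_assist, assist))

    # One control tick: the right player pulls 300 N, the left player 220 N.
    print(assist_command(220.0, 300.0))   # -24.0 -> add 24 N toward the left side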

Hopping-Pong: Computational Curveball in Table Tennis by Noncontact Ultrasound Force

Augmented sports is the attempt to enhance sports as entertainment and to bridge the skill gap between players using computer technologies. One proposed augmentation method is physical interference with play, such as changing ball trajectories. Previously, we proposed changing a ping-pong ball’s trajectory with ultrasound force [Morisaki et al. 2019]. In this paper, we propose an interactive table tennis system based on this ultrasound force technique. In our system, players can shoot a curveball at any chosen moment using noncontact ultrasound force. The ultrasound-induced curveball can cause an opponent to swing and miss. Even if the trajectory change is slight, a missed shot can occur when the ball hits an unintended position on the racket. This system enhances table tennis as entertainment and supports beginners.

Actuated Club: Modification of Golf-Club Posture with force feedback and motion prediction in VR environment

In golf, the trajectory and posture of the club during the swing influence the direction and trajectory of the hit ball. While an instructor or a system can advise on how to improve the swing, it is difficult to correct the motion during the swing itself because of the limited amount of information, the time needed to interpret the presented information, and a reduced sense of agency in the swing motion. Therefore, in this study, we modify the swing motion during the swing itself, using trajectory prediction of the golf swing and an ungrounded force display. To cope with the delay between the modification command and the swing correction, the system learns the user’s golf-swing trajectory in advance and predicts it. If the prediction suggests that the club will deviate from the ideal trajectory or posture, the gyroscope presents torque toward the ideal trajectory to correct the trajectory or posture. By practicing VR golf with the proposed system, the user can always practice the golf swing along the ideal trajectory, making swing practice more efficient.
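
The predict-then-correct step can be sketched as follows: compare the predicted club posture against an ideal reference at the same phase of the swing and command a proportional corrective torque from the gyroscopic display. The deadband, gain, and torque limit below are placeholder values, not the study's parameters, and the prediction itself is assumed to come from the pre-trained swing model.

    import numpy as np

    def corrective_torque(predicted_angles, ideal_angles,
                          deadband_deg=2.0, gain=0.05, max_torque=1.5):
        """Proportional correction of the predicted club posture (Euler angles, deg).

        Returns a per-axis torque command (N*m) for the gyroscopic force display.
        All constants are hypothetical placeholders.
        """
        error = np.asarray(ideal_angles) - np.asarray(predicted_angles)
        error[np.abs(error) < deadband_deg] = 0.0     # ignore small deviations
        return np.clip(gain * error, -max_torque, max_torque)

    # Example: the prediction says the club face will be 6 degrees open at impact.
    print(corrective_torque(predicted_angles=[6.0, 0.0, 1.0],
                            ideal_angles=[0.0, 0.0, 0.0]))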

SECTION: Session: Mapping the Future: New Techniques in Projection Technology

Realistic Dynamic Projection Mapping Using Real-Time Ray Tracing

We propose introducing modern rendering techniques, such as path tracing, to realistic dynamic projection mapping. Path-traced rendering requires significant time, but dynamic projection mapping requires high-frame-rate rendering, e.g., 500-1000 fps. To overcome this gap, our method leverages the concept of persistence of vision. We find that noisy path-traced images can be integrated perceptually, even in dynamic scenes, by projecting them at 947 fps. The reproduced images can be consistent with the physical motion and observer viewpoint at low latency while being highly realistic.
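
The persistence-of-vision argument can be checked numerically: averaging N independent low-sample path-traced frames within the eye's integration window reduces the noise standard deviation by roughly sqrt(N). The sketch below simulates this with synthetic Monte Carlo noise; the 947 fps figure comes from the abstract, while the integration window and noise level are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    ground_truth = rng.uniform(0.2, 0.8, size=(64, 64))   # stand-in radiance image
    fps = 947                                              # projection rate from the abstract
    integration_window = 0.05                              # ~50 ms of visual persistence (assumed)
    n_frames = int(fps * integration_window)               # frames the eye integrates

    def noisy_frame(truth, noise_std=0.3):
        """One low-sample 'path-traced' frame: truth plus zero-mean noise."""
        return truth + rng.normal(0.0, noise_std, size=truth.shape)

    single = noisy_frame(ground_truth)
    integrated = np.mean([noisy_frame(ground_truth) for _ in range(n_frames)], axis=0)

    print("frames integrated:", n_frames)
    print("RMSE, single frame:", np.sqrt(np.mean((single - ground_truth) ** 2)))
    print("RMSE, integrated  :", np.sqrt(np.mean((integrated - ground_truth) ** 2)))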

A Method for Appropriate Occlusion between a Mid-air 3DCG Object and a Hand by Projecting an Image on the Hand

High-Speed Focal Tracking Projection Based on Liquid Lens

We demonstrate a high-speed projection system with dynamic focal tracking based on a variable-focus liquid lens. Traditional projection has been limited to 2D surfaces because of its narrow depth-of-field range. The proposed system includes a high-speed variable-focus lens, a high-speed camera, and a high-speed projector, so that depth information can be detected and used as feedback to correct the focal length and update the projected content at high speed. As a result, the content remains well focused even when projected onto a dynamically moving 3D object.
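
A simplified version of the focus-feedback step: by the thin-lens relation 1/f = 1/d_object + 1/d_image, each depth measurement maps directly to the focal power the lens must adopt. The image distance and lens-command scaling below are assumptions for illustration, not the demonstrated system's parameters.

    def lens_power_for_depth(depth_m, image_dist_m=0.05):
        """Thin-lens focal power (diopters) that focuses a plane at `depth_m`.

        depth_m      : measured distance from the projection lens to the surface.
        image_dist_m : fixed lens-to-image-plane distance inside the projector
                       (hypothetical value).
        """
        return 1.0 / depth_m + 1.0 / image_dist_m

    # Feedback ticks: the high-speed camera reports the target at 1.2 m, then 0.8 m.
    for depth in (1.2, 0.8):
        print(f"depth {depth:.1f} m -> {lens_power_for_depth(depth):.2f} D")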

Interactive Stickies: Low-latency projection mapping for dynamic interaction with projected images on a movable surface

SECTION: Session: Advances in Image Rendering and Display

ViPlate: Suppressing Mid-Air Image Degradation by Vibrating a Retro-Transmissive Plate

We propose an imaging system, “ViPlate,” for suppressing the degradation of mid-air images. A retro-transmissive plate (RT plate) can form a mid-air image that multiple users can see simultaneously with the naked eye. However, an actual RT plate causes certain image degradations, for instance, visible lattice patterns of the RT plate, discontinuous distortions, and blurring. To improve the viewing experience, it is desirable to improve the image quality. ViPlate smooths the discontinuous distortions and makes the lattice pattern invisible by vibrating the RT plate, so that mid-air images become clear and easily visible.

Simulfocus imaging: quasi-simultaneous multi-focus imaging using Lock-in Pixel imager and TAG lens

This installation demonstrates a new imaging method to capture multi-focus images quasi-simultaneously (simulfocus imaging) by synchronous control of a four-tap lock-in pixel imager and a tunable acoustic gradient index (TAG) lens.

Light-Field Displays: a View-Dependent Approach

Most 3D displays suffer from the vergence-accommodation conflict, which is a significant contributor to eyestrain. Light-field displays avoid this conflict by directly supporting accommodation but they are viewed as requiring too much resolution to be practical, due to the tradeoff between spatial and angular resolution. We demo three light-field display prototypes that show a view-dependent approach which sacrifices viewer independence to achieve acceptable performance with reasonable display resolutions. When combined with a directional backlight and eye tracking system, this approach can provide a 3D volume from which a viewer can see 3D objects with accommodation, without wearing special glasses.
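
The view-dependent idea can be sketched as follows: given the tracked eye position, compute for each lenslet the single view direction the viewer actually sees and render or select only that view instead of the full angular fan. The lenslet layout, coordinate conventions, and eye position below are hypothetical.

    import numpy as np

    def visible_view_angles(eye_pos, lenslet_centers):
        """For each lenslet, return the horizontal/vertical angles (radians)
        from the lenslet center to the tracked eye; only these views need
        to be rendered for the current eye position.

        eye_pos         : (x, y, z) eye position in display coordinates (m).
        lenslet_centers : (N, 3) lenslet centers on the z = 0 display plane.
        """
        d = np.asarray(eye_pos) - np.asarray(lenslet_centers)   # lenslet -> eye
        theta_x = np.arctan2(d[:, 0], d[:, 2])                  # horizontal angle
        theta_y = np.arctan2(d[:, 1], d[:, 2])                  # vertical angle
        return np.stack([theta_x, theta_y], axis=1)

    # Example: three lenslets along x, viewer 0.5 m away and slightly to the right.
    lenslets = np.array([[-0.05, 0.0, 0.0], [0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])
    print(np.degrees(visible_view_angles([0.1, 0.0, 0.5], lenslets)))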

SECTION: Session: Miscellaneous

Elemphasize: Emphasizing Mechanical Tactile Sensation via Electrical Stimulation

To provide a tactile sensation appropriate for tense situations, such as being slapped or slashed, the stimulus must be safe as well as strong and natural. Mechanical stimulation can present a natural sensation, but too much intensity can lead to injury. Electrical stimulation can present high-intensity stimuli without mechanical damage, but the naturalness of the tactile sensation is inferior because of its peculiar feel. We propose a method that combines mechanical and electrical stimulation to achieve a strong sensation without losing naturalness. An electrode is attached to the physical object, and the subjective intensity of the mechanical stimulus is enhanced by the electrical stimulus at the moment of contact with the skin. We also present several scenarios, such as application to thrilling VR content and training to become accustomed to the fear of injection pain.

Knitted RESi: A Highly Flexible, Force-Sensitive Knitted Textile Based on Resistive Yarns

We demonstrate a force sensing knitted textile, based on a piezo-resistive yarn. The resulting elastic, stretchable, and robust textile exhibits sensors based on the widespread Force Sensitive Resistor (FSR) principle. As a proof-of-concept, we implemented a knit consisting of multiple piezo-resistive wales, each intersecting with a single vertical piezo-resistive insert, spawning discrete FSRs at the respective intersection points. While enabling monitoring of external stress, such as pressure, stretch, and deformation, the textile features inherent pleasant haptic qualities.
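
A common way to read such a resistive intersection is a voltage divider per sensing point: drive one piezo-resistive wale, read the voltage on the insert through a known reference resistor, and convert the ADC reading back into a resistance. The supply voltage, ADC resolution, and reference resistor below are generic assumptions, not values from the knit.

    def fsr_resistance(adc_value, adc_max=1023, vcc=3.3, r_ref=10_000.0):
        """Resistance (ohms) of one knitted FSR intersection, read through a
        voltage divider: Vout = Vcc * R_ref / (R_ref + R_fsr).

        adc_value : raw ADC reading of Vout (10-bit ADC assumed here).
        """
        v_out = vcc * adc_value / adc_max
        if v_out <= 0:
            return float("inf")            # open circuit / no pressure applied
        return r_ref * (vcc - v_out) / v_out

    # Higher pressure lowers the FSR resistance, which raises the ADC reading.
    for reading in (100, 400, 800):
        print(reading, f"{fsr_resistance(reading):.0f} ohm")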

Multiplexing Display using Vector Error Propagation and Smooth Active Shutter

SlideFusion: Surrogacy Wheelchair with Implicit Eyegaze Modality Sharing

For mobility-impaired people, the wheelchair is considered a primary navigation and accessibility device. However, because of the chair’s inherent design, the user must use their hands throughout navigation, resulting in a dual impairment. Our proposed system, SlideFusion, expands on previous work on collaborative and assistive technologies for accessibility scenarios. SlideFusion focuses on remote collaboration for the mobility-impaired person: an operator can remotely access an avatar embedded in the wheelchair. To reduce the physical and cognitive load on the wheelchair user, we propose sharing the eye-gaze modality with the remote operator. The eye-gaze modality enables implicit interactions that do not require the user to point or speak, thus leveraging indirect communication. In this way, accessibility can be provided not only to wheelchair users and their caregivers, but also to people with hearing impairments or pronunciation disorders.