SA '22: SIGGRAPH Asia 2022 Emerging Technologies


Augmenting Everyday Objects into Personal Robotic Devices

Augmenting familiar physical objects has shown great potential for upgrading their functions through automation, adding aesthetics, and even changing how they are accessed. Recent successes in personal fabrication have brought novices to the point where they can augment everyday objects themselves, from automating routine tasks with mobilized smart devices to devising self-sustaining smart objects that harvest energy from the daily interactions involved. While the overall process is a series of steps (capturing specifications, designing mechanisms, and fabricating the parts), it remains challenging for non-experts because it demands domain expertise in robotics, design, programming, and even mechanical engineering.

We introduce a series of augmented robots: smart domestic devices, built from everyday objects through personal fabrication, that assist with essential daily interactions. From a user's demonstration of the desired motion, 3D-printed attachment mechanisms are automatically generated to build personal robotic devices that automate routine tasks and harvest energy.

Directing Tangible Controllers with Computer Vision and Beholder

We present Beholder, a computer vision (CV) toolkit for building tangible controllers for interactive computer systems. Beholder helps designers build physical inputs that are instrumented with CV markers. By observing the properties of these markers, a CV system can detect the physical interactions that occur. Beholder provides a software editor that lets designers map CV marker behavior to keyboard events, thus connecting the CV-driven tangible controllers to any software that responds to keyboard input. We propose three design scenarios for Beholder: controllers to support everyday work, alternative controllers for games, and physical therapy equipment transformed into controllers that monitor patient progress.
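
As an illustration of the marker-to-keystroke idea, here is a minimal sketch, assuming OpenCV 4.7+ ArUco detection and the pynput library for synthetic key events; the marker IDs and key bindings are hypothetical, and this is not Beholder's actual API:

```python
# Minimal sketch of marker-to-keystroke mapping in the spirit of Beholder.
# Assumes OpenCV >= 4.7 (ArucoDetector API) and pynput; marker IDs and key
# bindings are hypothetical, not part of Beholder's toolkit.
import cv2
from pynput.keyboard import Controller

keyboard = Controller()
BINDINGS = {7: "a", 12: "d"}  # hypothetical marker-ID -> key mapping

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
pressed = set()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    _, ids, _ = detector.detectMarkers(frame)
    visible = set(ids.flatten()) if ids is not None else set()
    for marker_id, key in BINDINGS.items():
        if marker_id in visible and marker_id not in pressed:
            keyboard.press(key)    # marker appeared: key down
            pressed.add(marker_id)
        elif marker_id not in visible and marker_id in pressed:
            keyboard.release(key)  # marker hidden by the interaction: key up
            pressed.discard(marker_id)
```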

E.S.P.: Extra-Sensory Puck in Air Hockey using the Projection-Based Illusion

E.S.P. (Extra-Sensory Puck) introduces optical illusions to air hockey to create a new play experience. The perception of a solid puck hit freely by a player is altered so that, to the player's naked eye, it shows various physics-defying appearances and motions. These altered perceptions are produced by our high-speed projector-camera system, which projects stimulation patterns onto the moving puck. This paper presents two demonstrations. The first is the invisible puck, where the hit puck is camouflaged so that it disappears on the table. The second is altered motion, where the puck's perceived direction and speed are changed.
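
Projecting onto a fast-moving puck requires compensating for the system's end-to-end latency. Here is a toy sketch of the kind of prediction involved, assuming a constant-velocity model and an illustrative latency figure (neither is taken from the paper):

```python
# Toy latency compensation for projecting onto a moving puck: extrapolate
# the tracked position so the pattern lands where the puck will be. The
# constant-velocity model and the latency figure are illustrative.
import numpy as np

class PuckPredictor:
    def __init__(self, latency_s=0.003):  # assumed end-to-end latency
        self.latency_s = latency_s
        self.prev_pos = None
        self.prev_t = None

    def predict(self, pos, t):
        pos = np.asarray(pos, dtype=float)
        if self.prev_pos is not None and t > self.prev_t:
            vel = (pos - self.prev_pos) / (t - self.prev_t)
            target = pos + vel * self.latency_s
        else:
            target = pos
        self.prev_pos, self.prev_t = pos, t
        return target  # draw the stimulation pattern here, not at `pos`
```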

Floagent: Interaction with Mid-Air Image via Hidden Sensors

This paper proposes Floagent, a human-computer interaction system that displays images in mid-air using infrared light reflected by a hot mirror. Floagent allows users to focus on mid-air images without being aware of the sensors. By combining a hot mirror with a retroreflective transmissive optical element, Floagent conceals the camera from the user without affecting the mid-air image. To evaluate the proposed system, we investigated the accuracy of touch interactions with mid-air images. The results show that the proposed system can measure user input effectively. Floagent enables an interaction design with hidden sensors in which mid-air images appear to respond spontaneously to a wide variety of interaction events.
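
One simple way to register a touch on a mid-air image, sketched here under the assumption that the hidden camera yields 3D fingertip positions and that the image forms on a known plane (both illustrative, not Floagent's published method):

```python
# Toy touch test, assuming the concealed camera yields 3D fingertip
# positions and the mid-air image forms on a known plane; the plane
# distance and threshold are illustrative values.
IMAGE_PLANE_Z = 0.30  # assumed distance of the mid-air image from the device (m)
TOUCH_EPS = 0.01      # assumed touch threshold (m)

def is_touching(fingertip_xyz):
    """Register a touch when the fingertip lies within a thin band
    around the plane where the mid-air image is formed."""
    _, _, z = fingertip_xyz
    return abs(z - IMAGE_PLANE_Z) < TOUCH_EPS
```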

HumanConQuad: Human Motion Control of Quadrupedal Robots using Deep Reinforcement Learning

Robotic creatures can enter hazardous environments in place of human workers, but it is challenging to develop a fully autonomous agent that works independently in unstructured scenes. We propose a human motion-based control interface for quadrupedal robots that enables adaptable robot operation by reflecting the user's intuition directly in the robot's movements. Designing a motion interface between different morphologies poses tricky problems in dynamics and control. We first retarget the captured human motion into the corresponding robot's kinematic space with proper semantics, using supervised learning and post-processing techniques. Second, we build a motion imitation controller that tracks the retargeted motion using deep reinforcement learning with task-based curricula. Finally, we apply domain randomization during training for real-world deployment.
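
A common form for the motion-imitation tracking reward in deep RL is an exponentiated tracking error, as in DeepMimic-style controllers; the sketch below uses assumed weights and error scales, not the authors' exact formulation:

```python
# Illustrative DeepMimic-style imitation reward for tracking the retargeted
# reference motion with deep RL; weights and error scales are assumptions.
import numpy as np

def imitation_reward(q, q_ref, v, v_ref, w_pose=0.7, w_vel=0.3):
    """Exponentiated tracking errors on joint angles and velocities."""
    pose_err = float(np.sum((q - q_ref) ** 2))  # joint-angle error
    vel_err = float(np.sum((v - v_ref) ** 2))   # joint-velocity error
    return w_pose * np.exp(-2.0 * pose_err) + w_vel * np.exp(-0.1 * vel_err)
```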

Low-Latency Motion Transfer with Electromagnetic Actuation for Joint Action

Joint action, in which multiple people perform a task together, can help transfer embodied skills, establish physical relationships, engage multiple people, and complete the task more effectively. To facilitate real-time joint bodily action, we introduce a low-latency motion transfer system (total latency: 22.9 ± 2.0 ms; finger attraction latency: 13.7 ± 2.1 ms) using electromagnetic actuation, which allows users to share instantaneous actions such as pressing a button with their fingers. Our system can change both the direction of transfer between users (uni- or bidirectional) and the input logic ("AND" or "OR") based on the users' input. We discuss its potential for computer-assisted joint action, and the possibility that the computer system orchestrates joint action between users, or between a user and the computer.
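
The direction and input-logic options can be summarized in a few lines; the sketch below is an illustrative reconstruction of that switching logic, not the authors' firmware:

```python
# Illustrative reconstruction of the transfer-switching logic described
# above; the function and signal names are ours, not the authors'.
def actuate(input_a: bool, input_b: bool, logic: str = "OR",
            direction: str = "bi") -> tuple[bool, bool]:
    """Return (drive_a, drive_b): whether each user's electromagnet fires."""
    combined = (input_a and input_b) if logic == "AND" else (input_a or input_b)
    if direction == "bi":
        return combined, combined  # each press is mirrored to both fingers
    return False, input_a          # mono: user B's finger follows user A only
```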

QuadStretch: A Forearm-wearable Skin Stretch Display for Immersive VR Experience

Force feedback is important for an immersive VR experience, but it typically requires a cumbersome device that restricts use of the hand. One way to work around this problem is substituted tactile feedback. QuadStretch is a lightweight, flexible forearm-wearable device that provides substituted skin-stretch feedback to express force on the arm. Consisting of four pairs of tactors, it requires no ground point and can express the directional sense of force accompanying arm movements. In this demo, we prepared six scenarios, Boxing, Archery, Wings, Climbing, Slingshot, and Fishing, to show how the expressive power of QuadStretch can enhance the VR experience.
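
One plausible way to render a directional force with four tactor pairs is to drive each pair in proportion to the force component along its stretch axis; the layout and gains below are our assumptions, not QuadStretch's actual control scheme:

```python
# Assumed mapping from a 2D force direction on the forearm to four
# stretch-tactor pairs; tactor layout and gains are illustrative.
import numpy as np

TACTOR_AXES = {                      # unit vectors around the forearm
    "dorsal": np.array([0.0, 1.0]),
    "volar":  np.array([0.0, -1.0]),
    "radial": np.array([1.0, 0.0]),
    "ulnar":  np.array([-1.0, 0.0]),
}

def stretch_commands(force_xy):
    """Drive each pair in proportion to the force component along its axis."""
    f = np.asarray(force_xy, dtype=float)
    return {name: max(0.0, float(axis @ f)) for name, axis in TACTOR_AXES.items()}
```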

Seeing is Feeling: A Novel Haptic Display for Wearer-Observer Mutual Haptic Understanding

We propose a new haptic display that enables mutual understanding of haptic sensation between the wearer and an observer. People can predict the sensations of others by observing their sensory states. In addition to presenting haptic sensations by inducing skin deformation, our display makes the haptic stimulus visually evident, creating a shared understanding of the sensation with the observer. The system consists of a part that provides the haptic stimulus while creating visible skin deformation, and a mechanical structure that visually exaggerates that deformation. The proposed system enables richer interactions by extending observers' entertainment experiences during live content, increasing understanding of internal states through biofeedback, and supporting remote haptic communication.

Tidal Space: Interactive Home Installation for Work-From-Home Parents

The past few years of COVID-19 lockdowns have made it abundantly clear that both childcare and professional work are inextricable from the home office, as many were forced to juggle various roles and expectations while caring for their families and working from home. Tidal Space addresses the manifold needs of work-from-home parents by incorporating motorized curtains and a foldable acoustic panel into the home office. Triggered by screen activity, the curtains self-adjust to serve as spatial moderators, mediating the boundaries between different types of work. From soundproof separation for focused work, to more translucent and open configurations for checking on the children, to interactive elements that let kids and parents play, Tidal Space aims to improve the home office experience for all.
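
A minimal sketch of such a screen-activity trigger, with hypothetical activity categories and closure levels (the actual mapping is the installation's design choice):

```python
# Hypothetical screen-activity trigger: each detected activity selects a
# curtain closure level. Categories and values are illustrative only.
CLOSURE = {
    "video_call": 1.0,  # fully closed: soundproof separation for focused work
    "deep_work":  0.8,
    "email":      0.4,
    "idle":       0.0,  # fully open: check on the children, play together
}

def curtain_closure(activity: str) -> float:
    """0.0 = fully open, 1.0 = fully closed."""
    return CLOSURE.get(activity, 0.2)
```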

Touchable Cooled Graphics: Midair 3D Image with Noncontact Cooling Feedback using Ultrasound-Driven Mist Vaporization

Adding tactile feedback to a midair image enables immersive mixed reality content. In this study, we develop a midair 3D image with a noncontact cooling sensation using ultrasound. Users feel a cooling sensation when touching the 3D image with their bare hands. The noncontact cooling sensation is displayed rapidly via ultrasound-driven mist vaporization. Previous ultrasound haptic-optic displays have used only mechanical tactile feedback, e.g., vibration. The cooling sensation extends the displayable material textures of the ultrasound haptic system. In the demo, we present a 3D image of ice; participants can touch the image freely and feel its realistic cooling sensation.
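
The cooling mechanism is evaporative: vaporizing the deposited mist draws latent heat from the skin. As a rough order-of-magnitude illustration, assuming the mist is water (the mass figure is ours, not the demo's):

$$Q = m\,L_v,\qquad L_v \approx 2.26\ \mathrm{MJ/kg} \;\Rightarrow\; Q \approx 2.3\ \mathrm{J}\ \text{per mg of water vaporized}$$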

Ultrasound-Driven Passive Haptic Actuator Based on Amplifying Radiation Force Using Simple Lever Mechanism

Haptics is a promising modality that enables intuitive human motion guidance and immersive game experiences. A natural tactile experience calls for a lightweight yet powerful wearable haptic device. In this study, we develop a lightweight passive haptic device (6.2 g) driven remotely by airborne ultrasound. The device presents a strong haptic stimulus of 400 mN (40 gf) by amplifying the applied ultrasound acoustic radiation force 19.6 times with a simple lever mechanism. Moreover, since the radiation force propagates at the speed of sound, the amplified force is still presented quickly. In this demo, participants experience the lightweight passive haptic actuator worn on their fingertips; it can present a static force of 400 mN and low-frequency vibration within 45 ms. We also demonstrate an earring-type passive haptic device that presents a haptic stimulus to the earlobe.
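
As a worked check of the quoted figures (the lever-arm ratio a/b is inferred from the stated 19.6x gain, not given explicitly):

$$F_{\mathrm{out}} = \frac{a}{b}\,F_{\mathrm{in}},\qquad \frac{a}{b} = 19.6 \;\Rightarrow\; F_{\mathrm{in}} = \frac{400\ \mathrm{mN}}{19.6} \approx 20\ \mathrm{mN}$$

That is, only about 20 mN of acoustic radiation force needs to reach the lever's input arm to present the full 400 mN stimulus.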