The purpose of this project was to create a Digital Twin (DT) of the Expanded Perception and Interaction Centre (EPICentre), using Extended Reality to better visualise Internet of Things (IoT) sensor data, such as temperature and humidity, throughout the building. An immersive application was developed for multiple platforms, allowing users to interact with the DT model through Windows Mixed Reality platforms, Oculus platforms and a cylindrical screen. The project serves as an engaging introduction to DTs through an immersive data-analytics experience, generalised for the public using familiar sensor-driven datasets.
The GRC Explore app is being developed on the Apple iPad for the NASA Glenn Research Center in Cleveland, Ohio. The app is visually composed of still images, 360° panoramic images, videos, and augmented reality. It was created to give interested parties virtual tours of the center, whether on site or away from it. The app is being built in Xcode, leveraging ARKit for the augmented reality sections.
We present Graphite, an iOS mobile app that enables users to transform photos into drawings and illustrations with ease. Graphite implements a novel flow-aligned rendering approach based on the analysis of local image-feature directions. A stroke-based image stylization pipeline is parameterized to compute realistic directional hatching and contouring effects in real-time. Its art-direction controls enable users to selectively and locally fine-tune visual variables—such as the level of detail, stroke granularity, and sketchiness—using the Apple Pencil or touch gestures. In this way, the looks of a wide range of artistic media can be simulated, including pencil, pen-and-ink, pastel, and blueprint illustrations. Graphite is based on Apple's Core ML and Metal APIs for optimized on-device processing. Thus, interactive editing can be performed in real-time by utilizing the dedicated Neural Engine and GPU.
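Flow-aligned rendering of this kind depends on estimating local image-feature directions per pixel. As a minimal, hedged sketch (not Graphite's actual implementation, which runs on-device via Core ML and Metal), one common way to obtain such a flow field is the smoothed structure tensor, whose minor eigenvector gives the local edge tangent along which strokes can be aligned:

```python
import numpy as np

def local_flow_directions(gray, sigma=2.0):
    """Estimate per-pixel flow directions from the smoothed structure tensor.

    gray: 2-D float array in [0, 1]. Returns unit vectors (H, W, 2) aligned
    with local image features (perpendicular to the intensity gradient).
    """
    # Finite-difference gradients (rows first, then columns).
    gy, gx = np.gradient(gray)

    # Structure tensor entries, smoothed with a simple box filter
    # (a stand-in for the Gaussian smoothing usually used).
    k = max(1, int(3 * sigma))
    def smooth(a):
        pad = np.pad(a, k, mode="edge")
        out = np.zeros_like(a)
        for dy in range(-k, k + 1):
            for dx in range(-k, k + 1):
                out += pad[k + dy:k + dy + a.shape[0], k + dx:k + dx + a.shape[1]]
        return out / (2 * k + 1) ** 2

    jxx, jxy, jyy = smooth(gx * gx), smooth(gx * gy), smooth(gy * gy)

    # The minor eigenvector of the tensor points along the local feature
    # (the edge tangent), i.e. 90 degrees from the dominant gradient.
    theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy) + np.pi / 2
    return np.stack([np.cos(theta), np.sin(theta)], axis=-1)
```

A hatching or contouring pass would then trace short strokes along this vector field; the stroke density, length, and jitter are the kinds of visual variables the abstract describes as locally art-directable.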
Imagine a flock of live brushes on your canvas. You set their initial locations and directions of walk, mark a speed limit, tell them how to behave with each other, choose a palette for each step, and, finally, let them walk over your canvas. What will you see?
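The behaviour described above — agents with initial positions and headings, a speed limit, and rules for reacting to one another — is essentially a boids-style flocking walk whose trails become brush strokes. A minimal sketch under those assumptions (the names `Brush`, `step`, and the single cohesion rule are illustrative, not the work's actual model):

```python
import random

class Brush:
    """One 'live brush': a point agent that walks across the canvas."""
    def __init__(self, x, y, dx, dy, color):
        self.x, self.y = x, y          # initial location
        self.dx, self.dy = dx, dy      # initial direction of walk
        self.color = color             # palette entry for this brush
        self.trail = [(x, y)]          # the stroke it paints as it walks

def step(brushes, speed_limit, cohesion=0.05, jitter=0.1):
    """Advance every brush one step: drift toward the flock's centre,
    add a little randomness, clamp to the speed limit, and record
    the new position on the brush's trail."""
    cx = sum(b.x for b in brushes) / len(brushes)
    cy = sum(b.y for b in brushes) / len(brushes)
    for b in brushes:
        # Flocking rule: steer gently toward the group's centre of mass.
        b.dx += cohesion * (cx - b.x) + random.uniform(-jitter, jitter)
        b.dy += cohesion * (cy - b.y) + random.uniform(-jitter, jitter)
        # Enforce the speed limit.
        speed = (b.dx ** 2 + b.dy ** 2) ** 0.5
        if speed > speed_limit:
            b.dx *= speed_limit / speed
            b.dy *= speed_limit / speed
        b.x += b.dx
        b.y += b.dy
        b.trail.append((b.x, b.y))
```

Running many steps and rendering each brush's trail in its colour yields the emergent painting the abstract invites the reader to imagine.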
The Bubble app enables an Augmented Reality (AR) user interface to be created as a hierarchy of interactable nested bubbles. It allows users to easily navigate, explore and interact with content (e.g. text, images, videos, 3D models) in a manner naturally suited to mixed reality and AR worlds. The Bubble interface has broad applications. In its current form, we use it to display exhibits at the Penn Museum.
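A hierarchy of nested bubbles is naturally modelled as a tree in which each node carries one piece of content and the children are the bubbles the user pops into next. A minimal sketch of that data structure (the class and method names are illustrative assumptions, not the app's actual API):

```python
class Bubble:
    """A node in the nested-bubble hierarchy. Each bubble carries one piece
    of content (text, image, video, or a 3D-model reference) and may contain
    child bubbles that the user explores by entering them."""
    def __init__(self, title, content=None):
        self.title = title
        self.content = content
        self.children = []

    def add(self, child):
        """Nest a child bubble inside this one and return it for chaining."""
        self.children.append(child)
        return child

    def path_to(self, title, _trail=None):
        """Return the navigation path (list of titles) from this bubble to
        the first descendant whose title matches, or None if absent."""
        trail = (_trail or []) + [self.title]
        if self.title == title:
            return trail
        for child in self.children:
            found = child.path_to(title, trail)
            if found:
                return found
        return None
```

For a museum deployment, the root bubble might be the institution, with galleries as children and individual exhibits as leaves, so `path_to` recovers the sequence of bubbles a visitor traverses to reach an exhibit.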