Technologies for Augmented-Reality Systems: Realizing Ultrasound-Guided Needle Biopsies

Andrei State
Mark A. Livingston
William F. Garrett
Mary C. Whitton
Etta D. Pisano
University of North Carolina at Chapel Hill
We present a real-time stereoscopic video-see-through augmented reality (AR) system applied to the medical procedure known as ultrasound-guided needle biopsy of the breast. The AR system was used by a physician during procedures on breast models and during non-invasive examinations of human subjects. The system merges rendered live ultrasound data and geometric elements with stereo images of the patient acquired through head-mounted video cameras and presents these merged images to the physician in a head-mounted display. The physician sees a volume visualization of the ultrasound data directly under the ultrasound probe, properly registered within the patient and with the biopsy needle. Using this system, a physician successfully guided a needle into an artificial tumor within a training phantom of a human breast.
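Registering the ultrasound data "within the patient" amounts to chaining coordinate transforms: the tracker reports the probe's pose, and a calibration transform relates slice pixels to the probe frame. The sketch below illustrates that chain only; the function name, matrix layout, and calibration values are hypothetical, not the system's actual calibration.

```python
import numpy as np

def slice_pixel_to_world(u, v, probe_pose, pixel_to_probe):
    """Map an ultrasound slice pixel (u, v) into world coordinates.

    probe_pose:     4x4 world-from-probe transform reported by the tracker.
    pixel_to_probe: 4x4 transform from slice pixel coordinates into the
                    probe frame (found by calibration; hypothetical here).
    """
    # Ultrasound slice pixels lie in a plane, so z = 0 in pixel space.
    p = np.array([float(u), float(v), 0.0, 1.0])
    world = probe_pose @ pixel_to_probe @ p
    return world[:3]
```

With transforms of this form, every pixel of the live slice can be placed under the tracked probe each frame, which is what lets the rendered volume appear fixed inside the patient as the probe moves.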
We discuss the construction of the AR system and the issues and decisions that led to the system architecture and the design of the video see-through head-mounted display. We designed methods to properly resolve occlusion of the real and synthetic image elements. We developed techniques for real-time volume visualization of time- and position-varying ultrasound data. We devised a hybrid tracking system that achieves improved registration of synthetic and real imagery, and we improved on previous techniques for calibration of a magnetic tracker.
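Resolving occlusion between real and synthetic elements is, at its core, a per-pixel depth comparison: a synthetic pixel is shown only where it lies nearer the camera than the real surface at that pixel. The fragment below is a minimal sketch of that idea, assuming per-pixel depth estimates for both images are available; it is not the paper's actual compositing method.

```python
import numpy as np

def composite(real_rgb, synth_rgb, real_depth, synth_depth):
    """Per-pixel occlusion-aware merge of real and synthetic imagery.

    real_rgb, synth_rgb:    H x W x 3 color images.
    real_depth, synth_depth: H x W depth maps in the same camera frame
    (how real depth is obtained is the hard part and is assumed here).
    """
    # Keep the synthetic pixel only where it is nearer than the real surface.
    nearer = synth_depth < real_depth
    return np.where(nearer[..., None], synth_rgb, real_rgb)
```

A real system must also supply depth for the video imagery, for example from known geometry of tracked objects such as the probe and needle, which is why occlusion handling is a design problem rather than a one-line merge.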
Final SIGGRAPH 96 Web site update: 25 October 1996.
For complete information on the next conference and exhibition, see: http://www.siggraph.org/s97/