An Interactive 360-Degree Light Field Display
This display reproduces the light field of an object - with correct geometric, accommodation, and vergence cues in a horizontal plane - by rendering and projecting imagery at 5,000 frames per second onto a spinning anisotropic reflector. Motion-tracked vertical parallax is then employed to allow unrestricted 3D viewer movement with correct geometric cues.
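To make the timing concrete, the following is a minimal sketch, written for this description rather than taken from the system's code, of how a projector frame index maps to the azimuthal viewing direction served by the spinning mirror. The 5,000 frames-per-second figure comes from the paragraph above; the mirror rotation rate is an assumed, illustrative value.

```cpp
// Minimal timing sketch: which azimuthal direction a given projector frame serves.
// kFramesPerSecond comes from the description above; kRevsPerSecond is an assumed,
// illustrative rotation rate, not a specification of the actual hardware.
#include <cmath>
#include <cstdio>

const double kPi              = 3.14159265358979323846;
const double kFramesPerSecond = 5000.0;  // projector update rate (from the text)
const double kRevsPerSecond   = 17.0;    // mirror rotation rate (assumption)
const double kViewsPerRev     = kFramesPerSecond / kRevsPerSecond;

// Azimuth (radians) toward which frame `frameIndex` is reflected, assuming the
// projector clock and the mirror's facing direction were aligned at frame 0.
double frameAzimuth(long frameIndex) {
    double slot = std::fmod(static_cast<double>(frameIndex), kViewsPerRev);
    return 2.0 * kPi * slot / kViewsPerRev;
}

int main() {
    // Consecutive frames step the served direction by a small, fixed angle, so a
    // full set of horizontal-parallax views is swept out once per mirror revolution.
    std::printf("views per revolution: %.0f\n", kViewsPerRev);
    std::printf("angular step per frame: %.3f degrees\n", 360.0 / kViewsPerRev);
    return 0;
}
```

In this formulation, each mirror revolution sweeps out one complete set of horizontal-parallax views, and the renderer must have the view for frameAzimuth(i) ready just before frame i is flashed.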
The display is:
- Autostereoscopic, requiring no special viewing glasses
- Omnidirectional, allowing viewers to be situated anywhere around it
- Multiview, producing a rendition of the light field with correct horizontal parallax and vertical perspective for any viewer situated in the viewing plane.
Goals
To create increasingly high-fidelity displays that can project a rendered object's light field into space at all angles.
Innovations
Development of this system required innovation on a number of fronts, including:
- Techniques for acquiring and rendering interactive 3D OpenGL graphics and photographed light fields.
- Projection math to generate the correct perspective on the display for any given viewer height and distance (a simplified sketch of the vertical component follows this list).
- An innovative spinning anisotropically diffusing mirror used to reflect high-speed frames to different viewing positions.
- Real-time update of the projector at very high frame rates using standard DVI graphics hardware (one possible frame-packing scheme is sketched after this list).
- Development of a very-high-speed projector. This was achieved by modifying an off-the-shelf projector to use a new DLP drive card with custom-programmed FPGA-based circuitry.
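To illustrate the projection-math item above, here is a simplified sketch of its vertical component only: because the anisotropic mirror diffuses light vertically, a scene point should be drawn at the height where the line from the tracked viewer to that point crosses the diffusing surface. The code treats that surface as a plane and is an assumption-laden illustration written for this page, not the project's published derivation; all names in it are hypothetical.

```cpp
#include <cstdio>

struct Point2D { double depth; double height; };  // vertical slice: depth from the surface, height above its center

// viewerDistance, viewerHeight: tracked viewer position relative to the diffusing surface.
// Returns the height on the surface (depth == 0) at which the point should be drawn so
// that, seen from the viewer, it lies along the correct line of sight.
double heightOnDiffuser(double viewerDistance, double viewerHeight, Point2D scenePoint) {
    // Intersect the line from the viewer to the scene point with the plane depth == 0.
    double t = viewerDistance / (viewerDistance - scenePoint.depth);
    return viewerHeight + t * (scenePoint.height - viewerHeight);
}

int main() {
    Point2D p = {-0.10, 0.05};  // a point 10 cm behind the surface, 5 cm up (illustrative units)
    std::printf("draw at height %.3f m for a viewer 1 m away at eye height 0.3 m\n",
                heightOnDiffuser(1.0, 0.3, p));
    return 0;
}
```

For the high-frame-rate DVI item, one plausible scheme - consistent with, but not confirmed by, the description of the FPGA-based drive card above - is to pack 24 consecutive one-bit frames into the 24 bit planes of a single RGB frame and let the drive card unpack and flash them in sequence. The ordering below is illustrative only.

```cpp
#include <cstdint>
#include <vector>

// binaryFrames: 24 frames, each width*height bytes holding 0 or 1.
// Returns one packed 24-bit RGB image (3 bytes per pixel) carrying all 24 frames,
// one frame per bit plane (assumed ordering: frames 0-7 in R, 8-15 in G, 16-23 in B).
std::vector<std::uint8_t> packBinaryFrames(const std::vector<std::vector<std::uint8_t>>& binaryFrames,
                                           int width, int height) {
    std::vector<std::uint8_t> rgb(static_cast<size_t>(width) * height * 3, 0);
    for (int f = 0; f < 24; ++f) {
        const int channel = f / 8;   // 0 = R, 1 = G, 2 = B
        const int bit     = f % 8;   // bit position within that channel
        for (int p = 0; p < width * height; ++p) {
            if (binaryFrames[f][p])
                rgb[static_cast<size_t>(p) * 3 + channel] |= static_cast<std::uint8_t>(1u << bit);
        }
    }
    return rgb;
}
```

A caller would assemble binaryFrames from 24 consecutive rendered views, send the packed image as an ordinary frame over DVI, and rely on the drive card's FPGA to flash the bit planes in order. At an assumed refresh rate of about 200 Hz, such a packed link would carry roughly 24 × 200 ≈ 4,800 one-bit frames per second, the same order as the 5,000 frames per second quoted above.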
Vision
As we cover our world with flat electronic displays, it is important to realize that flat surfaces represent only a small portion of our physical world. Our real world is made of objects, in all their three-dimensional glory. When we have pasted displays on every surface we can find, the next generation of displays will begin to represent the physical world around us, but this progression will not succeed unless the technology is completely invisible to the user: no special glasses, no fuzzy pictures, and no small viewing zones.
Contact
Andrew Jones
USC Institute for Creative Technologies
Jones (at) ict.usc.edu
Contributors
Paul Debevec
USC Institute for Creative Technologies
Mark Bolas
Perry Hoberman
University of Southern California School of Cinematic Arts
Ian McDowall
Fakespace Labs, Inc.
Hideshi Yamada
Sony Corporation
