SIGGRAPH 2004 - The 31st international conference on computer graphics and interactive techniques

LIFLET: Light Field Live With Thousands of Lenslets

LIFLET is a true 3D live video system that can synthesize free-viewpoint images interactively from thousands of views of 3D dynamic scenes.

Life Enhancement
2D images and videos have revolutionized our daily life. The keyword for the next step in visual media must be "3D." A true 3D live video system could greatly enhance your visual experience.

This is the beginning of true 3D live video, which could lead to new digital media such as digital holographic video. It is suitable for a variety of applications, including 3D broadcasting, 3D photometric archiving, and 3D content creation for movies or games. LIFLET is not a 3D display technology but a real-time image-based rendering system, and it handles living subjects as well as the complex reflections and refractions of the real world. We plan to extend and improve the optical system to achieve higher-resolution synthetic images.

The overall goal of this project is to provide a true 3D live video system, with two subsidiary goals:

1. To capture dynamic light fields of a 3D scene densely enough to synthesize free-viewpoint images. For this purpose, we introduce thousands of lenslets.

2. To synthesize free-viewpoint images of the scene interactively from the captured light field. The synthesized images should be free of distortion and maintain the correct parallax and occlusions of the scene.
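The second goal can be illustrated with a minimal sketch of free-viewpoint synthesis from a two-plane light field. This is a standard parameterization of the light field, not the LIFLET optics themselves; the array shapes, camera placement, and nearest-neighbor resampling below are illustrative assumptions.

```python
import numpy as np

def render_view(lf, cam, d=1.0, f=1.0, res=8):
    """Nearest-neighbor view synthesis from a two-plane light field.

    lf:  4D array lf[u, v, s, t] of ray samples; the (u, v) plane sits
         at z = 0 and the (s, t) plane at z = f, both spanning [0, 1]^2.
    cam: (x, y) of a virtual pinhole at z = -d (a hypothetical layout).
    Returns a res-by-res image sampled on the (s, t) focal plane.
    """
    nu, nv, ns, nt = lf.shape
    img = np.empty((res, res))
    for i in range(res):
        for j in range(res):
            # Target point on the focal plane for this output pixel.
            s, t = i / (res - 1), j / (res - 1)
            # The ray from the camera to (s, t, f) crosses z = 0 at:
            u = cam[0] + (s - cam[0]) * d / (d + f)
            v = cam[1] + (t - cam[1]) * d / (d + f)
            # Snap to the nearest stored sample in each plane.
            ui = int(round(float(np.clip(u, 0.0, 1.0)) * (nu - 1)))
            vi = int(round(float(np.clip(v, 0.0, 1.0)) * (nv - 1)))
            si = int(round(s * (ns - 1)))
            ti = int(round(t * (nt - 1)))
            img[i, j] = lf[ui, vi, si, ti]
    return img
```

A real renderer would interpolate between neighboring lenslet views rather than snapping to the nearest sample; the dense capture of thousands of views is what lets the synthesized image keep correct parallax and occlusions.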

The system offers three technical innovations:

1. A simultaneous capturing system with thousands of lenslets. With the advice of NHK Science & Technology Research Laboratories, we developed capturing optics composed of an array of lenslets, an XGA video camera, and a depth-control lens. This system captures thousands of views of a scene simultaneously, whereas a conventional camera-array system captures at most tens to hundreds of views.

2. An interactive method for displaying free-viewpoint images of dynamic scenes. From the thousands of captured views, we synthesize free-viewpoint images interactively. The whole pipeline, from capture to interactive display, runs in real time.

3. A software approach to removing optical distortion. To extend the depth of field, we introduce the depth-control lens. Unfortunately, this lens introduces optical distortion. We apply the concept of ray tracing to remove it.
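The third innovation amounts to asking, for every ideal output pixel, where its ray actually lands on the recorded image, then sampling there. The sketch below applies that inverse-mapping idea to a simple radial lens model; this is a stand-in assumption, since the authors trace rays through the real depth-control optics, whose prescription is not given here.

```python
import numpy as np

def undistort(img, k1):
    """Remove radial distortion by inverse ray mapping.

    For every pixel of the ideal (undistorted) output, compute where
    the corresponding ray lands on the distorted sensor image under a
    single-coefficient radial model, and sample the recorded image
    there.
    """
    h, w = img.shape
    out = np.zeros_like(img)
    cy, cx = (h - 1) / 2, (w - 1) / 2
    for y in range(h):
        for x in range(w):
            # Ideal pixel in normalized coordinates around the center.
            ny, nx = (y - cy) / cy, (x - cx) / cx
            r2 = nx * nx + ny * ny
            # Forward distortion model: where this ray hits the sensor.
            sy = cy + ny * (1 + k1 * r2) * cy
            sx = cx + nx * (1 + k1 * r2) * cx
            iy, ix = int(round(sy)), int(round(sx))
            if 0 <= iy < h and 0 <= ix < w:
                out[y, x] = img[iy, ix]
    return out
```

With k1 = 0 the mapping is the identity and the image passes through unchanged; a real correction would ray-trace the actual lens instead of evaluating a closed-form model, but the look-up direction (ideal pixel to recorded pixel) is the same.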

Monday, 9 August
10:30 am - 12:15 pm
Room 404AB

Tomoyuki Yamamoto
The University of Tokyo

Masaru Kojima
Takeshi Naemura
The University of Tokyo

emerging technologies jury and committee
Conference 8-12 August, Exhibition 10-12 August, Los Angeles, California.