Technical Papers

Color and Photos

Saturday, 01 December 16:15 - 18:00 | Peridot 206

User-guided White Balance for Mixed Lighting Conditions

We present a practical method for high-quality white balancing in scenes with complex lighting, based on user-provided scribbles. It relies on what is most intuitive to humans: relative reflectance properties. Our system enables users to produce compelling results on a broad range of inputs, even for difficult lighting configurations.
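
As a point of reference only, the sketch below shows the classic single-illuminant diagonal (von Kries) white balance computed from one user-marked neutral region; it is not the paper's scribble-based method for mixed lighting, and all names are invented for illustration.

```python
import numpy as np

def white_balance_from_patch(image, patch_mask):
    """Diagonal (von Kries) white balance: scale each channel so that the
    region marked by patch_mask (assumed to be a neutral surface) becomes
    gray. image is a float array of shape (H, W, 3) with values in [0, 1]."""
    # Average color of the user-marked neutral region, per channel.
    illuminant = image[patch_mask].mean(axis=0)            # shape (3,)
    # Per-channel gains that map the marked region to its mean gray level.
    gains = illuminant.mean() / np.maximum(illuminant, 1e-6)
    return np.clip(image * gains, 0.0, 1.0)

# Usage: mark a 20x20 region the user indicated as neutral (hypothetical data).
img = np.random.rand(480, 640, 3).astype(np.float32)
mask = np.zeros(img.shape[:2], dtype=bool)
mask[100:120, 200:220] = True
balanced = white_balance_from_patch(img, mask)
```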


Ivaylo Boyadzhiev, Cornell University
Kavita Bala, Cornell University
Sylvain Paris, Adobe
Fredo Durand, Massachusetts Institute of Technology


Calibrated Image Appearance Reproduction

We present a fully calibrated model for reproducing the appearance of images under a wide range of viewing conditions and display characteristics. We base our model on human visual perception and combine aspects of tone reproduction, color appearance modeling, and lightness perception to achieve accurate and visually pleasing reproduction.
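
For context on the tone-reproduction component only, the sketch below shows the well-known global photographic tone curve of Reinhard et al. (2002) as a minimal baseline; it is not the calibrated appearance model described in the paper, and the names are illustrative.

```python
import numpy as np

def reinhard_global_tonemap(luminance, key=0.18):
    """Classic global photographic tone curve: scale the scene to a chosen
    key value, then compress luminance into [0, 1).
    luminance: positive HDR luminance map of shape (H, W)."""
    eps = 1e-6
    # Log-average luminance of the scene ("key" of the scene).
    log_avg = np.exp(np.mean(np.log(luminance + eps)))
    scaled = key * luminance / log_avg
    return scaled / (1.0 + scaled)

# Usage with synthetic HDR luminance.
hdr_lum = np.random.lognormal(mean=0.0, sigma=2.0, size=(480, 640))
ldr_lum = reinhard_global_tonemap(hdr_lum)
```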


Erik Reinhard, Max Planck Institute for Informatics
Tania Pouli, Max Planck Institute for Informatics
Timo Kunkel, Dolby Laboratories, Inc.
Benjamin Long, University of Bristol
Anders Ballestad, MTT Innovation
Gerwin Damberg, MTT Innovation


Coherent Intrinsic Images from Photo Collections

We present a method to decompose a photo collection of a scene into its intrinsic image components. For each image we estimate its illumination and a reflectance layer that is coherent across all views. Our method facilitates reflectance editing and enables new applications such as transferring lighting across photographs.
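
For readers unfamiliar with the term, intrinsic image decomposition factors each photo into a per-pixel product of reflectance (albedo) and illumination (shading), I(x) = R(x) * S(x). The sketch below states that standard model in code; it is not the paper's multi-view, coherence-enforcing algorithm, and the names are invented.

```python
import numpy as np

# Standard intrinsic image model: the observed image is the per-pixel product
# of a reflectance layer and an illumination/shading layer. In the log domain
# this becomes additive, which is how many decomposition methods pose it.

def recover_reflectance(image, shading, eps=1e-6):
    """Given an image and an estimate of its shading, the implied
    reflectance follows directly from the multiplicative model."""
    return image / np.maximum(shading, eps)

H, W = 240, 320
reflectance = np.random.rand(H, W, 3)            # synthetic albedo
shading = np.random.rand(H, W, 1) + 0.5          # synthetic illumination
image = reflectance * shading                    # "observed" photo
recovered = recover_reflectance(image, shading)  # matches the albedo
```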


Pierre-Yves Laffont, REVES/INRIA Sophia-Antipolis
Adrien Bousseau, REVES/INRIA Sophia-Antipolis
Sylvain Paris, Adobe Systems
Fredo Durand, Massachusetts Institute of Technology
George Drettakis, REVES/INRIA Sophia-Antipolis


Robust Patch-Based HDR Reconstruction of Dynamic Scenes

In this paper, we propose a new approach to HDR reconstruction based on a novel energy-minimization formulation called the HDR image synthesis equation. The result is a patch-based algorithm that integrates alignment and reconstruction in a joint optimization, making it more robust to camera and scene motion than previous techniques.
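
As background, the sketch below shows the standard weighted merge of perfectly aligned bracketed exposures of a static scene, i.e. the baseline that breaks down under camera or scene motion. It is not the paper's patch-based joint alignment and reconstruction, and the HDR image synthesis equation itself is not reproduced here; the names are illustrative.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Classic weighted merge of aligned, linear (radiance-proportional)
    exposures of a static scene into a single HDR radiance estimate.
    images: list of float arrays in [0, 1], each of shape (H, W, 3)."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust mid-tones, down-weight clipped/noisy pixels.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)      # radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-6)

# Usage with synthetic bracketed shots (hypothetical data).
times = [1 / 30, 1 / 8, 1 / 2]
shots = [np.clip(np.random.rand(480, 640, 3), 0, 1) for _ in times]
hdr = merge_exposures(shots, times)
```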


Pradeep Sen, University of California, Santa Barbara
Nima Khademi Kalantari, University of California, Santa Barbara
Maziar Yaesoubi, UNM Advanced Graphics Lab
Soheil Darabi, UNM Advanced Graphics Lab
Dan Goldman, Adobe Systems, Inc.
Eli Shechtman, Adobe Systems, Inc.