Volume 37, Issue 7 pp. 213-221
Appearance and Illumination

Learning Scene Illumination by Pairwise Photos from Rear and Front Mobile Cameras

Dachuan Cheng
State Key Laboratory of Computer Science, Institute of Software, University of Chinese Academy of Sciences

Jian Shi
Institute of Automation, University of Chinese Academy of Sciences

Yanyun Chen
State Key Laboratory of Computer Science, Institute of Software, University of Chinese Academy of Sciences

Xiaoming Deng
Beijing Key Laboratory of Human Computer Interactions, Institute of Software, Chinese Academy of Sciences

Xiaopeng Zhang
Institute of Automation, University of Chinese Academy of Sciences

First published: 24 October 2018
Citations: 41

Abstract

Illumination estimation is an essential problem in computer vision, graphics and augmented reality. In this paper, we propose a learning-based method to recover low-frequency scene illumination, represented as spherical harmonic (SH) functions, from pairwise photos taken by the rear and front cameras of a mobile device. An end-to-end deep convolutional neural network (CNN) is designed to process the images from the two symmetric views and predict the SH coefficients. We introduce a novel Render Loss to improve the rendering quality of the predicted illumination. A high-quality high dynamic range (HDR) panoramic image dataset was built for training and evaluation. Experiments show that our model produces visually and quantitatively superior results compared to state-of-the-art methods. Moreover, our method is practical for mobile-based applications.
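The abstract does not detail how the Render Loss is evaluated; the sketch below is one plausible reading rather than the authors' implementation: shade a diffuse probe under both the predicted and the ground-truth degree-2 SH lighting and penalize the difference in the resulting renderings. The probe normals, the SH convention (Ramamoorthi-Hanrahan irradiance weights), and the plain-NumPy setting are all illustrative assumptions.

```python
import numpy as np

# Band attenuation factors for Lambertian irradiance under degree-2 SH lighting
# (Ramamoorthi & Hanrahan): pi for l=0, 2*pi/3 for l=1, pi/4 for l=2.
A = np.array([np.pi,
              2 * np.pi / 3, 2 * np.pi / 3, 2 * np.pi / 3,
              np.pi / 4, np.pi / 4, np.pi / 4, np.pi / 4, np.pi / 4])

def sh_basis(normals):
    """Evaluate the 9 real SH basis functions (l <= 2) at unit normals of shape (N, 3)."""
    x, y, z = normals[:, 0], normals[:, 1], normals[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),       # Y_0^0
        0.488603 * y,                     # Y_1^-1
        0.488603 * z,                     # Y_1^0
        0.488603 * x,                     # Y_1^1
        1.092548 * x * y,                 # Y_2^-2
        1.092548 * y * z,                 # Y_2^-1
        0.315392 * (3 * z ** 2 - 1),      # Y_2^0
        1.092548 * x * z,                 # Y_2^1
        0.546274 * (x ** 2 - y ** 2),     # Y_2^2
    ], axis=1)                            # (N, 9)

def shade_diffuse(sh_coeffs, normals):
    """Diffuse shading of each normal under SH lighting; sh_coeffs is (9, 3) for RGB."""
    return sh_basis(normals) @ (A[:, None] * sh_coeffs)   # (N, 3)

def render_loss(sh_pred, sh_gt, normals):
    """Mean squared difference between renderings with predicted and ground-truth SH."""
    return np.mean((shade_diffuse(sh_pred, normals) - shade_diffuse(sh_gt, normals)) ** 2)

# Hypothetical usage: normals of a probe sphere and a noisy prediction of the coefficients.
rng = np.random.default_rng(0)
normals = rng.normal(size=(1024, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
sh_gt = rng.normal(size=(9, 3))                      # ground-truth RGB SH coefficients
sh_pred = sh_gt + 0.05 * rng.normal(size=(9, 3))     # a network's prediction
print(render_loss(sh_pred, sh_gt, normals))
```

Compared with a plain L2 loss on the coefficients, a loss of this form weights each SH band by its contribution to the final shading, which is one way such a Render Loss could improve the rendered appearance of the predicted illumination.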
