Generating 360 Outdoor Panorama Dataset with Reliable Sun Position Estimation

Shih-Hsiu Chang
National Tsing-Hua University
Ching-Ya Chiu
National Tsing-Hua University
Chia-Sheng Chang
National Tsing-Hua University
Kuo-Wei Chen
National Taiwan University of Science and Technology
Chih-Yuan Yao
National Taiwan University of Science and Technology
Ruen-Rone Lee
Industrial Technology Research Institute
Hung-Kuo Chu
National Tsing Hua University

Abstract

A large dataset of outdoor panoramas with ground-truth labels of sun position (SP) can serve as valuable training data for learning outdoor illumination. In general, the sun position (if present) in an outdoor panorama corresponds to the pixel with the highest luminance and the highest contrast with respect to its neighboring pixels. However, neither image-based estimation nor manual annotation can obtain a reliable SP due to the complex interplay between sunlight and sky appearance. Here, we present an efficient and reliable approach to estimate the SP from an outdoor panorama with accessible metadata. Specifically, we focus on outdoor panoramas retrieved from Google Street View and leverage the built-in metadata as well as a well-established Solar Position Algorithm to propose a set of candidate SPs. Next, a custom-made luminance model is used to rank each candidate, and a confidence metric is computed to effectively filter out trivial cases (e.g., cloudy days or an occluded sun). We extensively evaluate the efficacy of our approach through an experimental study on a dataset of over 600 panoramas.
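To illustrate the metadata-driven candidate step described above, the sketch below computes the sun's azimuth and elevation from a panorama's GPS coordinates and capture time, then projects that direction onto an equirectangular image. This is not the authors' code: it assumes the pvlib library (whose solarposition module implements the NREL Solar Position Algorithm) plus pandas, and the metadata field names and the example coordinates are illustrative.

```python
# Minimal sketch: candidate sun-position pixel for a Street View-style
# equirectangular panorama, from GPS, capture time, and compass heading.
import pandas as pd
from pvlib import solarposition

def candidate_sun_pixel(lat, lon, capture_time_utc, heading_deg, width, height):
    """Return the (x, y) pixel of the predicted sun position in an
    equirectangular panorama whose horizontal center faces `heading_deg`
    (compass degrees clockwise from north)."""
    times = pd.DatetimeIndex([capture_time_utc], tz="UTC")
    sp = solarposition.get_solarposition(times, lat, lon)
    azimuth = float(sp["azimuth"].iloc[0])                # degrees from north
    elevation = float(sp["apparent_elevation"].iloc[0])   # degrees above horizon

    # Horizontal coordinate: offset of the sun azimuth from the panorama
    # heading, wrapped to [0, 1), with the heading mapped to the image center.
    u = (((azimuth - heading_deg) / 360.0) + 0.5) % 1.0
    # Vertical coordinate: +90 deg elevation -> top row, -90 deg -> bottom row.
    v = (90.0 - elevation) / 180.0
    return u * width, v * height

# Example (hypothetical panorama near Hsinchu, Taiwan, 06:00 UTC):
x, y = candidate_sun_pixel(24.7956, 120.9926, "2019-06-21 06:00:00",
                           heading_deg=180.0, width=2048, height=1024)
print(f"candidate sun pixel: ({x:.0f}, {y:.0f})")
```

In the paper's pipeline, pixels obtained this way would then be scored by the luminance model and filtered by the confidence metric; the sketch only covers the geometric candidate-proposal step.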
