Monday, February 08, 2016

Panoramic Video from Unstructured Camera Arrays



Link to publication page: 
http://www.disneyresearch.com/publication/panoramic-video-from-unstructured-camera-arrays/

We describe an algorithm for generating panoramic video from unstructured camera arrays. Artifact-free panorama stitching is impeded by parallax between input views. Common strategies such as multi-level blending or minimum-energy seams produce seamless results on quasi-static input. On video input, however, these approaches introduce noticeable visual artifacts due to the lack of global temporal and spatial coherence. In this paper we extend the basic concept of local warping for parallax removal. First, we introduce an error measure with increased sensitivity to stitching artifacts in regions with pronounced structure. Using this measure, our method efficiently finds an optimal ordering of pairwise warps for robust stitching with minimal parallax artifacts. Weighted extrapolation of warps into non-overlapping regions ensures temporal stability while avoiding visual discontinuities around transitions between views. Remaining global deformation introduced by the warps is spread over the entire panorama domain using constrained relaxation, while staying as close as possible to the original input views. In combination, these contributions form the first system for spatiotemporally stable panoramic video stitching from unstructured camera array input.
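The abstract doesn't spell out the paper's actual error measure or how the warp ordering is computed, so the sketch below is only a rough illustration of the two ideas it mentions: an error that is more sensitive to misalignment in strongly structured regions, and an ordering of pairwise warps derived from such an error. The gradient-magnitude weighting, the greedy ordering, and the function names are my own assumptions, not the authors' method.

```python
import numpy as np

def structure_weighted_error(overlap_a, overlap_b, eps=1e-6):
    """Stitching error between two aligned overlap regions, weighted so that
    disagreement in strongly structured (high-gradient) areas counts more
    than disagreement in flat areas. Inputs are same-shape float grayscale
    arrays with values in [0, 1]."""
    diff = np.abs(overlap_a - overlap_b)
    # Crude local-structure estimate: gradient magnitude of one view.
    gy, gx = np.gradient(overlap_a)
    structure = np.sqrt(gx ** 2 + gy ** 2)
    weights = structure / (structure.max() + eps)
    return float(np.sum(weights * diff) / (np.sum(weights) + eps))

def greedy_warp_order(pairwise_error):
    """Pick an ordering of pairwise warps from a symmetric (n, n) error
    matrix: start with the lowest-error pair, then repeatedly attach the
    view whose cheapest connection to the already-stitched set is smallest.
    The diagonal is ignored."""
    n = pairwise_error.shape[0]
    err = pairwise_error.copy()
    np.fill_diagonal(err, np.inf)
    i, j = np.unravel_index(np.argmin(err), err.shape)
    stitched, order = {i, j}, [(i, j)]
    while len(stitched) < n:
        _, a, b = min(
            (err[a, b], a, b)
            for a in stitched
            for b in range(n)
            if b not in stitched
        )
        stitched.add(b)
        order.append((a, b))  # warp view b toward already-stitched view a
    return order
```

In the actual system the error would presumably be evaluated on warped overlap regions for each candidate pair, and the resulting ordering would determine which pairwise warp is applied next; this toy version only shows the shape of that computation.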
