We present image-based rendering (IBR) approaches that compensate for the lack of accurate depth by using image-based operations, enabling free-viewpoint walkthroughs. In the first project, we use discontinuous image warping guided by quasi-dense depth maps, which produces far fewer IBR artifacts than projectively texturing an erroneous 3D model. In the second project, we oversegment the input images and warp each superpixel independently using sparse depth. We introduce depth synthesis to create approximate depth that can be used within our image warps to produce pleasing walkthroughs. We compare our results against up to four recent IBR approaches and show that our approach extends well to free-viewpoint navigation.
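To make the second approach concrete, the following is a minimal, hypothetical sketch of per-superpixel warping with depth synthesis. The label map, sparse depth samples, the neighbor-averaging fill, and the simple translational camera model are all illustrative assumptions, not the actual pipeline described above:

```python
import numpy as np

def synthesize_depth(labels, sparse_depth):
    """Assign each superpixel a depth: the median of its sparse depth
    samples, or, if it has none, the mean depth of adjacent superpixels
    (a crude stand-in for the depth-synthesis step)."""
    ids = np.unique(labels)
    depth = {}
    for i in ids:
        samples = sparse_depth[labels == i]
        samples = samples[samples > 0]          # 0 marks "no depth sample"
        if samples.size:
            depth[i] = float(np.median(samples))
    # Build 4-adjacency between superpixels from the label map.
    adj = {i: set() for i in ids}
    for a, b in ((labels[:, :-1], labels[:, 1:]),
                 (labels[:-1, :], labels[1:, :])):
        m = a != b
        for u, v in zip(a[m], b[m]):
            adj[u].add(v)
            adj[v].add(u)
    # Depth synthesis: propagate depth into superpixels that have none.
    missing = [i for i in ids if i not in depth]
    while missing:
        progressed = False
        for i in list(missing):
            known = [depth[j] for j in adj[i] if j in depth]
            if known:
                depth[i] = float(np.mean(known))
                missing.remove(i)
                progressed = True
        if not progressed:                      # isolated superpixels
            for i in missing:
                depth[i] = 1.0
            break
    return depth

def warp_superpixels(image, labels, depth, baseline=0.1, focal=500.0):
    """Warp each superpixel independently with a horizontal parallax
    shift = focal * baseline / depth (purely translational camera move;
    nearer superpixels move more than distant ones)."""
    out = np.zeros_like(image)
    h, w = labels.shape
    ys, xs = np.mgrid[0:h, 0:w]
    for i, d in depth.items():
        m = labels == i
        shift = int(round(focal * baseline / d))
        x2 = xs[m] + shift
        ok = (x2 >= 0) & (x2 < w)               # discard off-screen pixels
        out[ys[m][ok], x2[ok]] = image[m][ok]
    return out
```

Because each superpixel is warped on its own, depth discontinuities fall on superpixel boundaries rather than being smeared across them, which is the intuition behind warping segments independently.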
We also investigate perceptual issues in a basic form of IBR, the projection of panoramic images onto a planar proxy, which is widely used for city visualization. We compare the artifacts of smooth transitions (blending multiple images) with those of abrupt transitions (popping/hatching) from a perceptual point of view and develop guidelines for selecting the best possible tradeoff. In another study, we examine the problem more closely and use vision science to investigate the perspective distortions produced when a single image projected onto planar geometry is viewed from novel viewpoints. We use the findings to develop capture-density guidelines that minimize such distortions.
Overall, we believe that our work is an important step towards principled free-viewpoint interactive image-based navigation.