Stochastic visibility in point-sampled scenes
Miles Hansard
Abstract
This paper introduces a new visibility model for 3D point-clouds, such as those obtained from multiple time-of-flight or lidar scans. The scene is represented by a set of random particles, statistically distributed around the available surface-samples. Visibility is defined as the appropriate conjunction of occupancy and vacancy probabilities along any visual ray. These probabilities are subsequently derived in relation to the statistical scene structure. The resulting model can be used to assign probabilistic visibilities to any collection of scene-points, with respect to any camera position. Moreover, these values can be compared between different rays, and treated as functions of the camera and scene parameters. No surface mesh or volumetric discretization is required. The model is tested by decimating 3D point-clouds and estimating the visibility of randomly selected targets. These estimates are compared to reference values computed by standard methods from the original full-resolution point-clouds. Applications of the new visibility model to multi-view stereo are discussed.
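Illustrative sketch
The following Python sketch is not taken from the paper; it only illustrates the kind of computation the abstract describes, under assumptions of this editor's own: each surface-sample is treated as an isotropic Gaussian particle with standard deviation sigma, per-particle occupancy along a discretized ray is combined with a noisy-OR, and visibility of the target is the probability that the ray segment in front of it is vacant. The function name ray_visibility and all parameters are hypothetical, not the paper's derivation.

import numpy as np

def ray_visibility(camera, target, points, sigma=0.01, n_steps=256):
    """Approximate probability that `target` is visible from `camera`.

    `points` is an (N, 3) array of surface-samples. Each sample is treated
    as an isotropic Gaussian particle with standard deviation `sigma`
    (an assumed noise model, not the paper's formulation). Visibility is
    the probability that the ray segment in front of the target is vacant.
    """
    camera = np.asarray(camera, dtype=float)
    target = np.asarray(target, dtype=float)
    direction = target - camera
    length = np.linalg.norm(direction)
    direction /= length

    # Discretize the ray, stopping short of the target so that the
    # target's own particles are not counted as occluders.
    t = np.linspace(0.0, 0.95 * length, n_steps)
    ray_pts = camera[None, :] + t[:, None] * direction[None, :]

    # Per-particle "presence" at each ray sample, from the Gaussian model.
    d2 = ((ray_pts[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    p_particle = np.exp(-0.5 * d2 / sigma ** 2)          # shape (n_steps, N)

    # Occupancy at each ray sample: noisy-OR over all particles.
    p_occupied = 1.0 - np.prod(1.0 - p_particle, axis=1)

    # Vacancy of the whole segment: product of per-sample vacancies.
    return float(np.prod(1.0 - p_occupied))

# Example: one occluding sample midway along the ray lowers the visibility.
occluder = np.array([[0.0, 0.0, 0.5]])
print(ray_visibility([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], occluder, sigma=0.05))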
Session
3D Vision
Files
Extended Abstract (PDF, 1219 KB)
Paper (PDF, 6 MB)
DOI
10.5244/C.29.89
https://dx.doi.org/10.5244/C.29.89
Citation
Miles Hansard. Stochastic visibility in point-sampled scenes. In Xianghua Xie, Mark W. Jones, and Gary K. L. Tam, editors, Proceedings of the British Machine Vision Conference (BMVC), pages 89.1-89.12. BMVA Press, September 2015.
Note
Video is not published at the request of the presenting author.
Bibtex
@inproceedings{BMVC2015_89,
  title={Stochastic visibility in point-sampled scenes},
  author={Miles Hansard},
  year={2015},
  month={September},
  pages={89.1--89.12},
  articleno={89},
  numpages={12},
  booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
  publisher={BMVA Press},
  editor={Xianghua Xie and Mark W. Jones and Gary K. L. Tam},
  doi={10.5244/C.29.89},
  isbn={1-901725-53-7},
  url={https://dx.doi.org/10.5244/C.29.89}
}