Improved Depth Recovery In Consumer Depth Cameras via Disparity Space Fusion within Cross-spectral Stereo

Gregoire Payen de La Garanderie and Toby Breckon

In Proceedings British Machine Vision Conference 2014
http://dx.doi.org/10.5244/C.28.110

Abstract

We address the issue of improving depth coverage in consumer depth cameras through the combined use of cross-spectral stereo and near infra-red structured light sensing. Specifically, we show that fusing disparity over these modalities within the disparity space image, prior to disparity optimization, facilitates the recovery of scene depth information in regions where structured light sensing fails. We show that this joint approach, leveraging disparity information from both structured light and cross-spectral sensing, enables the recovery of global scene depth comprising both texture-less object depth, where conventional stereo otherwise fails, and highly reflective object depth, where structured light (and similar) active sensing commonly fails. The proposed solution is illustrated using dense gradient feature matching and shown to outperform prior approaches that use late-stage fused cross-spectral stereo depth as a facet of improved sensing for consumer depth cameras.
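
To illustrate the idea of fusing modalities within the disparity space image, the following Python/numpy sketch (not the authors' implementation; all function names are hypothetical) builds a cross-spectral cost volume from simple horizontal gradient differences, injects structured-light disparities as a low-cost prior before optimization, and recovers disparity by a per-pixel winner-takes-all minimum. The paper itself uses dense gradient feature matching and a global disparity optimization; this is only a minimal stand-in for the fusion step.

    import numpy as np

    def gradient_cost_volume(left, right, max_disp):
        # Cross-spectral matching cost from horizontal image gradients; a crude
        # stand-in for the dense gradient feature matching used in the paper.
        gl = np.gradient(left.astype(np.float32), axis=1)
        gr = np.gradient(right.astype(np.float32), axis=1)
        h, w = left.shape
        dsi = np.full((h, w, max_disp), np.inf, dtype=np.float32)
        for d in range(max_disp):
            if d == 0:
                dsi[:, :, 0] = np.abs(gl - gr)
            else:
                # left pixel x matches right pixel x - d
                dsi[:, d:, d] = np.abs(gl[:, d:] - gr[:, :-d])
        return dsi

    def fuse_structured_light(dsi, sl_disparity, weight=10.0):
        # Inject structured-light disparities into the cross-spectral DSI prior to
        # disparity optimization; sl_disparity < 0 marks pixels with no active-sensor depth.
        fused = dsi.copy()
        max_disp = dsi.shape[2]
        ys, xs = np.where(sl_disparity >= 0)
        ds = np.clip(np.round(sl_disparity[ys, xs]).astype(int), 0, max_disp - 1)
        fused[ys, xs, ds] -= weight  # lower the cost at the structured-light disparity
        return fused

    def winner_takes_all(dsi):
        # Simple per-pixel minimum; the paper applies a global disparity optimization instead.
        return np.argmin(dsi, axis=2)

    # Synthetic stand-ins for a rectified NIR / colour pair and a structured-light disparity map.
    left_nir = np.random.rand(48, 64)
    right_colour = np.random.rand(48, 64)
    sl_disp = np.full((48, 64), -1.0)    # -1 = structured light returned no depth here
    sl_disp[10:20, 10:30] = 5.0          # a region where the active sensor did succeed

    dsi = gradient_cost_volume(left_nir, right_colour, max_disp=16)
    disparity = winner_takes_all(fuse_structured_light(dsi, sl_disp))

Lowering the cost in the disparity space image at the structured-light disparities lets the subsequent optimization prefer the active-sensor measurement where it exists, while falling back on cross-spectral stereo evidence wherever the structured light sensor returned nothing.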

Session

Poster Session

Files

Extended Abstract (PDF, 1 page, 1.5M)
Paper (PDF, 12 pages, 4.3M)
Bibtex File

Citation

Gregoire Payen de La Garanderie and Toby Breckon. Improved Depth Recovery In Consumer Depth Cameras via Disparity Space Fusion within Cross-spectral Stereo. Proceedings of the British Machine Vision Conference. BMVA Press, September 2014.

BibTex

@inproceedings{BMVC.28.110,
	title = {Improved Depth Recovery In Consumer Depth Cameras via Disparity Space Fusion within Cross-spectral Stereo},
	author = {Payen de La Garanderie, Gregoire and Breckon, Toby},
	year = {2014},
	booktitle = {Proceedings of the British Machine Vision Conference},
	publisher = {BMVA Press},
	editor = {Valstar, Michel and French, Andrew and Pridmore, Tony},
	doi = {10.5244/C.28.110}
}