Multispectral Deep Neural Networks for Pedestrian Detection
Jingjing Liu, Shaoting Zhang, Shu Wang and Dimitris Metaxas
Abstract
Multispectral pedestrian detection is essential for around-the-clock applications, e.g., surveillance and autonomous driving. We deeply analyze Faster R-CNN for the multispectral pedestrian detection task and then model it as a convolutional network (ConvNet) fusion problem. Further, we find that ConvNet-based pedestrian detectors trained separately on color or thermal images provide complementary information for discriminating human instances. There is thus large potential to improve pedestrian detection by using color and thermal images in DNNs simultaneously. We carefully design four ConvNet fusion architectures that integrate the two-branch ConvNets at different DNN stages, all of which yield better performance than the baseline detector. Our experimental results on the KAIST pedestrian benchmark show that the Halfway Fusion model, which performs fusion on middle-level convolutional features, outperforms the baseline method by 19% and yields a miss rate 7.5% lower than the other proposed architectures.
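The core idea of the Halfway Fusion model described above, fusing middle-level convolutional features from the color and thermal branches, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the feature-map sizes are invented, and the 1x1 convolution used here to reduce the channel dimension after concatenation is written as a per-pixel linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mid-level feature maps from the two single-modality
# branches, shaped (channels, height, width); sizes are illustrative.
color_feat = rng.standard_normal((256, 20, 16))
thermal_feat = rng.standard_normal((256, 20, 16))

def halfway_fusion(a, b, w):
    """Concatenate two feature maps along the channel axis, then apply
    a 1x1 convolution (a per-pixel linear map) to reduce channels."""
    fused = np.concatenate([a, b], axis=0)        # (512, H, W)
    c, h, wd = fused.shape
    flat = fused.reshape(c, h * wd)               # (512, H*W)
    return (w @ flat).reshape(w.shape[0], h, wd)  # (out_ch, H, W)

# Hypothetical 1x1-conv weights mapping 512 -> 256 channels.
w = rng.standard_normal((256, 512)) * 0.01
out = halfway_fusion(color_feat, thermal_feat, w)
print(out.shape)  # -> (256, 20, 16)
```

The fused map has the same spatial resolution as each branch and, after the channel reduction, the same depth, so the remaining layers of the detection network need not change shape.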
Session
Recognition and Physics-based vision
Files
Extended Abstract (PDF, 128K)
Paper (PDF, 612K)
DOI
10.5244/C.30.73
https://dx.doi.org/10.5244/C.30.73
Citation
Jingjing Liu, Shaoting Zhang, Shu Wang and Dimitris Metaxas. Multispectral Deep Neural Networks for Pedestrian Detection. In Richard C. Wilson, Edwin R. Hancock and William A. P. Smith, editors, Proceedings of the British Machine Vision Conference (BMVC), pages 73.1-73.13. BMVA Press, September 2016.
Bibtex
@inproceedings{BMVC2016_73,
title={Multispectral Deep Neural Networks for Pedestrian Detection},
author={Jingjing Liu and Shaoting Zhang and Shu Wang and Dimitris Metaxas},
year={2016},
month={September},
pages={73.1-73.13},
articleno={73},
numpages={13},
booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
publisher={BMVA Press},
editor={Richard C. Wilson and Edwin R. Hancock and William A. P. Smith},
doi={10.5244/C.30.73},
isbn={1-901725-59-6},
url={https://dx.doi.org/10.5244/C.30.73}
}