Deep View-Sensitive Pedestrian Attribute Inference in an end-to-end Model

M. Saquib Sarfraz, Arne Schumann, Yan Wang and Rainer Stiefelhagen

Abstract

Pedestrian attribute inference is a demanding problem in visual surveillance that can facilitate person retrieval, search and indexing. To exploit semantic relations between attributes, recent research treats it as a multi-label image classification task. The visual cues hinting at attributes can be strongly localized, and the inference of person attributes such as hair, backpack, or shorts is highly dependent on the acquired view of the pedestrian. In this paper we assert this dependence in an end-to-end learning framework and show that view-sensitive attribute inference is able to learn better attribute predictions. Our proposed model jointly predicts the coarse pose (view) of the pedestrian and learns specialized view-specific multi-label attribute predictions.
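
The abstract describes a shared network feeding two coupled heads: a coarse view (pose) classifier and a set of view-specific multi-label attribute predictors whose outputs depend on the predicted view. The sketch below illustrates that idea only; the ResNet-50 backbone, the choice of three coarse views, the soft view-weighted fusion of the per-view heads, and the loss weighting are all assumptions made for illustration and are not taken from the paper itself.

# Minimal PyTorch sketch of a view-sensitive attribute model, based only on
# the abstract above. Backbone, number of views, fusion scheme and loss
# weighting are illustrative assumptions, not the authors' exact design.
import torch
import torch.nn as nn
import torchvision.models as models


class ViewSensitiveAttributeNet(nn.Module):
    def __init__(self, num_attributes: int, num_views: int = 3):
        super().__init__()
        backbone = models.resnet50(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()                       # shared image features
        self.backbone = backbone
        self.view_head = nn.Linear(feat_dim, num_views)   # coarse pose (view) logits
        # One specialized multi-label attribute head per view.
        self.attr_heads = nn.ModuleList(
            [nn.Linear(feat_dim, num_attributes) for _ in range(num_views)]
        )

    def forward(self, images: torch.Tensor):
        feats = self.backbone(images)                     # (B, feat_dim)
        view_logits = self.view_head(feats)               # (B, num_views)
        view_probs = view_logits.softmax(dim=1)           # soft view assignment
        # Per-view attribute logits stacked to (B, num_views, num_attributes).
        per_view = torch.stack([head(feats) for head in self.attr_heads], dim=1)
        # Fuse view-specific predictions, weighted by the predicted view.
        attr_logits = (view_probs.unsqueeze(-1) * per_view).sum(dim=1)
        return view_logits, attr_logits


def joint_loss(view_logits, attr_logits, view_labels, attr_labels, alpha=1.0):
    # Cross-entropy for the coarse view, binary cross-entropy for the
    # multi-label attributes (attr_labels is a float 0/1 tensor).
    view_loss = nn.functional.cross_entropy(view_logits, view_labels)
    attr_loss = nn.functional.binary_cross_entropy_with_logits(attr_logits, attr_labels)
    return attr_loss + alpha * view_loss

A single forward pass returns both the view logits and the fused attribute logits, so the view prediction and the multi-label attribute prediction can be trained jointly from the same image batch, in the end-to-end spirit the abstract describes.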

Session

Posters

Files

Paper (PDF)
Supplementary (PDF)

DOI

10.5244/C.31.134
https://dx.doi.org/10.5244/C.31.134

Citation

M. Saquib Sarfraz, Arne Schumann, Yan Wang and Rainer Stiefelhagen. Deep View-Sensitive Pedestrian Attribute Inference in an end-to-end Model. In T.K. Kim, S. Zafeiriou, G. Brostow and K. Mikolajczyk, editors, Proceedings of the British Machine Vision Conference (BMVC), pages 134.1-134.13. BMVA Press, September 2017.

Bibtex

            @inproceedings{BMVC2017_134,
                title={Deep View-Sensitive Pedestrian Attribute Inference in an end-to-end Model},
                author={M. Saquib Sarfraz and Arne Schumann and Yan Wang and Rainer Stiefelhagen},
                year={2017},
                month={September},
                pages={134.1-134.13},
                articleno={134},
                numpages={13},
                booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
                publisher={BMVA Press},
                editor={Tae-Kyun Kim and Stefanos Zafeiriou and Gabriel Brostow and Krystian Mikolajczyk},
                doi={10.5244/C.31.134},
                isbn={1-901725-60-X},
                url={https://dx.doi.org/10.5244/C.31.134}
            }