Perceptual Dynamic Range for In-Camera Image Processing
Praveen Cyriac, David Kane and Marcelo Bertalmio
Abstract
Digital cameras apply a non-linearity to the captured sensor values prior to quantisation. This process is known as perceptual linearisation and ensures that the quantisation rate is approximately proportional to human sensitivity. We propose an adaptive in-camera non-linearity that ensures that the detail and contrast visible in the processed image closely match the perception of the original scene. The method has been developed to emulate basic properties of the human visual system, including contrast normalisation and the efficient coding of natural images via adaptive processes. Our results are validated visually and also quantitatively by two image quality metrics that model human perception. The method works for still and moving images and has a very low computational complexity; accordingly, it can be implemented on any digital camera. It can also be applied off-line to RAW images or high dynamic range (HDR) images. We demonstrate the performance of the algorithm using images from digital cinema, mobile phones and amateur photography.
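The core idea of perceptual linearisation, compressing linear sensor values with a non-linearity before quantisation so that code values are spread roughly in proportion to visual sensitivity, can be illustrated with a minimal sketch. The fixed sRGB-like exponent below is only an illustrative assumption; the paper's contribution is an adaptive, image-dependent non-linearity, which this sketch does not implement.

```python
import numpy as np

def perceptual_encode(sensor, gamma=1/2.4, bits=8):
    """Apply a power-law non-linearity to linear sensor values, then quantise.

    `sensor` is assumed to be a float array of linear values in [0, 1].
    The exponent 1/2.4 is an illustrative sRGB-like choice, not the
    adaptive non-linearity proposed in the paper.
    """
    encoded = np.clip(sensor, 0.0, 1.0) ** gamma  # compress highlights, expand shadows
    levels = 2 ** bits - 1
    return np.round(encoded * levels).astype(np.uint8)

# Example: a synthetic linear ramp quantised to 8 bits.
# More code values are spent on dark tones, matching the higher
# sensitivity of human vision at low luminance.
ramp = np.linspace(0.0, 1.0, 16)
print(perceptual_encode(ramp))
```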
Session
Poster 1
Files
Extended Abstract (PDF, 153K)
Paper (PDF, 4M)
DOI
10.5244/C.29.19
https://dx.doi.org/10.5244/C.29.19
Citation
Praveen Cyriac, David Kane and Marcelo Bertalmio. Perceptual Dynamic Range for In-Camera Image Processing. In Xianghua Xie, Mark W. Jones, and Gary K. L. Tam, editors, Proceedings of the British Machine Vision Conference (BMVC), pages 19.1-19.11. BMVA Press, September 2015.
Bibtex
@inproceedings{BMVC2015_19,
title={Perceptual Dynamic Range for In-Camera Image Processing},
author={Praveen Cyriac and David Kane and Marcelo Bertalmio},
year={2015},
month={September},
pages={19.1--19.11},
articleno={19},
numpages={11},
booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
publisher={BMVA Press},
editor={Xianghua Xie and Mark W. Jones and Gary K. L. Tam},
doi={10.5244/C.29.19},
isbn={1-901725-53-7},
url={https://dx.doi.org/10.5244/C.29.19}
}