Uncertainty Modeling and Quantification in Neural Network Image Denoisers

Image denoising is a fundamental step in most computer vision systems, imaging pipelines, and photography workflows. Recently, deep neural networks have pushed image denoising to new levels of performance. However, neural network image denoisers are constrained by the accuracy of the noise model used to train them: training on a poor noise model results in poor generalization to real-world images. Moreover, it is still unclear how to quantitatively assess the output of a neural network image denoiser, especially in real-world cases where ground truth is unavailable. To this end, this project aims to push state-of-the-art denoising performance by: (1) training neural network image denoisers on more accurate noise models derived from real images; and (2) using loss functions that model and quantify the uncertainty of the network outputs, thereby providing a confidence measure for the denoising results.
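The abstract does not specify which uncertainty-aware loss is used; one common choice for modeling per-pixel output uncertainty is a heteroscedastic Gaussian negative log-likelihood, where the network predicts both a denoised value and a log-variance for each pixel. The sketch below is illustrative only (the function name and toy values are not from the project):

```python
import numpy as np

def gaussian_nll(pred_mean, pred_log_var, target):
    """Heteroscedastic Gaussian negative log-likelihood (illustrative).

    The network predicts both a denoised mean and a log-variance per
    pixel; predicting the log-variance keeps the variance positive and
    numerically stable. Pixels the model is uncertain about get a large
    predicted variance, which down-weights their squared error, while
    the log-variance term penalizes claiming uncertainty everywhere.
    """
    inv_var = np.exp(-pred_log_var)
    sq_err = (target - pred_mean) ** 2
    return 0.5 * np.mean(inv_var * sq_err + pred_log_var)

# Toy check with a single pixel: mean 0.0, unit variance (log-var 0.0),
# observed value 0.5 -> loss = 0.5 * (0.25 + 0.0) = 0.125
loss = gaussian_nll(np.array([0.0]), np.array([0.0]), np.array([0.5]))
```

At inference time, the predicted variance itself serves as the per-pixel confidence measure the abstract refers to, with no ground truth required.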

Intern: Abdelrahman Abdelhamed
Faculty Supervisor: Michael Brown
Province: Ontario