Bilevel Learning of L1 Regularizers with Closed-form Gradients (BLORC)

By Avrajit Ghosh (previously worked on by Mike McCann)

Posted by SLIM on November 10, 2022 · 1 min read

Abstract

We present a method for supervised learning of sparsity-promoting regularizers, which are a key ingredient in many modern signal reconstruction approaches. The parameters of the regularizer are learned to minimize the mean squared error of reconstruction on a training set of ground-truth signal and measurement pairs. Training involves solving a challenging bilevel optimization problem with a nonsmooth lower-level objective.
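Concretely, for denoising with a learned sparsifying operator $W$, the training problem takes roughly the following form (a paraphrase of the setup in the papers below, not their exact formulation):

$$
\min_{W}\;\sum_{i}\big\|x^{\ast}(y_i; W) - x_i\big\|_2^2
\quad\text{subject to}\quad
x^{\ast}(y; W) \;=\; \arg\min_{x}\;\tfrac{1}{2}\,\|x - y\|_2^2 \;+\; \|W x\|_1 .
$$

The nonsmoothness of the $\ell_1$ term in the lower-level objective is what makes the gradient of the upper-level loss with respect to $W$ nontrivial to compute.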

We derive an expression for the gradient of the training loss using the implicit closed-form solution of the lower-level variational problem given by its dual problem, and provide an accompanying gradient descent algorithm (dubbed BLORC) to minimize the loss. Our experiments on denoising simple natural images and 1D signals show that the proposed method learns meaningful operators and that the analytical gradients are computed faster than with standard automatic differentiation. While the approach presented here is applied to denoising, we believe it can be adapted to a wide variety of inverse problems with linear measurement models.
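To make the setup concrete, here is a minimal sketch (not the authors' released code) of bilevel learning of an analysis $\ell_1$ regularizer for 1D denoising. The lower-level problem is solved through its dual, and the upper-level gradient is obtained by automatic differentiation through the unrolled dual solver, i.e., the baseline that BLORC's analytical gradient is compared against. The signal length, noise level, solver iteration counts, and initialization are illustrative assumptions.

```python
# Sketch of bilevel learning of an analysis L1 regularizer for 1D denoising.
# Lower-level problem:  x*(y; W) = argmin_x 0.5*||x - y||^2 + ||W x||_1,
# solved through its dual:  u* = argmin_{||u||_inf <= 1} 0.5*||y - W^T u||^2,
# with x* = y - W^T u*.  Upper-level gradient: autodiff through the unrolled solver.
import torch

torch.manual_seed(0)
n = 64                      # signal length (assumed for this toy example)
sigma = 0.1                 # noise standard deviation (assumed)


def lower_level_solve(y, W, iters=100):
    """Differentiable projected-gradient solve of the dual of the L1 denoising problem."""
    # Step size from the (detached) spectral norm of W, kept out of the autodiff graph.
    step = 1.0 / (torch.linalg.matrix_norm(W, ord=2).detach() ** 2 + 1e-8)
    u = torch.zeros(W.shape[0])
    for _ in range(iters):
        grad = W @ (W.t() @ u - y)                      # gradient of 0.5*||y - W^T u||^2
        u = torch.clamp(u - step * grad, -1.0, 1.0)     # project onto the unit inf-ball
    return y - W.t() @ u                                # primal denoised signal x*


# One synthetic training pair: a piecewise-constant signal and its noisy measurement.
x_gt = torch.zeros(n)
x_gt[20:45] = 1.0
y = x_gt + sigma * torch.randn(n)

# Learnable analysis operator, initialized near a scaled finite-difference operator.
D = (torch.diag(torch.ones(n - 1), diagonal=1) - torch.eye(n))[:-1]
W = torch.nn.Parameter(0.2 * D + 0.01 * torch.randn(n - 1, n))

opt = torch.optim.Adam([W], lr=1e-3)
for it in range(200):
    opt.zero_grad()
    x_hat = lower_level_solve(y, W)                     # solve the lower-level problem
    loss = torch.mean((x_hat - x_gt) ** 2)              # upper-level reconstruction MSE
    loss.backward()                                     # gradient through the unrolled solver
    opt.step()
    if it % 50 == 0:
        print(f"iter {it:3d}  train MSE {loss.item():.5f}")
```

BLORC replaces the backpropagation through the unrolled solver with an analytical expression for the same gradient, which is what makes the gradient computation faster in the reported experiments.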

Demo Image

References

A. Ghosh, M. T. McCann and S. Ravishankar, "Bilevel Learning of ℓ1 Regularizers with Closed-Form Gradients (BLORC)," ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, Singapore, 2022, pp. 1491-1495, doi: 10.1109/ICASSP43922.2022.9747201.

A. Ghosh, M. T. McCann, M. Mitchell and S. Ravishankar, "Learning Sparsity-Promoting Regularizers using Bilevel Optimization," arXiv preprint arXiv:2207.08939, 2022 (to appear in SIAM Journal on Imaging Sciences).