Abstract. Regularization plays a key role in a variety of optimization formulations of inverse problems. A recurring question in regularization approaches is the selection of regularization parameters, and its effect on the solution and on the optimal value of the optimization problem. The sensitivity of the value function to the regularization parameter can be linked directly to the Lagrange multipliers. In this paper, we fully characterize the variational properties of the value functions for a broad class of convex formulations, which are not all covered by standard Lagrange multiplier theory. We also present an inverse function theorem that links the value functions of different regularization formulations (not necessarily convex). These results have implications for the selection of regularization parameters, and the development of specialized algorithms. We give numerical examples that illustrate the theoretical results.
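To make the abstract's sensitivity claim concrete, here is one well-known instance (the LASSO / Pareto-curve setting behind SPGL1; the formulas below are an illustration from that setting, not quoted from the paper above). The value function is

\[
  \phi(\tau) \;=\; \min_{x}\ \|Ax - b\|_2 \quad \text{subject to} \quad \|x\|_1 \le \tau,
\]

and, whenever the optimal residual $r_\tau = b - Ax_\tau$ is nonzero, its derivative equals minus the Lagrange multiplier of the one-norm constraint,

\[
  \phi'(\tau) \;=\; -\lambda_\tau \;=\; -\frac{\|A^{\mathsf T} r_\tau\|_\infty}{\|r_\tau\|_2},
\]

which is exactly the kind of link between value-function sensitivity and Lagrange multipliers that the paper characterizes in much greater generality.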
The software used is SPGL1, and the example provided in the paper is here. SPGL1 can now be forked on GitHub.
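For readers who want to see the value-function/multiplier relation numerically, here is a minimal NumPy sketch. It does not use SPGL1 itself; the projection routine and the plain projected-gradient loop below are illustrative stand-ins, and the problem data are synthetic.

```python
# Minimal NumPy sketch (not SPGL1): trace the LASSO value function
# phi(tau) = min ||Ax - b||_2 subject to ||x||_1 <= tau, and check numerically
# that its slope matches -||A^T r||_inf / ||r||_2, i.e. minus the Lagrange multiplier.
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto the l1 ball of radius tau."""
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    cssv = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > cssv - tau)[0][-1]
    theta = (cssv[rho] - tau) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lasso_value(A, b, tau, iters=20000, xstart=None):
    """Projected gradient for min ||Ax - b||_2 s.t. ||x||_1 <= tau; returns (phi, x, r)."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of grad 0.5||Ax - b||^2
    x = np.zeros(A.shape[1]) if xstart is None else project_l1_ball(xstart, tau)
    for _ in range(iters):
        r = b - A @ x
        x = project_l1_ball(x + (A.T @ r) / L, tau)
    r = b - A @ x
    return np.linalg.norm(r), x, r

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
xtrue = np.zeros(100)
xtrue[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
b = A @ xtrue

tau, dtau = 0.5 * np.abs(xtrue).sum(), 1e-3
phi, x, r = lasso_value(A, b, tau)
phi_plus, _, _ = lasso_value(A, b, tau + dtau, xstart=x)   # warm-started perturbed solve
slope_fd = (phi_plus - phi) / dtau                                  # finite-difference slope
slope_lm = -np.linalg.norm(A.T @ r, np.inf) / np.linalg.norm(r)     # multiplier formula
print(f"finite-difference slope: {slope_fd:.4f}   multiplier formula: {slope_lm:.4f}")
```

The two printed slopes should agree to a few digits, which is the sensitivity relation exploited by root-finding strategies for selecting the regularization parameter.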