After attending the Teratec Forum and watching how some contractors are beginning to use HPC to test full-scale designs of their products (a proxy for that use shows in the exponential growth of teraflops devoted to their internal computations, in the figure below), I wonder whether some of these computations could benefit from the new techniques developed below.
You may recall this Sunday Morning Insight entry on Physics Driven Sensor Design? Well, it looks like some of the authors are making progress toward using the constraint of sparsity in field equations, and they find that the analysis approach does improve the solver in terms of measurements, memory size, and runtime. What's not to like?
Physics-driven inverse problems made tractable with cosparse regularization by Srđan Kitić, Laurent Albera, Nancy Bertin, Rémi Gribonval
Sparse data models are powerful tools for solving ill-posed inverse problems. We present a regularization framework based on the sparse synthesis and sparse analysis models for problems governed by linear partial differential equations. Although nominally equivalent, we show that the two models differ substantially from a computational perspective: unlike the sparse synthesis model, its analysis counterpart has much better scaling capabilities and can indeed be faster when more measurement data is available. Our findings are illustrated on two examples, sound source localization and brain source localization, which also serve as showcases for the regularization framework. To address this type of inverse problem, we develop a specially tailored convex optimization algorithm based on the Alternating Direction Method of Multipliers.
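To give a flavor of what the abstract describes, here is a minimal, textbook-style sketch of ADMM applied to an analysis-sparse (cosparse) regularization of a linear inverse problem: minimize 0.5||Mx - y||² + λ||Ωx||₁, where M is the measurement operator and Ω the analysis operator. This is not the authors' specially tailored algorithm from the paper, just the generic splitting z = Ωx; the operators, toy data (a piecewise-constant signal with a first-difference analysis operator), and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise soft-thresholding: the proximal operator of the l1 norm
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_analysis(M, Omega, y, lam=0.1, rho=1.0, n_iter=500):
    """Generic ADMM sketch for the analysis (cosparse) problem
        min_x 0.5*||M x - y||^2 + lam*||Omega x||_1
    using the splitting z = Omega x (NOT the paper's tailored solver)."""
    z = np.zeros(Omega.shape[0])
    u = np.zeros(Omega.shape[0])  # scaled dual variable
    # the x-update solves a fixed linear system each iteration
    A = M.T @ M + rho * Omega.T @ Omega
    b0 = M.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, b0 + rho * Omega.T @ (z - u))
        Ox = Omega @ x
        z = soft_threshold(Ox + u, lam / rho)  # z-update: prox of the l1 term
        u = u + Ox - z                         # dual ascent step
    return x

# toy demo: a piecewise-constant signal, sparse under a difference operator
rng = np.random.default_rng(0)
n, m = 50, 30
x_true = np.concatenate([np.ones(25), 3 * np.ones(25)])
Omega = (np.eye(n) - np.eye(n, k=1))[:-1]   # first-difference analysis operator
M = rng.standard_normal((m, n)) / np.sqrt(m)
y = M @ x_true
x_hat = admm_analysis(M, Omega, y, lam=0.01, rho=1.0)
```

Note that the analysis operator Ω enters the iterations only through matrix-vector products and one (precomputable) linear solve, which hints at why the analysis formulation can scale better as the amount of measurement data grows.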
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.