You probably recall this entry on Re-Weighted l_1 Dynamic Filtering for Time-Varying Sparse Signal Estimation. Adam Charles just sent me an update on the implementation:
Hi Igor,
I promised some code a few weeks ago when I emailed you about our recently submitted paper "Re-Weighted l_1 Dynamic Filtering for Time-Varying Sparse Signal Estimation". Well, the code is ready and I posted it on my website: http://users.ece.gatech.edu/~acharles6/downloads.html. The MATLAB code there performs both the re-weighted l1 dynamic filtering from the recently submitted paper and the basis pursuit de-noising dynamic filtering from an earlier CISS paper, "Sparsity penalties in dynamical system estimation" (both papers are also available on my website).
Cheers,
-Adam
Thanks Adam! We covered Re-Weighted l_1 Dynamic Filtering for Time-Varying Sparse Signal Estimation but not Sparsity penalties in dynamical system estimation by Adam Charles, M. Salman Asif, Justin Romberg, and Christopher Rozell. The abstract reads:
In this work we address the problem of state estimation in dynamical systems using recent developments in compressive sensing and sparse approximation. We formulate the traditional Kalman filter as a one-step update optimization procedure which leads us to a more unified framework, useful for incorporating sparsity constraints. We introduce three combinations of two sparsity conditions (sparsity in the state and sparsity in the innovations) and write recursive optimization programs to estimate the state for each model. This paper is meant as an overview of different methods for incorporating sparsity into the dynamic model, a presentation of algorithms that unify the support and coefficient estimation, and a demonstration that these suboptimal schemes can actually show some performance improvements (either in estimation error or convergence time) over standard optimal methods that use an impoverished model.
Credit: NASA/ESA
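To give a flavor of what this kind of code computes, here is a minimal sketch (mine, not Adam's MATLAB implementation) of a single sparsity-penalized filtering step: the state estimate trades off fidelity to the new measurement, an l1 penalty encouraging a sparse state, and a quadratic term keeping the estimate close to the dynamics prediction. The weights lam and kappa and the plain ISTA solver are illustrative choices, not taken from the papers.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding (prox operator of the l1 norm)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_dynamic_step(y, G, x_prev, F, lam=0.1, kappa=1.0, n_iter=200):
    """One hypothetical filtering step: fit the new measurement y,
    keep the state sparse (l1 penalty, weight lam), and stay close to
    the dynamics prediction F @ x_prev (quadratic penalty, weight kappa).
    Solved with plain ISTA; all parameter names here are illustrative."""
    x_pred = F @ x_prev                       # dynamics prediction
    x = x_pred.copy()
    L = np.linalg.norm(G, 2) ** 2 + kappa     # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = G.T @ (G @ x - y) + kappa * (x - x_pred)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny usage example on synthetic data
rng = np.random.default_rng(0)
n, m = 50, 20
F = np.eye(n)                                 # trivial (identity) dynamics
G = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
x_true = np.zeros(n); x_true[[3, 17, 42]] = [1.0, -2.0, 1.5]
y = G @ x_true + 0.01 * rng.standard_normal(m)
x_hat = sparse_dynamic_step(y, G, x_prev=np.zeros(n), F=F)
```

Filtering a whole time series would simply repeat this step at each time index, feeding each estimate back in as x_prev.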
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.