One of Suresh's entries led me to Haesun Park's page, which presents some efficient NMF codes, including sparse NMF solvers.
Next, I found the following: Modified-CS-Residual for Recursive Reconstruction of Highly Undersampled Functional MRI Sequences by Wei Lu, Taoran Li, Ian C. Atkinson, and Namrata Vaswani. The abstract reads:
In this work, we study the application of compressive sensing (CS) based approaches for blood oxygenation level dependent (BOLD) contrast functional MR imaging (fMRI). In particular, we show, via exhaustive experiments on actual MR scanner data for brain fMRI, that our recently proposed approach for recursive reconstruction of sparse signal sequences, modified-CS-residual, outperforms other existing CS based approaches. Modified-CS-residual exploits the fact that the sparsity pattern of brain fMRI sequences and their signal values change slowly over time. It provides a fast, yet accurate, reconstruction approach that is able to accurately track the changes of the active pixels, while using only about 30% measurements per frame. Significantly improved performance over existing work is shown in terms of practically relevant metrics such as active pixel time courses, activation maps and receiver operating characteristic (ROC) curves.
One can find the Modified-CS webpage here and Modified-CS code here. The following documents are also of interest:
- Recursive Sparse Recovery and Applications in Dynamic Imaging by Namrata Vaswani
- Particle Filtered Modified-CS (PaFiMoCS) for tracking signal sequences by Samarjit Das and Namrata Vaswani
- PhD thesis of Samarjit Das: Particle Filtering on Large Dimensional State Spaces and Applications in Computer Vision
Eventually one wonders how Modified-CS compares to an MMV approach, as Zhilin Zhang did for KF-CS in Is MMV more suitable for dynamic environment (time-varying sparsity) than KF-CS and LS-CS? Part 1, Part 2 and Part 3.
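The key modified-CS idea above — penalize sparsity only outside the support T carried over from the previous frame — can be sketched with a weighted ISTA iteration. This is my own toy illustration under made-up names and parameters, not the authors' code (which solves the constrained problem with a different solver):

```python
import numpy as np

def modified_cs_ista(y, A, known_support, lam=0.02, n_iter=3000):
    """Weighted ISTA sketch of the modified-CS idea: the l1 penalty is
    applied only outside the support T known from the previous frame."""
    n = A.shape[1]
    w = np.ones(n)
    w[list(known_support)] = 0.0         # no l1 penalty on the known support
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the data term
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L                              # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)  # soft-threshold
    return x

# Toy frame: 5 active pixels, 40% measurements, support known up to one new entry
rng = np.random.default_rng(0)
n, m = 100, 40
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[3, 17, 42, 55, 80]] = 2.0
y = A @ x_true
x_hat = modified_cs_ista(y, A, known_support={3, 17, 42, 55})
```

With most of the support known, far fewer measurements suffice than plain l1 minimization would need, which is the point of the recursive scheme.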
The next two papers reminded me of two papers by Dick Gordon.
The first one reminded me of his On Monte Carlo algebra paper, where sampling is performed on algebraic expressions. In the following paper, the analytical expressions are sparse in a large combinatorial set of algebraic expressions: Predicting catastrophes in nonlinear dynamical systems by compressive sensing by Wen-Xu Wang, Rui Yang, Ying-Cheng Lai, Vassilios Kovanis, and Celso Grebogi. The abstract reads:
An extremely challenging problem of significant interest is to predict catastrophes in advance of their occurrences. We present a general approach to predicting catastrophes in nonlinear dynamical systems under the assumption that the system equations are completely unknown and only time series reflecting the evolution of the dynamical variables of the system are available. Our idea is to expand the vector field or map of the underlying system into a suitable function series and then to use the compressive-sensing technique to accurately estimate the various terms in the expansion. Examples using paradigmatic chaotic systems are provided to demonstrate our idea.
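The recipe in the abstract — expand the unknown map in a function series and estimate the few nonzero coefficients from a time series — can be illustrated on a toy logistic map. This sketch substitutes a simple sequential thresholded least-squares solver for the paper's CS machinery; the dictionary and threshold are my own choices:

```python
import numpy as np

# Toy time series from the (unknown to the solver) map x_{t+1} = r*x_t*(1 - x_t)
r = 3.7
x = np.empty(200)
x[0] = 0.3
for t in range(199):
    x[t + 1] = r * x[t] * (1.0 - x[t])

# Candidate "function series": monomials 1, x, x^2, x^3
X = x[:-1]
Phi = np.column_stack([np.ones_like(X), X, X**2, X**3])
target = x[1:]

# Sequential thresholded least squares: fit, zero out small terms, refit
coef = np.linalg.lstsq(Phi, target, rcond=None)[0]
for _ in range(10):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    coef[~small] = np.linalg.lstsq(Phi[:, ~small], target, rcond=None)[0]
```

The recovered coefficient vector is sparse, with only the x and x^2 terms surviving at values r and -r, i.e., the map has been identified from data alone.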
The second one reminded me of his well-known paper introducing ART:
Convergence Analysis of SART by Bregman Iteration and Dual Gradient Descent by Ming Yan. The abstract reads:
Simultaneous algebraic reconstruction technique (SART) [1, 2] is an iterative method for solving inverse problems of the form Ax = b. This type of problems arises for example in computed tomography reconstruction, when A is obtained from the discrete Radon transform. In this paper, we provide two new methods for the derivation of the SART method. Using these, we also prove in two different ways the convergence of the simultaneous algebraic reconstruction technique. The first approach uses the linearized Bregman iteration, while the second approach uses the dual gradient descent method. These novel proofs can be applied to other Landweber-like schemes such as Cimmino's algorithm and component averaging (CAV). Furthermore the noisy case is considered and error estimate is given. Several numerical experiments for computed tomography are provided to demonstrate the convergence results in practice.
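For reference, the SART update whose convergence the paper analyzes can be written as x ← x + λ V⁻¹Aᵀ W (b − Ax), with V the diagonal of column sums of |A| and W the diagonal of inverse row sums. A minimal numpy sketch (my own toy illustration on a random consistent system, not the paper's code):

```python
import numpy as np

def sart(A, b, n_iter=5000, lam=1.0):
    """SART iteration x <- x + lam * V^{-1} A^T W (b - A x), where
    V = diag(column sums of |A|) and W = diag(1 / row sums of |A|)."""
    Aabs = np.abs(A)
    v_inv = 1.0 / Aabs.sum(axis=0)      # inverse column sums
    w = 1.0 / Aabs.sum(axis=1)          # inverse row sums
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + lam * v_inv * (A.T @ (w * (b - A @ x)))
    return x

# Consistent toy system with a nonnegative A, as in tomography:
# SART should drive the residual to zero
rng = np.random.default_rng(1)
A = rng.random((20, 5)) + 0.1
x_true = rng.random(5)
b = A @ x_true
x_hat = sart(A, b)
```

The row/column normalizations are what distinguish SART from plain Landweber iteration and are also what make the Bregman and dual-gradient-descent viewpoints in the paper applicable.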