Thursday, January 15, 2009

CS: Q/A, Accelerating SENSE Using CS, Regularized SENSE thru Bregman Iterations, SparseSENSE, Bregmanized Nonlocal Regularization for Deconvolution

In light of Monday's post featuring Michael Lustig's presentation on the combination of Parallel Imaging and Compressive Sensing, here are the comments made on that post, other preprints/papers on the same topic, and finally a paper from UCLA. In the comment section of Monday's entry, one can read an anonymous commenter asking:


I wonder how random sampling is better than Poisson-disc sampling in the case of a single coil from a CS point of view, since we have local high- and low-density areas in the case of random sampling?

Miki responded with:
I am not yet sure if random sampling is better. Random sampling has uniform coherence (or lack of) between pixels. Poisson disc has higher coherence between pixels at a distance larger than the disc radius, and lower coherence with closer pixels.
Then Davide, another semi-anonymous reader asked:
Which method are you using to generate the Poisson-disc sampling pattern, as it is quite computationally expensive?
Miki kindly responded with:
I used dart throwing using the code available at: 
The computational complexity is negligible. We generated a very large Poisson-disc sampling pattern (supporting a Nyquist-rate equivalent of a 1024x1024 matrix). Whenever we need to scan, we crop the sampling pattern and stretch it according to the desired field of view and resolution. This is implemented on an MRI scanner and has no delay whatsoever. Originally, I think it took about 20 minutes to generate that sampling pattern, but we did it only once.
Thanks Miki!
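For readers who want to play with this, here is a minimal Python sketch of a naive dart-throwing Poisson-disc sampler on the unit square (my own illustration, not Miki's code; the point count and radius below are made-up values). A pattern like this would then be cropped and stretched to the prescribed field of view and resolution, as Miki describes above.

import numpy as np

def dart_throwing_poisson_disc(n_points, radius, max_tries=200000):
    # Naive dart throwing: draw uniform candidates in the unit square and keep
    # a candidate only if it is at least `radius` away from every accepted point.
    points = []
    tries = 0
    while len(points) < n_points and tries < max_tries:
        candidate = np.random.rand(2)
        if all(np.linalg.norm(candidate - p) >= radius for p in points):
            points.append(candidate)
        tries += 1
    return np.array(points)

# Illustrative parameters only; as described above, a large pattern is generated
# once offline and then reused (cropped/stretched) for every scan.
points = dart_throwing_poisson_disc(n_points=1000, radius=0.02)
print(points.shape)

The quadratic rejection test is what makes the one-time generation slow; smarter schemes (spatial hashing, Bridson's algorithm) would speed it up, but since the pattern is computed only once this hardly matters in practice.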

The other papers/preprints I found are: Accelerating SENSE Using Compressed Sensing (also viewable here) by Dong Liang, Bo Liu, JiunJie Wang, and Leslie Ying. The abstract reads:
Both parallel magnetic resonance imaging (pMRI) and compressed sensing (CS) are emerging techniques to accelerate conventional MRI by reducing the number of acquired data. The combination of pMRI and CS for further acceleration is of great interest. In this paper, we propose two methods to combine SENSE, one of the standard methods for pMRI, and SparseMRI, a recently proposed method for CS-MRI with Cartesian trajectories. The first method, named SparseSENSE, directly formulates the reconstruction from multi-channel reduced k-space data as the same nonlinear convex optimization problem as SparseMRI, except that the encoding matrix is the Fourier transform of the channel-specific sensitivity modulation. The second method, named CS-SENSE, first employs SparseMRI to reconstruct a set of aliased reduced-field-of-view images in each channel, and then applies Cartesian SENSE to reconstruct the final image. The results from simulations, phantom and in vivo experiments demonstrate that both SparseSENSE and CS-SENSE can achieve a reduction factor higher than those achieved by SparseMRI and SENSE individually, and CS-SENSE outperforms SparseSENSE in most cases.
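To make the encoding matrix mentioned in this abstract a bit more concrete, here is a small Python sketch (my own illustration, not the authors' code; array shapes and FFT conventions are assumptions on my part) of the multi-coil forward operator and its adjoint: the image is weighted by each coil sensitivity, Fourier transformed, and undersampled with a common k-space mask.

import numpy as np

def sense_cs_forward(image, sens_maps, mask):
    # image:     (ny, nx) complex-valued object
    # sens_maps: (ncoils, ny, nx) coil sensitivity maps
    # mask:      (ny, nx) binary k-space undersampling mask
    coil_images = sens_maps * image[None, :, :]
    kspace = np.fft.fft2(coil_images, axes=(-2, -1), norm='ortho')
    return mask[None, :, :] * kspace

def sense_cs_adjoint(kspace, sens_maps, mask):
    # Adjoint of the operator above, the basic building block of any
    # iterative CS/SENSE reconstruction.
    coil_images = np.fft.ifft2(mask[None, :, :] * kspace, axes=(-2, -1), norm='ortho')
    return np.sum(np.conj(sens_maps) * coil_images, axis=0)

Plugging this pair of operators into an l1-minimization solver, in place of the single-coil undersampled Fourier transform, is essentially what the combination of CS and SENSE amounts to.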

Also of interest, in light of yesterday's entry: Regularized SENSE Reconstruction Using Bregman Iterations by Bo Liu, Kevin King, Michael Steckner, Jun Xie, Jinhua Sheng and Leslie Ying. The abstract reads:
In parallel imaging, the signal-to-noise ratio of SENSE reconstruction is usually degraded by the ill-conditioning problem, which becomes especially serious at large acceleration factors. Existing regularization methods have been shown to alleviate the problem. However, they usually suffer from image artifacts at high acceleration factors due to the large data inconsistency resulting from heavy regularization. In this paper, we propose Bregman iteration for SENSE regularization. Unlike the existing regularization methods where the regularization function is fixed, the method adaptively updates the regularization function using the Bregman distance at different iterations, such that the iteration gradually removes the aliasing artifacts and recovers fine structures before the noise finally comes back. With a discrepancy principle as the stopping criterion, our results demonstrate that the reconstructed image using Bregman iteration preserves both sharp edges lost in Tikhonov regularization and fine structures missed in total variation regularization, while reducing more noise and aliasing artifacts.
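For readers unfamiliar with Bregman iteration, the generic recipe in its usual "add back the residual" form is given below (my own summary; see the paper for the exact SENSE formulation and the discrepancy-principle stopping rule). With A the SENSE encoding matrix, b the acquired k-space data and J the regularizer (Tikhonov, total variation, ...):

b^{0} = 0, \quad u^{0} = 0, \qquad b^{k+1} = b + \left( b^{k} - A u^{k} \right), \qquad u^{k+1} = \arg\min_{u} \; \mu J(u) + \tfrac{1}{2} \| A u - b^{k+1} \|_2^2 .

Because the data the subproblem is fitted to changes from one iteration to the next, the effective regularization is relaxed progressively: edges and fine structures come back first and noise last, which is why a stopping rule such as the discrepancy principle is needed.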
Finally, I don't know why I did not cover it before, but here it is: SparseSENSE: Randomly-Sampled Parallel Imaging using Compressed Sensing by Bo Liu, Florian Sebert, Yi Ming Zou, and Leslie Ying. The introduction reads:
Since the advent of Compressed Sensing (CS) (1), much effort has been made to apply this new concept to various applications (2,3). The most desirable property of CS in the MRI application is that it allows sampling of k-space well below the Nyquist sampling rate, while still being able to reconstruct the image if certain conditions are satisfied. Recent work (4,5) applied CS to reduce scanning time in conventional Fourier imaging and demonstrated impressive results. In this abstract, we investigate the structure of the parallel imaging encoding matrix, and apply CS to parallel imaging to achieve an even higher reduction in scanning time than what can be achieved by each individual method alone. Our experiments show that the combined method, named SparseSENSE, can achieve a reduction factor higher than the number of channels.
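In other words (my paraphrase, with notation assumed rather than taken from the abstract), SparseSENSE solves the same kind of constrained l1 problem as SparseMRI, except that the undersampled Fourier operator is replaced by the sensitivity-weighted encoding of all the channels:

\min_{x} \; \| \Psi x \|_1 \quad \text{subject to} \quad \sum_{c=1}^{C} \| F_u S_c x - y_c \|_2^2 \leq \epsilon^2 ,

where \Psi is a sparsifying transform, S_c the sensitivity map of channel c, F_u the undersampled Fourier operator, and y_c the reduced k-space data of channel c; a total variation term can be added to the objective as in SparseMRI.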
Let us note that:
The proposed reconstruction algorithm is computationally intensive, with a running time of 45 minutes for the phantom data on a 2.8GHz CPU/512MB RAM PC.

And therefore it makes sense to consider multicore CPUs or GPUs to speed up these computations, as the UCLA folks did for a Bregman-based algorithm.

While we are talking about UCLA, here is a tech report from these guys entitled Bregmanized Nonlocal Regularization for Deconvolution and Sparse Reconstruction by Xiaoqun Zhang, Martin Burger, Xavier Bresson, and Stanley Osher. The abstract reads:
We propose two algorithms based on Bregman iteration and operator splitting technique for nonlocal TV regularization problems. The convergence of the algorithms is analyzed and applications to deconvolution and sparse reconstruction are presented.
I note at the end of the paper that:
As expected, the standard TV regularization is not capable of recovering the texture patterns present in these images. The results based on wavelets are obtained by using a daubqf(8) wavelet with maximum decomposition level and an optimal thresholding parameter chosen over a wide range. Since there is no noise considered in these two examples, we solve the equality-constrained problem by activating the continuation option in the GPSR code. The nonlocal regularization schemes with Bregman iteration (BOS/PBOS) achieve the best reconstruction results. Surprisingly, with only a few measurements, the image textures are almost perfectly reconstructed by the nonlocal TV regularization. This is because image structures are expressed implicitly in the nonlocal weight function, and the nonlocal regularization process with Bregman iteration provides an efficient way to recover textures without explicitly constructing a basis.
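Since the nonlocal weight function is doing most of the work in that last quote, here is a small illustrative Python sketch (not the authors' code; patch size, search window and the filtering parameter h are made-up values) of nonlocal-means-style weights between one pixel and its search neighborhood, computed as a Gaussian of patch distances. Nonlocal TV then penalizes the weighted differences u(y) - u(x) built from such weights.

import numpy as np

def nonlocal_weights(image, i, j, patch=5, search=11, h=0.05):
    # Weights between pixel (i, j) and every pixel in its search window:
    # exp(-d / h^2), where d is the squared distance between the two patches.
    r, s = patch // 2, search // 2
    padded = np.pad(image, r + s, mode='reflect')
    ic, jc = i + r + s, j + r + s
    ref = padded[ic - r:ic + r + 1, jc - r:jc + r + 1]
    w = np.zeros((search, search))
    for di in range(-s, s + 1):
        for dj in range(-s, s + 1):
            nbr = padded[ic + di - r:ic + di + r + 1, jc + dj - r:jc + dj + r + 1]
            w[di + s, dj + s] = np.exp(-np.sum((ref - nbr) ** 2) / h ** 2)
    return w / w.sum()

# Illustrative usage on random data.
img = np.random.rand(64, 64)
w = nonlocal_weights(img, 32, 32)
print(w.shape)

In a reconstruction setting the weights have to be computed from a current (or preliminary) estimate of the image, which is how the image structures end up encoded implicitly in the regularizer.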
Stanley Osher tells me that they intend to make their implementation available at some point.

And by the way, good luck Sina

Credit: NASA, Apollo 11 LEM
