Phil Schniter sent me the following note yesterday regarding the recent arXiv preprint Approximate Message Passing under Finite Alphabet Constraints by Andreas Muller, Dino Sejdinovic, and Robert Piechocki:
"..hi igor,Regarding the paper "Approximate Message Passing under Finite Alphabet Constraints", it's a great idea to exploit such prior info when available. for fairness, your readers may be be interested to hear that finite-alphabet priors have been part of the GAMPmatlab package (http://gampmatlab.wikia.com/wiki/Generalized_Approximate_Message_Passing) since its inception. moreover, such priors have been used with GAMP to do joint decoding and channel estimation in our work http://www2.ece.ohio-state.edu/~schniter/pdf/jstsp11_ofdm.pdf.cheers,phil..."
Thanks Phil. You all probably remember when Phil schooled me on alphabet issues (A Small Q&A with Phil Schniter on TurboGAMP).
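For readers curious about what a finite-alphabet prior buys you inside (G)AMP, here is a minimal Python sketch, not code from GAMPmatlab and with a function name of my own choosing, of the scalar posterior-mean denoiser such a prior induces: given a pseudo-observation r = x + N(0, tau) with x drawn uniformly from a finite alphabet, it returns E[x|r] and Var[x|r], the two quantities an AMP/GAMP iteration needs from its input channel.

```python
import numpy as np

def finite_alphabet_denoiser(r, tau, alphabet):
    """Posterior mean/variance of x given r = x + N(0, tau), x uniform on a finite alphabet."""
    r = np.asarray(r, dtype=float)[..., None]      # shape (..., 1) to broadcast against symbols
    a = np.asarray(alphabet, dtype=float)          # shape (A,)
    logw = -(r - a) ** 2 / (2.0 * tau)             # Gaussian log-likelihood of each symbol (uniform prior)
    logw -= logw.max(axis=-1, keepdims=True)       # subtract the max for numerical stability
    w = np.exp(logw)
    w /= w.sum(axis=-1, keepdims=True)             # posterior probabilities p(x = a | r)
    mean = (w * a).sum(axis=-1)                    # E[x | r]
    var = (w * a**2).sum(axis=-1) - mean**2        # Var[x | r]
    return mean, var

# example: BPSK alphabet {-1, +1}; in this case E[x|r] reduces to tanh(r / tau)
mean, var = finite_alphabet_denoiser(r=[0.3, -1.2], tau=0.5, alphabet=[-1.0, 1.0])
print(mean, var)
```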
I came across this very nice example in the Python-based scikit-learn package: Compressive sensing: tomography reconstruction with L1 prior (Lasso). Let us also recall that the ASPICS toolbox is another compressive sensing solver written in Python.
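To give a flavor of the idea behind that example, here is a minimal sketch of my own toy setup (not the scikit-learn tomography demo itself, and the regularization weight below is arbitrary): recover a sparse vector from a handful of random Gaussian measurements by treating the sensing matrix as the design matrix of a Lasso regression.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10                       # signal length, number of measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true                                  # compressive measurements

# L1-regularized least squares; alpha would be tuned (e.g. by cross-validation) in practice
lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
lasso.fit(A, y)
x_hat = lasso.coef_

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```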
Talking about Phil and ASPICS reminded me that I drew a graph of the recent events in compressive sensing in 2011. Both Phil and the folks behind ASPICS (Florent Krzakala, Marc Mézard, François Sausset, Yifan Sun, Lenka Zdeborová) played no small part in these improvements. I don't know if it came out right, but here it is:
Anna Gilbert has new entries on her two recent lectures on compressive sensing.
Finally, on Quora I asked: In the Qualcomm X Prize, what should the 15 diseases be for an ultrasound-based Tricorder?