Vlad just sent me the following:
Hi Igor,
I am writing regarding two new results which you may find of interest; they are joint work with Paul Hand.
The first is an elementary proof that PhaseMax, a linear program in the natural parameter space, achieves noiseless phase retrieval when coupled with the standard spectral initializer. This is in stark contrast to the highly technical approaches of Candes et al. and Wright et al., which use non-convex programs toward the same goals, and it is simpler still than the geometric probability/VC dimension arguments of the originators of PhaseMax.
The second is a result of a new kind, which faithfully connects compressed sensing and phase retrieval in the natural parameter space. Specifically, we establish that given an anchor vector that correlates with a sparse vector x_0 and O(k log n) i.i.d. Gaussian magnitude measurements of x_0, a linear program in the natural parameter space called SparsePhaseMax recovers x_0 w.h.p., provided that x_0 is sufficiently sparse. This result also establishes a curious connection between phase retrieval and 1-bit compressed sensing.
Best,
-Vlad
Vladislav Voroninski
http://www.mit.edu/~vvlad/
Thanks Vlad ! Here are the two papers:
An Elementary Proof of Convex Phase Retrieval in the Natural Parameter Space via the Linear Program PhaseMax by Paul Hand, Vladislav Voroninski
The phase retrieval problem has garnered significant attention since the development of the PhaseLift algorithm, which is a convex program that operates in a lifted space of matrices. Because of the substantial computational cost due to lifting, many approaches to phase retrieval have been developed, including non-convex optimization algorithms which operate in the natural parameter space, such as Wirtinger Flow. Very recently, a convex formulation called PhaseMax has been discovered, and it has been proven to achieve phase retrieval via linear programming in the natural parameter space under optimal sample complexity. The current proofs of PhaseMax rely on statistical learning theory or geometric probability theory. Here, we present a short and elementary proof that PhaseMax exactly recovers real-valued vectors from random measurements under optimal sample complexity. Our proof only relies on standard probabilistic concentration and covering arguments, yielding a simpler and more direct proof than those that require statistical learning theory, geometric probability or the highly technical arguments for Wirtinger Flow-like approaches.
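As a concrete illustration of the program the abstract describes, here is a minimal sketch of real-valued PhaseMax set up with CVXPY. The formulation itself (maximize correlation with an anchor vector subject to the magnitude constraints) is the published PhaseMax program; the function name, the CVXPY solver choice, and the toy anchor below are my own illustration rather than code from the paper.

```python
import numpy as np
import cvxpy as cp

def phasemax(A, b, x_hat):
    """Real-valued PhaseMax sketch: maximize <x_hat, x> subject to |A x| <= b."""
    n = A.shape[1]
    x = cp.Variable(n)
    # Each constraint |a_i^T x| <= b_i splits into two linear inequalities,
    # so this is a linear program in the natural parameter space.
    constraints = [cp.abs(A @ x) <= b]
    prob = cp.Problem(cp.Maximize(x_hat @ x), constraints)
    prob.solve()
    return x.value

# Toy usage: recover a random x0 (up to global sign) from magnitude-only data.
np.random.seed(0)
n, m = 20, 120                          # ambient dimension, number of measurements
x0 = np.random.randn(n)
A = np.random.randn(m, n)               # i.i.d. Gaussian measurement vectors
b = np.abs(A @ x0)                      # phaseless measurements
x_hat = x0 + 0.5 * np.random.randn(n)   # crude stand-in for the spectral initializer
x_rec = phasemax(A, b, x_hat)
print(np.linalg.norm(x_rec - x0) / np.linalg.norm(x0))
```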
Compressed Sensing from Phaseless Gaussian Measurements via Linear Programming in the Natural Parameter Space by Paul Hand, Vladislav Voroninski
We consider faithfully combining phase retrieval with classical compressed sensing. Inspired by the recent novel formulation for phase retrieval called PhaseMax, we present and analyze SparsePhaseMax, a linear program for phaseless compressed sensing in the natural parameter space. We establish that when provided with an initialization that correlates with a k-sparse n-vector, SparsePhaseMax recovers this vector up to global sign with high probability from O(k log(n/k)) magnitude measurements against i.i.d. Gaussian random vectors. Our proof of this fact exploits a curious newfound connection between phaseless and 1-bit compressed sensing. This is the first result to establish bootstrapped compressed sensing from phaseless Gaussian measurements under optimal sample complexity.
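The abstract does not spell the program out, so the following is only a hedged sketch of what a SparsePhaseMax-style linear program could look like: the PhaseMax objective augmented with an l1 penalty to promote sparsity. The penalty parameter `lam`, the function name, and the CVXPY setup are illustrative assumptions, not the paper's code; the precise formulation and its guarantees are in the paper itself.

```python
import numpy as np
import cvxpy as cp

def sparse_phasemax(A, b, x_hat, lam=0.5):
    """Sketch of a sparse PhaseMax variant (assumed form):
    maximize <x_hat, x> - lam * ||x||_1  subject to  |A x| <= b."""
    n = A.shape[1]
    x = cp.Variable(n)
    objective = cp.Maximize(x_hat @ x - lam * cp.norm1(x))
    constraints = [cp.abs(A @ x) <= b]   # magnitude measurements, as in PhaseMax
    prob = cp.Problem(objective, constraints)
    prob.solve()
    return x.value

# Toy usage with a k-sparse signal; m is meant to scale like k log(n/k).
np.random.seed(1)
n, k, m = 200, 5, 80
x0 = np.zeros(n)
support = np.random.choice(n, k, replace=False)
x0[support] = np.random.randn(k)
A = np.random.randn(m, n)
b = np.abs(A @ x0)
x_hat = x0 + 0.1 * np.random.randn(n)    # stand-in for an anchor correlated with x0
x_rec = sparse_phasemax(A, b, x_hat)
print(np.linalg.norm(x_rec - x0) / np.linalg.norm(x0))
```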