
Friday, July 13, 2018

Phase Retrieval Under a Generative Prior


Vlad just sent me the following: 
Hi Igor,

I am writing regarding a paper you may find of interest, co-authored with Paul Hand and Oscar Leong. It applies a deep generative prior to phase retrieval, with surprisingly good results! We can show recovery occurs at optimal sample complexity for Gaussian measurements, which in a sense resolves the sparse phase retrieval O(k^2 log n) bottleneck.

https://arxiv.org/pdf/1807.04261.pdf


Best,

-Vlad

Thanks Vlad ! Here is the paper:

Phase Retrieval Under a Generative Prior by Paul Hand, Oscar Leong, Vladislav Voroninski
The phase retrieval problem asks to recover a natural signal y_0 ∈ R^n from m quadratic observations, where m is to be minimized. As is common in many imaging problems, natural signals are considered sparse with respect to a known basis, and the generic sparsity prior is enforced via ℓ_1 regularization. While successful in the realm of linear inverse problems, such ℓ_1 methods have encountered possibly fundamental limitations, as no computationally efficient algorithm for phase retrieval of a k-sparse signal has been proven to succeed with fewer than O(k^2 log n) generic measurements, exceeding the theoretical optimum of O(k log n). In this paper, we propose a novel framework for phase retrieval by 1) modeling natural signals as being in the range of a deep generative neural network G : R^k → R^n and 2) enforcing this prior directly by optimizing an empirical risk objective over the domain of the generator. Our formulation has provably favorable global geometry for gradient methods, as soon as m = O(k d^2 log n), where d is the depth of the network. Specifically, when suitable deterministic conditions on the generator and measurement matrix are met, we construct a descent direction for any point outside of a small neighborhood around the unique global minimizer and its negative multiple, and show that such conditions hold with high probability under Gaussian ensembles of multilayer fully-connected generator networks and measurement matrices. This formulation for structured phase retrieval thus has two advantages over sparsity-based methods: 1) deep generative priors can more tightly represent natural signals and 2) information-theoretically optimal sample complexity. We corroborate these results with experiments showing that exploiting generative models in phase retrieval tasks outperforms sparse phase retrieval methods.
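To get a feel for what "optimizing an empirical risk objective over the domain of the generator" means in practice, here is a minimal sketch. It is not the authors' code: it assumes a random fully-connected ReLU generator with Gaussian weights, a Gaussian measurement matrix, amplitude-only observations b = |A G(x_0)|, and an amplitude least-squares loss minimized over the latent code with PyTorch autograd. Names like make_generator are made up for the illustration.

# Illustrative sketch (not the paper's implementation): phase retrieval over
# the range of a random fully-connected ReLU generator, by gradient descent
# on an amplitude least-squares loss in the latent space.
import torch

torch.manual_seed(0)

k, n, m, depth = 10, 256, 80, 2      # latent dim, signal dim, measurements, layers

def make_generator(k, n, depth):
    """Random expansive ReLU network G: R^k -> R^n with i.i.d. Gaussian weights."""
    dims = [k] + [min(n, k * 4 ** (i + 1)) for i in range(depth - 1)] + [n]
    weights = [torch.randn(dims[i + 1], dims[i]) / dims[i + 1] ** 0.5
               for i in range(depth)]
    def G(x):
        for W in weights:
            x = torch.relu(W @ x)
        return x
    return G

G = make_generator(k, n, depth)
A = torch.randn(m, n) / m ** 0.5     # Gaussian measurement matrix

x_true = torch.randn(k)
y_true = G(x_true)
b = (A @ y_true).abs()               # phaseless (amplitude-only) observations

# Empirical risk over the latent code: f(x) = 1/2 * || |A G(x)| - b ||^2
x = torch.randn(k, requires_grad=True)
opt = torch.optim.Adam([x], lr=0.05)
for step in range(2000):
    opt.zero_grad()
    loss = 0.5 * ((A @ G(x)).abs() - b).pow(2).sum()
    loss.backward()
    opt.step()

rel_err = (G(x.detach()) - y_true).norm() / y_true.norm()
print(f"relative reconstruction error: {rel_err:.3e}")

Consistent with the abstract's caveat about a negative multiple of the global minimizer, a run like this can stall near a spurious latent point, so in practice one would restart from a few random initializations and keep the best reconstruction.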



Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
