Vlad just sent me the following:
Hi Igor,
I am writing regarding a paper you may find of interest, co-authored with Paul Hand and Oscar Leong. It applies a deep generative prior to phase retrieval, with surprisingly good results! We can show recovery occurs at optimal sample complexity for Gaussian measurements, which in a sense resolves the sparse phase retrieval O(k^2 log n) bottleneck.
https://arxiv.org/pdf/1807.04261.pdf
Best,
-Vlad
Thanks Vlad! Here is the paper:
Phase Retrieval Under a Generative Prior by Paul Hand, Oscar Leong, Vladislav Voroninski
The phase retrieval problem asks to recover a natural signal y0 ∈ R^n from m quadratic observations, where m is to be minimized. As is common in many imaging problems, natural signals are considered sparse with respect to a known basis, and the generic sparsity prior is enforced via ℓ1 regularization. While successful in the realm of linear inverse problems, such ℓ1 methods have encountered possibly fundamental limitations, as no computationally efficient algorithm for phase retrieval of a k-sparse signal has been proven to succeed with fewer than O(k^2 log n) generic measurements, exceeding the theoretical optimum of O(k log n). In this paper, we propose a novel framework for phase retrieval by 1) modeling natural signals as being in the range of a deep generative neural network G : R^k → R^n and 2) enforcing this prior directly by optimizing an empirical risk objective over the domain of the generator. Our formulation has provably favorable global geometry for gradient methods, as soon as m = O(k d^2 log n), where d is the depth of the network. Specifically, when suitable deterministic conditions on the generator and measurement matrix are met, we construct a descent direction for any point outside of a small neighborhood around the unique global minimizer and its negative multiple, and show that such conditions hold with high probability under Gaussian ensembles of multilayer fully-connected generator networks and measurement matrices. This formulation for structured phase retrieval thus has two advantages over sparsity based methods: 1) deep generative priors can more tightly represent natural signals and 2) information theoretically optimal sample complexity. We corroborate these results with experiments showing that exploiting generative models in phase retrieval tasks outperforms sparse phase retrieval methods.
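To make the setup concrete, here is a minimal sketch of the approach the abstract describes: phaseless measurements of a signal in the range of a random multilayer ReLU generator, recovered by plain gradient descent on an empirical risk objective over the latent space. The dimensions, weight scalings, step size, and the particular amplitude-based loss f(z) = ½ ‖ |A G(z)| − b ‖² are illustrative assumptions for this toy example, not the paper's exact experimental settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative): latent k, hidden width h, signal n, measurements m.
k, h, n, m = 5, 20, 40, 100

# Two-layer fully-connected ReLU generator with Gaussian weights, plus a
# Gaussian measurement matrix -- the random model the abstract analyzes.
# The 1/sqrt(fan_in) scaling is an assumption for this sketch.
W1 = rng.standard_normal((h, k)) / np.sqrt(k)
W2 = rng.standard_normal((n, h)) / np.sqrt(h)
A = rng.standard_normal((m, n)) / np.sqrt(m)

def relu(x):
    return np.maximum(x, 0.0)

def G(z):
    """Generator G : R^k -> R^n."""
    return relu(W2 @ relu(W1 @ z))

# Magnitude-only (phaseless) measurements of a signal in the generator's range.
z_true = rng.standard_normal(k)
b = np.abs(A @ G(z_true))

def loss_and_grad(z):
    """f(z) = 0.5 * || |A G(z)| - b ||^2, with the gradient by manual backprop."""
    h1 = relu(W1 @ z)
    u = relu(W2 @ h1)
    v = A @ u
    r = np.abs(v) - b
    f = 0.5 * (r @ r)
    dv = r * np.sign(v)            # back through | . |
    du = (A.T @ dv) * (u > 0)      # back through A and the outer ReLU
    dh1 = (W2.T @ du) * (h1 > 0)   # back through W2 and the inner ReLU
    dz = W1.T @ dh1
    return f, dz

# Gradient descent over the latent space from a random start.
z = rng.standard_normal(k)
step = 0.005
losses = []
for _ in range(2000):
    f, g = loss_and_grad(z)
    losses.append(f)
    z = z - step * g

print(f"empirical risk: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Note that the objective is invariant-ish under sign flips of the measurements, which is why the paper's landscape result allows a second critical region near a negative multiple of the true latent code; a toy run like this only illustrates that descending the empirical risk in z is straightforward to implement.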
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.