Thursday, November 21, 2013

Signal Recovery from $\ell_p$ Pooling Representations

Happy Thanksgiving y'all.

In Sunday Morning Insight: A Quick Panorama of Sensing from Direct Imaging to Machine Learning, we saw that the main difference between neural networks these days and traditional sensing revolves around the acquisition stage: it is linear in compressive sensing, while it is nonlinear in neural networks. At some point the two fields will collide (for one thing, imaging CMOS sensors are already nonlinear devices, and there are the issues of quantization and 1-bit compressive sensing lurking beneath). One question to ask is how the nonlinear acquisition stage permits some sort of recovery, which seems to be the question asked in the following paper (and there seems to be a clear connection to phase retrieval).

In this work we compute lower Lipschitz bounds of $\ell_p$ pooling operators for $p=1, 2, \infty$ as well as $\ell_p$ pooling operators preceded by half-rectification layers. These give sufficient conditions for the design of invertible neural network layers. Numerical experiments on MNIST and image patches confirm that pooling layers can be inverted with phase recovery algorithms. Moreover, the regularity of the inverse pooling, controlled by the lower Lipschitz constant, is empirically verified with a nearest neighbor regression.
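To make the objects in the abstract concrete, here is a minimal NumPy sketch of the $\ell_p$ pooling operator for $p = 1, 2, \infty$ over non-overlapping 1-D pools, together with the half-rectified variant the paper analyzes. The function names `lp_pool` and `relu_lp_pool` and the non-overlapping pooling geometry are illustrative assumptions, not the paper's code.

```python
import numpy as np

def lp_pool(x, p, pool_size):
    """lp pooling: the lp norm of each non-overlapping pool of length pool_size.
    Illustrative sketch; pool geometry is an assumption, not taken from the paper."""
    groups = np.asarray(x, dtype=float).reshape(-1, pool_size)
    if np.isinf(p):
        # l_infinity pooling reduces to max-pooling on absolute values
        return np.max(np.abs(groups), axis=1)
    return np.sum(np.abs(groups) ** p, axis=1) ** (1.0 / p)

def relu_lp_pool(x, p, pool_size):
    """Half-rectification (ReLU) followed by lp pooling, the layer composition
    whose lower Lipschitz bounds the paper studies."""
    return lp_pool(np.maximum(np.asarray(x, dtype=float), 0.0), p, pool_size)

# Example: x = [3, -4, 1, 0] with pools of size 2
x = np.array([3.0, -4.0, 1.0, 0.0])
print(lp_pool(x, 2, 2))        # l2 pooling: [5., 1.]
print(lp_pool(x, np.inf, 2))   # max of absolute values: [4., 1.]
print(relu_lp_pool(x, 1, 2))   # ReLU kills -4, then l1 pooling: [3., 1.]
```

Note that both maps discard sign (and, after rectification, negative-part) information, which is why inverting them is a phase-retrieval-like problem: the lower Lipschitz constants quantify how much of the input can still be stably recovered.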