Wednesday, December 21, 2011

Sparsity-based single-shot sub-wavelength coherent diffractive imaging

Thanks to Sylvain Gigan for the heads-up on this one. We have covered it before, but there seem to be additional findings (simpler solvers, ....)

We present the experimental reconstruction of sub-wavelength features from the far-field intensity of sparse optical objects: sparsity-based sub-wavelength imaging combined with phase-retrieval. As examples, we demonstrate the recovery of random and ordered arrangements of 100 nm features with the resolution of 30 nm, with an illuminating wavelength of 532 nm. Our algorithmic technique relies on minimizing the number of degrees of freedom; it works in real-time, requires no scanning, and can be implemented in all existing microscopes - optical and non-optical.
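The core idea in the abstract above is that a sparsity prior can make up for the phase lost when only far-field intensity (i.e., Fourier magnitude) is recorded. The following toy sketch is my own illustration, not the authors' algorithm: for a tiny binary object with a known number of spikes, a brute-force search over sparse supports already pins down the object from magnitude data alone, up to the trivial shift/flip ambiguities inherent to Fourier magnitudes. The paper's solvers are of course far more efficient; this only shows why restricting the degrees of freedom makes the inverse problem well-posed.

```python
import numpy as np
from itertools import combinations

N, k = 16, 3                      # toy object length and sparsity level
true_support = (2, 5, 11)         # hypothetical spike positions, unit amplitudes
x = np.zeros(N)
x[list(true_support)] = 1.0

# "far-field intensity" measurement: Fourier magnitudes only, phase is lost
mag = np.abs(np.fft.fft(x))

# brute-force search over all k-sparse binary objects: the sparsity prior
# shrinks the candidate set enough that magnitude data alone identifies the
# object (up to circular shift / reflection, which leave |FFT| unchanged)
best, best_err = None, np.inf
for support in combinations(range(N), k):
    cand = np.zeros(N)
    cand[list(support)] = 1.0
    err = np.linalg.norm(np.abs(np.fft.fft(cand)) - mag)
    if err < best_err:
        best, best_err = support, err

# rebuild the recovered object from the best support
rec = np.zeros(N)
rec[list(best)] = 1.0
```

Note that `best` may be a circular shift of `true_support` rather than the support itself: any shift of the object has identical Fourier magnitudes, so the search can only recover the object up to that equivalence class.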
Much could probably be gathered from the Supplementary Information section, but I cannot seem to locate it. However, I note the following from their paper:

.....It is in fact a universal scheme for recovering information beyond the cut-off of the response function of  a general system, relying only on a priori knowledge that the information is sparse in a known basis. As an exciting example, we have recently investigated the ability to utilize this method for recovering the actual shape of very short optical pulses measured by a slow detector [36]. Our preliminary theoretical and experimental results indicate, unequivocally, that our method offers an improvement by orders of magnitude beyond the most sophisticated deconvolution methods. 
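The slow-detector claim amounts to a sparse deconvolution problem: the detector smears a train of short pulses with a wide response function, and the prior that the pulse train is sparse lets one recover spike positions well below the detector's resolution. Here is a minimal sketch using orthogonal matching pursuit (OMP); the spike positions, amplitudes, and Gaussian detector response are my own illustrative choices, not taken from the paper.

```python
import numpy as np

N = 128
# hypothetical sparse "pulse train": three short pulses
x = np.zeros(N)
x[[20, 64, 100]] = [1.0, 0.7, 1.3]

# slow detector modeled as convolution with a wide Gaussian response
t = np.arange(-32, 33)
h = np.exp(-t**2 / (2 * 4.0**2))
y = np.convolve(x, h, mode="same")   # blurred measurement

# dictionary: column i is the detector's response to a spike at position i
D = np.column_stack(
    [np.convolve(np.eye(N)[:, i], h, mode="same") for i in range(N)]
)

# orthogonal matching pursuit: greedily pick the best-correlated atom,
# then re-fit all chosen atoms to the data by least squares
residual, chosen = y.copy(), []
for _ in range(3):
    chosen.append(int(np.argmax(np.abs(D.T @ residual))))
    coef, *_ = np.linalg.lstsq(D[:, chosen], y, rcond=None)
    residual = y - D[:, chosen] @ coef
```

With well-separated spikes, `chosen` lands on the exact positions {20, 64, 100} even though the detector response is many samples wide; closely spaced spikes would require the more careful solvers the authors allude to.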
Looks like FROG may have a competitor. If you recall, FROG tries to perform a certain kind of convolution/deconvolution to get around the measurement paradox that Rick Trebino describes:

In order to measure an event in time, you must use a shorter one. But then, to measure the shorter event, you must use an even shorter one. And so on. So, now, how do you measure the shortest event ever created?

More can be found on this hardware-based convolution/deconvolution solution in the following entries:
While the methods may look like they are competing, it would be interesting to find out whether FROG theoretically provides additional constraints, thereby allowing the recovery of less-than-sparse signals.

An earlier paper from that group (and behind a paywall):

We demonstrate experimentally pulse-shape reconstruction at resolution that significantly exceeds the photodiode inherent resolution limit. The knowledge that pulses are inherently sparse enables us to retrieve data that is otherwise hidden in the noise.
