Time-Resolved Image Demixing by Ayush Bhandari, Aurélien Bourquard, Shahram Izadi, and Ramesh Raskar
When multiple light paths combine at a given location on an image sensor, an image mixture is created. Demixing, or recovering the original constituent components, is in such cases a highly ill-posed problem. A number of elegant solutions have thus been developed in the literature, relying on measurement diversity such as polarization, shift, motion, or scene features. In this paper, we approach the image-mixing problem as a time-resolved phenomenon: if every photon arriving at the sensor could be time-stamped, the demixing problem would amount to separating transient events in time. Based on this idea, we first show that, while acquiring such measurements is prohibitive and challenging in the time domain, the task is surprisingly straightforward in the frequency domain. We then establish a link between frequency-domain measurements and consumer time-of-flight (ToF) imaging. Finally, we propose a demixing algorithm that relies only on magnitude information from the ToF sensor. We show that our problem is closely tied to phase retrieval and that, for a K-image mixture, (K² - K)/2 + 1 magnitude-only ToF measurements suffice to demix the images exactly in noiseless settings. Our developments are corroborated by experiments on synthetic data and on ToF data acquired with the Microsoft Kinect sensor.
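To get a feel for the measurement count quoted in the abstract, here is a minimal Python sketch that simulates magnitude-only, frequency-domain ToF measurements of a K-path mixture at a single pixel. The measurement model (a magnitude of a sum of K complex exponentials at each modulation frequency) is the standard continuous-wave ToF form and is my assumption; the amplitudes, delays, and frequencies are illustrative, and the sketch only generates the forward measurements, not the paper's phase-retrieval-based demixing algorithm.

```python
import numpy as np

K = 2                          # number of mixed image components (illustrative)
M = (K**2 - K) // 2 + 1        # measurement count the abstract says suffices (noiseless case)

rng = np.random.default_rng(0)
amplitudes = rng.uniform(0.2, 1.0, size=K)           # per-path intensities at one pixel (assumed)
delays = np.sort(rng.uniform(1e-9, 10e-9, size=K))   # path travel times in seconds (assumed)

# M distinct modulation frequencies, in the tens-of-MHz range typical of consumer ToF cameras.
freqs = np.linspace(20e6, 120e6, M)

# Magnitude-only ToF measurement at each modulation frequency:
# |sum_k amplitude_k * exp(-j * 2*pi * f * delay_k)|
measurements = np.array([
    np.abs(np.sum(amplitudes * np.exp(-2j * np.pi * f * delays)))
    for f in freqs
])

print(f"K = {K} paths -> {M} magnitude-only measurements")
print("simulated |ToF response| per frequency:", np.round(measurements, 4))
```

For K = 2 this gives 2 measurements, and for K = 3 it gives 4, which is the scaling the abstract claims is sufficient for exact demixing in the noiseless setting.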
The attendant website is here.
h/t Laurent