
Friday, August 09, 2013

SAHD: Gigapixel! - Michael Gehm

From the SAHD workshop:

Michael Gehm - Gigapixel!, Slides. This time, instead of a commercial 360° panorama like the one made at SPARS11, a picture of the meeting's participants was taken with the Gigapixel Camera (630 MB).


From the Nature paper (see below), I note the following:

 Ubiquitous gigapixel cameras may transform the central challenge of photography from the question of where to point the camera to that of how to mine the data.
And from his slides, I note the increased complexity of PSF engineering at the gigapixel level, the need for exact knowledge of the camera location for calibration purposes, the use of Maximum Likelihood for the stitching part, and the Map/Reduce route taken by the group (sketched below). I wonder if a GraphLab solution might be even more interesting.
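Purely as an illustration of that Map/Reduce route, and not the group's actual pipeline, here is a minimal sketch of stitching microcamera tiles onto a common canvas, with inverse-variance weights standing in for the maximum-likelihood combination; the tile shapes, offsets and noise levels are all made up:

# Minimal Map/Reduce-style sketch of stitching microcamera tiles onto a
# common canvas. All names, the tile layout, and the inverse-variance
# (maximum-likelihood) weighting are illustrative assumptions, not the
# AWARE-2 pipeline.
from functools import reduce
import numpy as np

CANVAS = (200, 300)  # hypothetical global mosaic size (rows, cols)

def map_tile(tile, offset, noise_var):
    """Map step: place one microcamera tile (and its weight) on the canvas."""
    acc = np.zeros(CANVAS)
    wgt = np.zeros(CANVAS)
    r, c = offset
    h, w = tile.shape
    w_tile = 1.0 / noise_var          # inverse-variance (ML) weight
    acc[r:r+h, c:c+w] = w_tile * tile
    wgt[r:r+h, c:c+w] = w_tile
    return acc, wgt

def reduce_tiles(a, b):
    """Reduce step: weighted sums add; overlaps combine automatically."""
    return a[0] + b[0], a[1] + b[1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # three fake, overlapping tiles with different noise levels
    tiles = [(rng.normal(1.0, s, (100, 120)), off, s**2)
             for s, off in [(0.1, (0, 0)), (0.2, (0, 100)), (0.15, (100, 50))]]
    acc, wgt = reduce(reduce_tiles, (map_tile(*t) for t in tiles))
    mosaic = np.divide(acc, wgt, out=np.zeros_like(acc), where=wgt > 0)
    print(mosaic.shape, float(mosaic.max()))

The point of the map/reduce split is that each tile can be mapped independently (and in parallel), while the reduce step only ever adds weighted sums, so overlap regions end up as the weighted average without any special casing.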




Related:

Multiscale gigapixel photography by D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera and S. D. Feller
Pixel count is the ratio of the solid angle within a camera’s field of view to the solid angle covered by a single detector element. Because the size of the smallest resolvable pixel is proportional to aperture diameter and the maximum field of view is scale independent, the diffraction-limited pixel count is proportional to aperture area. At present, digital cameras operate near the fundamental limit of 1–10 megapixels for millimetre-scale apertures, but few approach the corresponding limits of 1–100 gigapixels for centimetre-scale apertures. Barriers to high-pixel-count imaging include scale-dependent geometric aberrations, the cost and complexity of gigapixel sensor arrays, and the computational and communications challenge of gigapixel image management. Here we describe the AWARE-2 camera, which uses a 16-mm entrance aperture to capture snapshot, one-gigapixel images at three frames per minute. AWARE-2 uses a parallel array of microcameras to reduce the problems of gigapixel imaging to those of megapixel imaging, which are more tractable. In cameras of conventional design, lens speed and field of view decrease as lens scale increases [1], but with the experimental system described here we confirm previous theoretical results [2-6] suggesting that lens speed and field of view can be scale independent in microcamera-based imagers resolving up to 50 gigapixels. Ubiquitous gigapixel cameras may transform the central challenge of photography from the question of where to point the camera to that of how to mine the data.
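Just to check the scaling in that abstract, here is a quick back-of-the-envelope estimate of the diffraction-limited pixel count; the wavelength, the Rayleigh factor and the assumed field of view are my own illustrative inputs, not values from the paper:

# Back-of-the-envelope check of the abstract's scaling: the diffraction-
# limited pixel count grows with aperture area. The wavelength, the Rayleigh
# factor and the assumed field of view are illustrative inputs, not values
# taken from the paper.
def diffraction_limited_pixels(aperture_m, fov_sr, wavelength_m=550e-9):
    """Pixel count ~ FOV solid angle / solid angle of one resolvable spot."""
    theta = 1.22 * wavelength_m / aperture_m   # Rayleigh resolvable angle (rad)
    return fov_sr / theta**2

for d_mm in (1, 16, 100):
    n = diffraction_limited_pixels(d_mm * 1e-3, fov_sr=2.0)  # ~2 sr wide FOV assumed
    print(f"{d_mm:>4} mm aperture -> ~{n/1e9:.3f} gigapixels")

With these inputs a millimetre-scale aperture lands in the megapixel range, a 16-mm aperture comes out near one gigapixel, and a 10-cm aperture in the tens of gigapixels, consistent with the limits quoted in the abstract.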

but also previously:

Thank you to the organizers of SAHD: David Brady, Robert Calderbank, Lawrence Carin, Ingrid Daubechies, David Dunson, Mauro Maggioni, Sayan Mukherjee, Guillermo Sapiro and Rebecca Willett for making these videos available.


Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
