
Friday, August 15, 2014

Data Driven Sensor Design: Learning to be a Depth Camera for Close-Range Human Capture and Interaction

Here is an instance of Data Driven Sensor Design or Zero Knowledge Sensor Design. Let us remind our readers what Data Driven Sensor Design is:

  1. Pick a sensor, 
  2. Discover its transfer function with supervised learning (see the sketch after this list), 
  3. You're done.
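To make step 2 concrete, here is a toy sketch in Python (scikit-learn): record raw sensor readings alongside ground truth from a reference instrument, fit a generic regressor, and the learned model then plays the role of the sensor's inverse transfer function. The sensor model, the data and the forest settings below are illustrative assumptions, not the authors' pipeline.

# Toy sketch of step 2: "discover the transfer function with supervised learning".
# The sensor model and data are hypothetical, just to have something to learn.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend we recorded raw sensor readings alongside ground truth from a reference
# instrument (e.g. a calibrated depth camera), with a nonlinear, noisy response.
ground_truth = rng.uniform(0.2, 1.0, size=5000)                   # metres (hypothetical)
raw_reading = 1.0 / ground_truth**2 + rng.normal(0, 0.05, 5000)   # arbitrary units

X = raw_reading.reshape(-1, 1)   # the raw sensor output is the only feature
y = ground_truth                 # supervision from the reference instrument

# Step 2: fit a generic regressor; the learned model is the inverse transfer function.
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Step 3: "you're done" -- new raw readings map straight to calibrated values.
print(model.predict(np.array([[4.0], [9.0]])))  # estimated depth in metres

Swapping the random forest for any other regressor would not change the recipe; the point of the three steps is that the calibration effort moves entirely into collecting training pairs.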



Learning to be a Depth Camera for Close-Range Human Capture and Interaction by Sean Fanello, Cem Keskin, Shahram Izadi, Pushmeet Kohli, David Kim, David Sweeney, Antonio Criminisi, Jamie Shotton, Sing Bing Kang, and Tim Paek


We present a machine learning technique for estimating absolute, per-pixel depth using any conventional monocular 2D camera, with minor hardware modifications. Our approach targets close-range human capture and interaction where dense 3D estimation of hands and faces is desired. We use hybrid classification-regression forests to learn how to map from near infrared intensity images to absolute, metric depth in real-time. We demonstrate a variety of human-computer interaction and capture scenarios. Experiments show an accuracy that outperforms a conventional light fall-off baseline, and is comparable to high-quality consumer depth cameras, but with a dramatically reduced cost, power consumption, and form-factor.
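The abstract names two ingredients, a hybrid classification-regression forest and a light fall-off baseline. Here is a minimal per-pixel sketch of both, assuming synthetic intensities generated under an inverse-square fall-off with an unknown per-pixel reflectance; plain scikit-learn forests on a single intensity feature stand in for the paper's forests over NIR image patches, and the bin count and calibration constant are made up for illustration.

# Minimal per-pixel sketch: a coarse depth-bin classifier followed by per-bin
# regressors ("hybrid classification-regression"), compared with a simple
# inverse-square light fall-off baseline. All numbers here are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(1)

# Fake per-pixel NIR intensities: brightness falls off with the square of depth,
# modulated by an unknown per-pixel reflectance (albedo, surface angle, etc.).
depth = rng.uniform(0.2, 0.8, size=20000)                  # metres (hypothetical)
albedo = rng.uniform(0.5, 1.0, size=20000)                 # unknown nuisance factor
intensity = albedo / depth**2 + rng.normal(0, 0.1, 20000)

X = intensity.reshape(-1, 1)

# Stage 1: classify each pixel into a coarse depth bin.
n_bins = 4
bins = np.linspace(depth.min(), depth.max(), n_bins + 1)
labels = np.clip(np.digitize(depth, bins) - 1, 0, n_bins - 1)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# Stage 2: within each bin, regress absolute metric depth.
regs = {b: RandomForestRegressor(n_estimators=50, random_state=0)
           .fit(X[labels == b], depth[labels == b]) for b in range(n_bins)}

def forest_depth(x):
    b = clf.predict(x)
    return np.array([regs[bi].predict(xi.reshape(1, -1))[0] for bi, xi in zip(b, x)])

def falloff_depth(x, k=0.75):
    # Conventional light fall-off baseline: assume I ~ k / d^2, hence d ~ sqrt(k / I),
    # with k a single calibration constant (here simply the mean albedo).
    return np.sqrt(k / np.clip(x[:, 0], 1e-6, None))

test_idx = rng.choice(len(depth), 1000, replace=False)
for name, pred in [("forest", forest_depth(X[test_idx])),
                   ("fall-off", falloff_depth(X[test_idx]))]:
    print(name, "mean abs error (m):", np.abs(pred - depth[test_idx]).mean())

Even in this toy setup the learned model should come out ahead, simply because the fall-off baseline cannot separate reflectance from distance, which is roughly the intuition behind the comparison in the abstract.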





Of note, this site: http://www.eigenimaging.com/DIY/NexusDYI describes how to modify a smartphone camera module for Near Infrared (NIR) imaging, such as night vision and multispectral imaging applications.


