Monday, June 01, 2015

Hardware for Machine Learning

In the same way I have featured hardware designed specifically to implement compressive sensing, I am starting a new tag for hardware focused on performing machine learning computations. The tag will be MLHardware.

I am not sure exactly what area of knowledge this maps to, as it can be pretty large, but my focus will be on new technologies that make machine learning a first-class citizen, in the same way Matlab made matrices first-class citizens. The use of graphics cards will surely be part of that tag, but so will any improvement in quantum computers, stochastic hardware, probabilistic computing and more. I also welcome any information on meetings focused on the subject. From following Eric Jonas' page, I stumbled upon these two interesting papers:

The brain interprets ambiguous sensory information faster and more reliably than modern computers, using neurons that are slower and less reliable than logic gates. But Bayesian inference, which underpins many computational models of perception and cognition, appears computationally challenging even given modern transistor speeds and energy budgets. The computational principles and structures needed to narrow this gap are unknown. Here we show how to build fast Bayesian computing machines using intentionally stochastic, digital parts, narrowing this efficiency gap by multiple orders of magnitude. We find that by connecting stochastic digital components according to simple mathematical rules, one can build massively parallel, low precision circuits that solve Bayesian inference problems and are compatible with the Poisson firing statistics of cortical neurons. We evaluate circuits for depth and motion perception, perceptual learning and causal reasoning, each performing inference over 10,000+ latent variables in real time - a 1,000x speed advantage over commodity microprocessors. These results suggest a new role for randomness in the engineering and reverse-engineering of intelligent computation.
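The abstract above hinges on gates that emit samples rather than deterministic outputs. As a rough software sketch of what such an "intentionally stochastic, digital part" computes, and of how chaining two of them samples from a joint distribution, consider the following (the function name, distributions, and dict-based representation are my own illustration, not the paper's actual circuit design):

```python
import random

def stochastic_gate(p_out_given_in, x, rng=random):
    """Sample out ~ P(out | in = x), with P given as a dict of dicts.

    A deterministic logic gate maps an input to one output; a stochastic
    gate maps an input to a *draw* from a conditional distribution.
    """
    r = rng.random()
    acc = 0.0
    for out, p in p_out_given_in[x].items():
        acc += p
        if r < acc:
            return out
    return out  # guard against floating-point round-off in the last bin

# Chaining gates samples from the joint P(a) * P(b | a):
p_a = {None: {0: 0.5, 1: 0.5}}            # a fair "coin" gate, no input
p_b_given_a = {0: {0: 0.9, 1: 0.1},       # b tends to copy a
               1: {0: 0.1, 1: 0.9}}

a = stochastic_gate(p_a, None)
b = stochastic_gate(p_b_given_a, a)
```

Because each gate only needs local randomness and its own conditional table, many such gates can run side by side, which is the intuition behind the massive parallelism the abstract claims.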

Stochastic Digital Circuits for Probabilistic Inference by Vikash Mansinghka, Eric Jonas, Josh Tenenbaum

We introduce combinational stochastic logic, an abstraction that generalizes deterministic digital circuit design (based on Boolean logic gates) to the probabilistic setting. We show how this logic can be combined with techniques from contemporary digital design to generate stateless and stateful circuits for exact and approximate sampling from a range of probability distributions. We focus on Markov chain Monte Carlo algorithms for Markov random fields, using massively parallel circuits. We implement these circuits on commodity reconfigurable logic and estimate the resulting performance in time, space and price. Using our approach, these simple and general algorithms could be affordably run for thousands of iterations on models with hundreds of thousands of variables in real time.
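To make "massively parallel MCMC on Markov random fields" concrete, here is a hedged software sketch of checkerboard Gibbs sampling on a small Ising-type model. The model, parameter names, and update schedule are my own illustration of the general technique, not the paper's actual circuits; the point is that sites of one checkerboard colour are conditionally independent given the other colour, so each could be updated by its own stochastic circuit simultaneously.

```python
import math
import random

def gibbs_sweep(grid, coupling, rng=random):
    """One checkerboard sweep of Gibbs updates on a 2D Ising-style MRF.

    Each site is resampled from its conditional distribution given its
    neighbours; all sites of one colour could be updated in parallel in
    hardware, but here we simply loop over them.
    """
    n = len(grid)
    for colour in (0, 1):
        for i in range(n):
            for j in range(n):
                if (i + j) % 2 != colour:
                    continue
                # Local field: sum of the (up to four) neighbouring spins.
                field = sum(grid[a][b]
                            for a, b in ((i - 1, j), (i + 1, j),
                                         (i, j - 1), (i, j + 1))
                            if 0 <= a < n and 0 <= b < n)
                p_up = 1.0 / (1.0 + math.exp(-2.0 * coupling * field))
                grid[i][j] = 1 if rng.random() < p_up else -1
    return grid

# Usage: start from random spins and sweep; with a positive coupling the
# sampler drifts toward aligned, low-energy configurations.
random.seed(0)
n = 16
grid = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
for _ in range(100):
    gibbs_sweep(grid, coupling=1.0)
magnetization = sum(map(sum, grid)) / (n * n)
```

Each update only reads a site's neighbours and a local random bit, which is exactly the kind of local, low-precision computation that maps well onto reconfigurable logic.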

Date: 08 May 2015
Satellite: Rosetta
Depicts: Comet 67P/Churyumov-Gerasimenko
Copyright: ESA/Rosetta/NAVCAM, CC BY-SA IGO 3.0
