The theoretical setting of hierarchical Bayesian inference is gaining acceptance as a framework for understanding cortical computation. In this paper, we describe how Bayesian belief propagation in a spatio-temporal hierarchical model, called Hierarchical Temporal Memory (HTM), can lead to a mathematical model for cortical circuits. An HTM node is abstracted using a coincidence detector and a mixture of Markov chains. Bayesian belief propagation equations for such an HTM node define a set of functional constraints for a neuronal implementation. Anatomical data provide a contrasting set of organizational constraints. The combination of these two constraints suggests a theoretically derived interpretation for many anatomical and physiological features and predicts several others. We describe the pattern recognition capabilities of HTM networks and demonstrate the application of the derived circuits for modeling the subjective contour effect. We also discuss how the theory and the circuit can be extended to explain cortical features that are not explained by the current model and describe testable predictions that can be derived from the model.
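For intuition only, here is a loose numerical sketch of the bottom-up pass that such an abstraction implies: coincidence likelihoods computed from the children's messages, then pooled over Markov-chain groups. The function names, the max-pooling and the normalization are my assumptions, not the paper's exact belief propagation equations.

```python
import numpy as np

# Loose sketch of the bottom-up (feed-forward) pass of an HTM-like node,
# assuming: each coincidence is a tuple with one index per child node, and
# each Markov-chain "group" pools a subset of coincidences. Names and the
# pooling rule are illustrative, not the paper's notation.

def feedforward(child_msgs, coincidences, groups):
    """child_msgs: list of 1-D arrays, one per child (bottom-up messages).
    coincidences: (n_coinc, n_children) int array of child-output indices.
    groups: list of index arrays, one per Markov chain / group.
    Returns likelihoods over coincidences and a pooled message over groups."""
    # Likelihood of each coincidence = product of the child messages it selects.
    y = np.ones(len(coincidences))
    for child, msg in enumerate(child_msgs):
        y *= msg[coincidences[:, child]]
    # Pool within each group (here with max; the full model weights coincidences
    # by the temporal transition probabilities of the Markov chain).
    out = np.array([y[idx].max() for idx in groups])
    return y, out / (out.sum() + 1e-12)  # normalized bottom-up message to parent
```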
You can create your own vision experiment or try, for free, some of the Vision Demos at Numenta, the company created to support this model. I also note from the paper:
In the case of a simplified generative model, an HTM node remembers all the coincidence patterns that are generated by the generative model. In real world cases, where it is not possible to store all coincidences encountered during learning, we have found that storing a fixed number of a random selection of the coincidence patterns is sufficient as long as we allow multiple coincidence patterns to be active at the same time. Motivation for this method came from the field of compressed sensing [20]. The HMAX model of visual cortex [21] and some versions of convolutional neural networks [22] also use this strategy. We have found that reasonable results can be achieved with a wide range of the number of coincidences stored.

Compressed Sensing and HMAX in the same sentence, uhh..., this echoes some of the observations made a while back here (a small sketch of that storage strategy follows the links):
- Compressed Sensing in the Primary Visual Cortex ?
- Compressed Sensing, Primary Visual Cortex, Dimensionality Reduction, Manifolds and Autism
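A minimal sketch of that storage strategy, assuming reservoir sampling for the random selection, a Euclidean similarity, and a top-k activation rule; all of these details are my choices, not the paper's.

```python
import numpy as np

# Minimal sketch of the memorization strategy quoted above: keep only a fixed
# number of randomly selected coincidence patterns seen during learning, and at
# inference let several stored patterns be active at once (here: the k closest,
# weighted by similarity).

class RandomCoincidenceMemory:
    def __init__(self, max_patterns=200, k_active=5, seed=0):
        self.max_patterns = max_patterns
        self.k_active = k_active
        self.rng = np.random.default_rng(seed)
        self.patterns = []  # the fixed-size random subset of coincidences
        self.n_seen = 0

    def learn(self, pattern):
        # Reservoir sampling keeps a uniform random subset of all patterns seen.
        self.n_seen += 1
        if len(self.patterns) < self.max_patterns:
            self.patterns.append(np.asarray(pattern, float))
        else:
            j = self.rng.integers(self.n_seen)
            if j < self.max_patterns:
                self.patterns[j] = np.asarray(pattern, float)

    def activate(self, pattern):
        # Several coincidences active at once: similarity-weighted top-k.
        P = np.stack(self.patterns)
        d = np.linalg.norm(P - np.asarray(pattern, float), axis=1)
        act = np.zeros(len(P))
        top = np.argsort(d)[: self.k_active]
        act[top] = 1.0 / (1.0 + d[top])
        return act / act.sum()
```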
By the way, I still own a Handspring Visor, a machine created by Jeff Hawkins. Let us hope that this software breaks into a new market the way the Visor did in its time.
Here is a video of Jeff at TED back in 2003. When are we going to have a speaker at TED on Compressed Sensing proper? Hey, I volunteer for any one of the tech featured in These Technologies Do Not Exist.

Photo: Lincoln from afar and close by, Dali Museum in Rosas.
You might want to check out a March 2010 video from Jeff Hawkins in which he describes a brand-new algorithm that Numenta is developing for emulating brain learning, one that is much more biologically grounded. It uses fixed-sparsity distributed representations for learning (a toy illustration of that idea follows the link below):
http://www.youtube.com/watch?v=TDzr0_fbnVk
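For what it is worth, here is a toy illustration of a fixed-sparsity distributed representation (k-winners-take-all over a random projection); this is only a caricature of the idea mentioned in the video, not Numenta's actual algorithm.

```python
import numpy as np

# Toy fixed-sparsity distributed representation: whatever the input, exactly
# k of n units are active (k-winners-take-all over a random projection).

def sparse_code(x, weights, k=40):
    """x: input vector; weights: (n_units, len(x)) random projection;
    returns a binary code with exactly k active units."""
    overlap = weights @ x
    code = np.zeros(weights.shape[0], dtype=int)
    code[np.argsort(overlap)[-k:]] = 1
    return code

rng = np.random.default_rng(0)
W = rng.standard_normal((2048, 64))
x = rng.standard_normal(64)
print(sparse_code(x, W).sum())  # always k = 40 active bits, regardless of x
```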