Nonlinear Dimensionality Reduction of Data by Deep Distributed Random Samplings by Xiao-Lei Zhang

It is believed that if machines can learn human-level invariant semantic concepts from highly variant real-world data, true artificial intelligence will be within reach. If we encode data and concepts compactly, for example as 0s and 1s, this golden pathway becomes a problem of reducing high-dimensional codes to low-dimensional representations. Dimensionality reduction, a fundamental problem of machine learning---one of the central fields of artificial intelligence---has been intensively studied; classification and clustering are two special cases that reduce high-dimensional data to discrete points. Multilayer neural networks have demonstrated great power at dimensionality reduction and triggered the recent breakthroughs in artificial intelligence. However, current multilayer neural networks are limited to large-scale problems, and their training methods are so far complicated and time-consuming. Here we describe a simple multilayer network for dimensionality reduction in which each layer is a group of mutually independent k-centers clusterings. We find that the network can be trained successfully layer by layer simply by assigning the centers of each clustering to data points sampled at random from its input. Our results show that this simple method outperformed 7 well-known dimensionality reduction methods on both very small-scale biomedical data and large-scale image and document data, with much less training time than multilayer neural networks on large-scale data. Our findings imply that, if properly designed, a very simple training method can make multilayer networks work well on a wide range of data. Furthermore, given the broad usefulness of simple methods, the described method, which can be understood without domain knowledge, should find use in many branches of science.
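The abstract gives only the outline of the method: each layer is a group of mutually independent k-centers clusterings, each clustering's centers are data points sampled at random from the layer's input, and layers are trained one after another on the previous layer's output. A minimal sketch of that idea, under the assumption that each clustering emits a one-hot nearest-center indicator and the indicators are concatenated into a sparse binary code (all function names and parameters here are my own, not the paper's):

```python
import numpy as np

def train_layer(X, num_clusterings, k, rng):
    """One layer: a group of mutually independent k-centers clusterings,
    each 'trained' by sampling k input points at random as its centers."""
    return [X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(num_clusterings)]

def encode_layer(X, layer):
    """Encode each input as the concatenation of one-hot nearest-center
    indicators, one per clustering, giving a sparse binary code."""
    codes = []
    for centers in layer:
        # Squared Euclidean distance from every point to every center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        onehot = np.zeros((len(X), len(centers)))
        onehot[np.arange(len(X)), d.argmin(axis=1)] = 1.0
        codes.append(onehot)
    return np.hstack(codes)

def deep_random_sampling(X, depths=((10, 4), (10, 4)), seed=0):
    """Stack layers; each layer is trained on the previous layer's output.
    `depths` gives (num_clusterings, k) per layer -- an assumed interface."""
    rng = np.random.default_rng(seed)
    H = X
    for num_clusterings, k in depths:
        layer = train_layer(H, num_clusterings, k, rng)
        H = encode_layer(H, layer)
    return H
```

With the default two layers of 10 clusterings of 4 centers each, 100 input points in 5 dimensions come out as a 100 x 40 binary matrix in which each row has exactly one active unit per clustering. The paper's actual design choices (output sparsity, how the final low-dimensional representation is read out) are not stated in the abstract, so treat this only as an illustration of the layer-by-layer random-sampling idea.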

An attendant implementation is on Xiao-Lei's page.

**Join the CompressiveSensing subreddit or the Google+ Community and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
