Entropy-Scaling Search of Massive Biological Data by Y. William Yu, Noah M. Daniels, David Christian Danko, Bonnie Berger

## Highlights

- •We describe entropy-scaling search for finding approximate matches in a database
- •Search complexity is bounded in time and space by the entropy of the database
- •We make tools that enable search of three largely intractable real-world databases
- •The tools dramatically accelerate metagenomic, chemical, and protein structure search

## Summary

Many datasets exhibit a well-defined structure that can be exploited to design faster search tools, but it is not always clear when such acceleration is possible. Here, we introduce a framework for similarity search based on characterizing a dataset's entropy and fractal dimension. We prove that searching scales in time with metric entropy (number of covering hyperspheres), if the fractal dimension of the dataset is low, and scales in space with the sum of metric entropy and information-theoretic entropy (randomness of the data). Using these ideas, we present accelerated versions of standard tools, with no loss in specificity and little loss in sensitivity, for use in three domains—high-throughput drug screening (Ammolite, 150× speedup), metagenomics (MICA, 3.5× speedup of DIAMOND [3,700× BLASTX]), and protein structure search (esFragBag, 10× speedup of FragBag). Our framework can be used to achieve "compressive omics," and the general theory can be readily applied to data science problems outside of biology (source code: http://gems.csail.mit.edu).
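To make the idea concrete, here is a minimal sketch of the coarse-to-fine search strategy the summary describes: cover the database with hyperspheres of a fixed radius, search only the sphere centers first, then scan members of the surviving spheres. The function names and the toy scalar metric are illustrative assumptions, not the paper's actual API; the triangle-inequality pruning is the essential point.

```python
# Hedged sketch of entropy-scaling similarity search (coarse-to-fine).
# The metric d and all function names are illustrative, not the paper's API.

def d(a, b):
    """Toy metric: absolute difference on scalars (stand-in for any metric)."""
    return abs(a - b)

def build_covering(database, cluster_radius):
    """Greedily cover the database with hyperspheres of radius cluster_radius.
    Returns a list of (center, members) pairs; the number of spheres is the
    metric-entropy term that bounds the coarse-search time."""
    clusters = []
    for x in database:
        for center, members in clusters:
            if d(x, center) <= cluster_radius:
                members.append(x)
                break
        else:
            clusters.append((x, [x]))
    return clusters

def entropy_scaling_search(clusters, query, radius, cluster_radius):
    """Coarse stage: keep clusters whose center lies within
    radius + cluster_radius of the query (by the triangle inequality,
    no true hit can be missed). Fine stage: scan only those members."""
    hits = []
    for center, members in clusters:
        if d(query, center) <= radius + cluster_radius:
            hits.extend(x for x in members if d(query, x) <= radius)
    return hits
```

Because any point x with d(query, x) ≤ radius sits in a cluster whose center satisfies d(query, center) ≤ radius + cluster_radius, the coarse stage discards clusters without losing true matches, which is why specificity is preserved.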

The implementation is available at http://gems.csail.mit.edu.


