I am not an expert in quantum computing and how it will change the world. However, here are some items that might deserve special attention from you, the readers, considering some of the themes and results witnessed here on Nuit Blanche.
First item of interest: you might recall this L0 regularization using an adiabatic quantum computing algorithm as a replacement for current heuristic algorithms (see here). While Random Kitchen Sinks (RKS) already seem to do better than AdaBoost, yesterday's post showed another "heuristic" that seems to be on its way to doing even better than RKS in terms of speed. Heuristics may look bad because they have no theoretical grounding, until you find out there is a good reason they work. The field of compressive sensing took off in 2004 only because some heuristics were finally understood in some specific cases. That comforted many researchers into thinking it was OK to investigate further without an absolute theoretical grounding. Since then, many empirical results no longer need the stamp of approval they once needed to get through peer review. People have gotten over some of those fear-of-rejection issues.
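For readers who have not seen Random Kitchen Sinks before, here is a minimal sketch of the idea, assuming the RBF kernel exp(-gamma * ||x - y||^2); the function name and parameters are illustrative, not from any particular paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_kitchen_sinks(X, n_features=500, gamma=1.0, rng=rng):
    """Map X (n_samples, d) to random cosine features whose inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density: w ~ N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    # Random phases b ~ Uniform(0, 2*pi)
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Sanity check: feature inner products approximate the exact kernel matrix.
X = rng.normal(size=(50, 5))
Z = random_kitchen_sinks(X, n_features=4000)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-1.0 * sq_dists)
```

The point of the construction is speed: after the random map, a plain linear model trained on `Z` behaves like a kernel machine, at a fraction of the cost.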
Second item of interest: this arXiv paper, which came out last month, features an O(sqrt(k) log(k)) quantum algorithm for the combinatorial group testing problem, which some could see as a subset of compressive sensing. It is indeed an improvement over the classical O(k log(N/k)), but is it worth a change in technology? The answer is no. While these approaches seem very interesting, I currently fail to see how a combination of concentration-of-measure/randomization results and silicon can be surpassed within, at least, the next twenty years.
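To make the classical baseline concrete, here is a minimal sketch of nonadaptive group testing with random pools and COMP decoding; the parameters and the generous test budget are illustrative toy choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy instance (illustrative): N items, k of which are defective.
N, k = 256, 4
T = 120  # a generous multiple of the classical O(k log(N/k)) test budget

# Random Bernoulli pooling design: each item joins each pool w.p. 1/k.
A = rng.random((T, N)) < 1.0 / k

# Ground truth: k defective items, unknown to the decoder.
defectives = rng.choice(N, size=k, replace=False)
x = np.zeros(N, dtype=bool)
x[defectives] = True

# A pooled test comes back positive iff it contains at least one defective.
y = (A.astype(int) @ x.astype(int)) > 0

# COMP decoding: any item appearing in a negative test is non-defective;
# every remaining item is declared defective.
cleared = A[~y].any(axis=0)
estimate = ~cleared
```

COMP never misses a true defective (a defective can never sit in a negative test), and with enough tests the false positives vanish with high probability; that simplicity on commodity silicon is the bar any quantum proposal has to clear.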
Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.