Sometimes, different words mean the same thing.
Sometimes, we don't know if the big O notation means the thing we think it means :-)
- From the internets, I found the following abstract from the upcoming SODA conference: *Lower Bounds for Sparse Recovery* by Khanh Do Ba, Piotr Indyk, and Eric Price. The abstract reads:
Over the recent years, a new *linear* method for compressing high-dimensional data has been discovered. For an n-dimensional vector x, its *sketch* is equal to Ax, where A is an m x n matrix, possibly chosen at random. Although typically the sketch length m is much smaller than the number of dimensions n, the sketch often contains plenty of information about x.

A particularly useful and well-studied problem is that of sparse recovery: given Ax, recover a k-sparse vector x* (with at most k non-zeros) such that ||x-x*||_p <= C min_{k-sparse x'} ||x-x'||_q for some norm parameters p and q and an approximation factor C=C(k). Sparse recovery has numerous applications to areas such as data stream computing and compressed sensing.

It is known that there exist matrices A that achieve the above guarantee with p=q=1 and constant C (in fact, even somewhat stronger bounds), with sketch length m=O(k log(n/k)). However, perhaps surprisingly, it is not known if this bound can be improved any further, even though matching lower bounds are known for specific *algorithms*, specific *matrix types*, or other recovery scenarios (e.g., involving measurement noise). The lack of tight lower bounds represents a major gap in our understanding of the problem.

In this paper we make significant progress on this issue. In particular we show that:
- in the deterministic case, where we require one matrix A to work for all signals x, we show a tight lower bound of Omega(k log(n/k)), thus resolving the issue completely;
- in the randomized case, where we require that a matrix A chosen at random from some distribution should work for a *fixed* vector x with probability at least 1-1/n, we show a lower bound of Omega(log n / log log n).
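The paper is about lower bounds rather than algorithms, but the setup it describes is easy to see in code. Below is a minimal numerical sketch of that setup, assuming a random Gaussian sketching matrix and using orthogonal matching pursuit as a stand-in recovery procedure; neither choice comes from the paper, and the function `omp`, the constant in m, and the problem sizes are all illustrative assumptions:

```python
# Minimal sketch of the sparse recovery setup from the abstract above:
# sketch y = Ax with a random m x n matrix A, with m ~ k log(n/k),
# then recover a k-sparse approximation x* (here via orthogonal matching
# pursuit, used purely for illustration -- it is not the paper's subject).
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: recover a k-sparse x from y = Ax."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, k = 1000, 10
m = int(2 * k * np.log(n / k))            # sketch length m = O(k log(n/k))
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian sketching matrix

x = np.zeros(n)                           # a k-sparse signal
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

y = A @ x                                 # the sketch: m numbers instead of n
x_star = omp(A, y, k)
print("recovery error:", np.linalg.norm(x - x_star))
```

On typical runs the recovery error is near zero, which is a small illustration of why m = O(k log(n/k)) is the benchmark that the paper's lower bounds target.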
Sometimes, we think a term means something when in fact it means the opposite:
- The Mathematics and Image Analysis (M.I.A.) Conference 2009 will be held on December 14-16 at IHP in Paris, and will feature several sparsity and compressed sensing related talks. The program is here.
It really means something. "C'est une solution de continuité" (literally "it is a solution of continuity"; in French, a "solution de continuité" actually means a break in continuity, the opposite of what the words suggest).
Laurent,
I did not get that one!
Igor.