Compressed Sensing and Bayesian Experimental Design by Mathias Seeger and Hannes Nickisch. The abstract of the presentation reads:
We relate compressed sensing (CS) with Bayesian experimental design and provide a novel efficient approximate method for the latter, based on expectation propagation. In a large comparative study about linearly measuring natural images, we show that the simple standard heuristic of measuring Wavelet coefficients top-down systematically outperforms CS methods using random measurements; the sequential projection optimisation approach of [Ji & Carin 2007] performs even worse. We also show that our own approximate Bayesian method is able to learn measurement filters on full images efficiently which outperform the Wavelet heuristic. To our knowledge, ours is the first successful attempt at "learning compressed sensing" for images of realistic size. In contrast to common CS methods, our framework is not restricted to sparse signals, but can readily be applied to other notions of signal complexity or noise models. We give concrete ideas how our method can be scaled up to large signal representations.
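For readers curious what the "Wavelet coefficients top-down" baseline in the study looks like in practice, here is a minimal sketch, not the authors' code: it measures the first M wavelet coefficients of an image in a (crude) coarse-to-fine ordering and reconstructs from them alone. The choice of PyWavelets, the db4 wavelet, the decomposition level, the synthetic test image, and the square-based coarse-to-fine ordering are all my own illustrative assumptions, not details from the talk.

```python
# Illustrative sketch of the "wavelet top-down" measurement heuristic
# (an assumption-laden toy version, not the authors' experimental setup).
import numpy as np
import pywt

# Synthetic 128x128 piecewise-smooth test image (placeholder for a natural image).
n = 128
y, x = np.mgrid[0:n, 0:n] / n
img = np.sin(4 * np.pi * x) * np.cos(3 * np.pi * y)
img[n // 3: 2 * n // 3, n // 4: 3 * n // 4] += 1.0  # flat patch adds some edges

# Full 2-D wavelet decomposition, flattened into a single coefficient array
# (approximation coefficients end up in the top-left corner).
coeffs = pywt.wavedec2(img, wavelet="db4", level=4)
arr, slices = pywt.coeffs_to_array(coeffs)

# "Top-down" heuristic: keep the coarsest coefficients first. Here this is
# approximated by keeping entries inside a growing square from the top-left.
M = 2000  # number of linear measurements kept
order = np.argsort(np.maximum(*np.mgrid[0:arr.shape[0], 0:arr.shape[1]]), axis=None)
mask = np.zeros(arr.size, dtype=bool)
mask[order[:M]] = True  # first M coefficients, coarse to fine

# Reconstruct from the kept coefficients only and measure the error.
arr_kept = np.where(mask.reshape(arr.shape), arr, 0.0)
recon = pywt.waverec2(
    pywt.array_to_coeffs(arr_kept, slices, output_format="wavedec2"),
    wavelet="db4",
)[:n, :n]

mse = np.mean((recon - img) ** 2)
print(f"Kept {M} of {arr.size} coefficients, MSE = {mse:.2e}")
```

In the talk's comparison, this kind of fixed coarse-to-fine design is the baseline that random CS measurements fail to beat, and that the proposed Bayesian experimental design then improves upon by learning the measurement filters.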
Autonomous Geometric Precision Error Estimation in Low-level Computer Vision Tasks, a work by Andres Corrada-Emmanuel and Howard Schultz, presented by John Paisley from Duke.
The abstract of the paper reads:
Errors in map-making tasks using computer vision are sparse. We demonstrate this by considering the construction of digital elevation models that employ stereo matching algorithms to triangulate real-world points. This sparsity, coupled with a geometric theory of errors recently developed by the authors, allows for autonomous agents to calculate their own precision independently of ground truth. We connect these developments with recent advances in the mathematics of sparse signal reconstruction or compressed sensing. The theory presented here extends the autonomy of 3-D model reconstructions discovered in the 1990s to their errors.
Both videos have been added to the CS video section.