Workshop on Analysis Operator Learning vs. Dictionary Learning: Fraternal Twins in Sparse Modeling
Abstract
Exploiting structure in data is crucial for the success of many techniques in neuroscience, machine learning, signal processing, and statistics. In this context, the fact that data of interest can be modeled via sparsity has proven extremely valuable. As a consequence, numerous algorithms, either aiming at learning sparse representations of data or exploiting sparse representations in applications, have been proposed within the machine learning and signal processing communities over the last few years.
This workshop aims at highlighting the differences, commonalities, advantages, and disadvantages of the analysis and synthesis data models. The workshop will provide a venue for discussing the pros and cons of the two approaches in terms of scalability, ease of learning, and, most importantly, applicability to problems in machine learning such as classification, recognition, data completion, source separation, etc. The targeted group of participants ranges from researchers in machine learning and signal processing to mathematicians. All participants of the workshop will gain a deeper understanding of the duality of the two approaches for modeling data and a clear view of which model is best suited for which applications. Moreover, further research directions will be identified that address the usability of the analysis operator approach for problems arising in machine learning as well as important theoretical questions related to the connection of the two fraternal twins in sparse modeling.
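As a quick reminder of the two models the workshop contrasts (a standard textbook formulation, not taken from the announcement itself): in the synthesis model a signal $x$ is represented as $x \approx D\alpha$ with a learned dictionary $D$ and a sparse coefficient vector $\alpha$, whereas in the analysis model the signal itself is assumed to become sparse under a learned operator, i.e. $\Omega x$ has few nonzero entries.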
Topics of Interest
- Dictionary Learning
- Analysis Operator Learning
- Joint Learning and Classification/Recognition
- Task Oriented Learning of Sparse Representations
- Theory of Analysis Operators and Dictionaries
- Sparse Models for Data Completion
- Multimodal Dictionary/Analysis Operator Learning
- Optimization for Learning Dictionaries and Analysis Operators
General Information
- Workshop Date: December 7/8, 2012
- Workshop Location: Lake Tahoe, Nevada, USA, held in conjunction with the NIPS 2012 conference
- Submission Deadline: September 16, 2012
Invited Speakers
- Michael Elad, Technion, Haifa, Israel
- Yann LeCun, New York University, New York, USA
- Lawrence Carin, Duke University, Durham, NC, USA
- Yi Ma, Microsoft Research Asia, Beijing, China
- Bruno A. Olshausen, UC Berkeley, Berkeley, USA
Organizers
- Martin Kleinsteuber, Technische Universität München, München, Germany
- Francis Bach, INRIA, Paris, France
- Rémi Gribonval, INRIA, Rennes, France
- John Wright, Columbia University, New York, USA
- Simon Hawe, Technische Universität München, München, Germany
Peyman Milanfar advertised the following on LinkedIn:
CALL FOR PAPERS: The Fifth IEEE International Conference on Computational Photography (ICCP) 2013
Harvard University, Cambridge, MA, USA
April 19-21, 2013
Website: http://www.iccp13.org/
Important Dates
Paper Submission: December 12, 2012
Supplementary Material Submission: December 19, 2012
Paper Decisions: February 12, 2013
ICCP 2013 seeks high-quality submissions in all areas related to computational photography. We welcome all submissions that introduce new ideas to the field, including, but not limited to, those in the following areas:
- Computational cameras
- Computational illumination
- Computational optics (wavefront coding, compressive optical sensing, digital holography, ...)
- High-performance imaging (high-speed, hyper-spectral, high-dynamic-range, thermal, confocal, ...)
- Multiple images and camera arrays
- Sensor and illumination hardware
- Scientific imaging and videography
- Advanced image processing
- Organizing and exploiting photo/video collections
In addition to papers, ICCP 2013 will have submission paths for talks, posters, and demos.
Please see the conference website: http://www.iccp13.org/ for additional details.
Program Chairs
- David Boas, Harvard University
- Sylvain Paris, Adobe
- Shmuel Peleg, Hebrew University of Jerusalem
- Todd Zickler, Harvard University