Thursday, July 26, 2018

CfP: Call for Papers: Special Issue on Information Theory Applications in Signal Processing

Sergio just sent me the following:
Dear Igor,
Could you please announce in nuit blanche the following call for contributions to our Special Issue.
Best regards,
Sergio
Sure Sergio !
Dear colleagues, 
We are currently editing a Special Issue entitled "Information Theory Applications in Signal Processing" for the journal Entropy (ISSN 1099-4300, IF 2.305). A short prospectus is given at the volume website: 
We would like to invite you to contribute a review or full research paper for publication in this Special Issue, after the standard peer-review procedure, in open-access form.
The official deadline for submission is 30 November 2018; however, you may send your manuscript at any time before then. We can organize a very fast peer review, and if the paper is accepted, it will be published immediately. Please also feel free to distribute this call for papers to colleagues and collaborators.
You can contact the assistant editor, Ms. Alex Liu (alex.liu@mdpi.com), with any questions.
Thank you in advance for considering our invitation.
Sincerely,
Guest Editors:
Dr. Sergio Cruces (http://personal.us.es/sergio/)
Dr. Rubén Martín-Clemente (http://personal.us.es/ruben/)
Dr. Wojciech Samek (http://iphome.hhi.de/samek/)




Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.

Monday, July 23, 2018

Rank Minimization for Snapshot Compressive Imaging - implementation -



Yang just sent me the following:

Hi Igor,

I am writing regarding a paper on compressive sensing you may find of interest, co-authored with Xin Yuan, Jinli Suo, David Brady, and Qionghai Dai. We obtained exciting results on snapshot compressive imaging (SCI), i.e., encoding each frame of an image sequence with a spectral-, temporal-, or angular-variant random mask and summing them pixel-by-pixel to form a one-shot measurement. Snapshot compressive hyperspectral, high-speed, and light-field imaging are representative applications.

We combine rank minimization, which exploits the nonlocal self-similarity of natural scenes (widely acknowledged in image/video processing), with an alternating minimization approach to solve this problem. Results on both simulation and real data from four different SCI systems, where measurement noise is dominant, demonstrate that our proposed algorithm leads to significant improvements (>4 dB in PSNR) and greater robustness to noise compared with current state-of-the-art algorithms.

Paper arXiv link: https://arxiv.org/abs/1807.07837.
Github repository link: https://github.com/liuyang12/DeSCI.

Here is an animated demo for visualization and comparison with the state-of-the-art algorithms, i.e., GMM-TP (TIP'14), MMLE-GMM (TIP'15), MMLE-MFA (TIP'15), and GAP-TV (ICIP'16).
Thanks,
Yang (y-liu16@mails.tsinghua.edu.cn)


Thanks Yang !

Snapshot compressive imaging (SCI) refers to compressive imaging systems where multiple frames are mapped into a single measurement, with video compressive imaging and hyperspectral compressive imaging as two representative applications. Though exciting results of high-speed videos and hyperspectral images have been demonstrated, the poor reconstruction quality precludes SCI from wide applications. This paper aims to boost the reconstruction quality of SCI via exploiting the high-dimensional structure in the desired signal. We build a joint model to integrate the nonlocal self-similarity of video/hyperspectral frames and the rank minimization approach with the SCI sensing process. Following this, an alternating minimization algorithm is developed to solve this non-convex problem. We further investigate the special structure of the sampling process in SCI to tackle the computational workload and memory issues in SCI reconstruction. Both simulation and real data (captured by four different SCI cameras) results demonstrate that our proposed algorithm leads to significant improvements compared with current state-of-the-art algorithms. We hope our results will encourage the researchers and engineers to pursue further in compressive imaging for real applications.
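The SCI sensing process described above — each frame modulated by its own coded mask, then everything summed pixel-by-pixel into a single snapshot — can be sketched in a few lines of NumPy. The function name and array shapes here are illustrative, not taken from the DeSCI code:

```python
import numpy as np

def sci_forward(frames, masks):
    """SCI forward model (illustrative sketch): modulate each frame by its
    own random mask, then sum all modulated frames pixel-by-pixel into one
    2-D snapshot measurement."""
    # frames, masks: arrays of shape (T, H, W)
    return np.sum(masks * frames, axis=0)

# Toy example: 8 frames of a 4x4 "video", binary random masks.
rng = np.random.default_rng(0)
frames = rng.random((8, 4, 4))
masks = rng.integers(0, 2, size=(8, 4, 4)).astype(float)
y = sci_forward(frames, masks)
print(y.shape)  # (4, 4): one snapshot encodes all 8 frames
```

Reconstruction is then the (much harder) inverse problem of recovering all T frames from the single measurement y and the known masks, which is what the paper's rank-minimization algorithm addresses.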


Thursday, July 19, 2018

CSJob: PhD and Postdoc positions KU Leuven: Optimization frameworks for deep kernel machines


Johan let me know of the following positions in his group:

Dear Igor,
could you please announce this on nuit blanche.
many thanks,
Johan


Sure thing Johan !

PhD and Postdoc positions KU Leuven: Optimization frameworks for deep kernel machines
The research group KU Leuven ESAT-STADIUS is currently offering 2 PhD and 1 Postdoc (1 year, extendable) positions within the framework of the KU Leuven C1 project Optimization frameworks for deep kernel machines (promotors: Prof. Johan Suykens and Prof. Panos Patrinos).
Deep learning and kernel-based learning are among the most powerful methods in machine learning and data-driven modelling. From an optimization and model representation point of view, training of deep feedforward neural networks occurs in a primal form, while kernel-based learning is often characterized by dual representations, in connection to possibly infinite dimensional problems in the primal. In this project we aim at investigating new optimization frameworks for deep kernel machines, with feature maps and kernels taken at multiple levels, and with possibly different objectives for the levels. The research hypothesis is that such an extended framework, including both deep feedforward networks and deep kernel machines, can lead to new important insights and improved results. In order to achieve this, we will study optimization modelling aspects (e.g. variational principles, distributed learning formulations, consensus algorithms), accelerated learning schemes and adversarial learning methods.
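The primal/dual distinction mentioned above can be illustrated with plain ridge regression (a toy sketch, not the project's deep kernel machines): one can solve in the primal over the weight vector, or in the dual over one coefficient per sample via the kernel matrix, and with a linear kernel the two representations give the same model:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(20, 3))   # 20 samples, 3 features
y = rng.normal(size=20)
lam = 0.5                      # ridge regularization parameter

# Primal form: w = (X^T X + lam I)^-1 X^T y, solved over the 3 weights.
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Dual form with the linear kernel K = X X^T: alpha = (K + lam I)^-1 y,
# solved over 20 dual coefficients; the weights are recovered as X^T alpha.
K = X @ X.T
alpha = np.linalg.solve(K + lam * np.eye(20), y)
w_dual = X.T @ alpha

print(np.allclose(w, w_dual))  # True: same model, two representations
```

For nonlinear kernels the primal feature map may be infinite-dimensional, so only the dual is computable — the situation the project description alludes to.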
The PhD and Postdoc positions in this KU Leuven C1 project (promotors: Prof. Johan Suykens and Prof. Panos Patrinos) relate to the following possible topics:
-1- Optimization modelling for deep kernel machines
-2- Efficient learning schemes for deep kernel machines
-3- Adversarial learning for deep kernel machines
For further information and to apply online, see
https://www.kuleuven.be/personeel/jobsite/jobs/54740654 (PhD positions) and
https://www.kuleuven.be/personeel/jobsite/jobs/54740649 (Postdoc position)
(click EN for the English version).
The research group ESAT-STADIUS http://www.esat.kuleuven.be/stadius at KU Leuven, Belgium, provides an excellent research environment and is active in the broad area of mathematical engineering, including data-driven modelling, neural networks and machine learning, nonlinear systems and complex networks, optimization, systems and control, signal processing, bioinformatics and biomedicine.






Friday, July 13, 2018

Phase Retrieval Under a Generative Prior


Vlad just sent me the following: 
Hi Igor,

I am writing regarding a paper you may find of interest, co-authored with Paul Hand and Oscar Leong. It applies a deep generative prior to phase retrieval, with surprisingly good results! We can show that recovery occurs at optimal sample complexity for Gaussian measurements, which in a sense resolves the sparse phase retrieval O(k^2 log n) bottleneck.

https://arxiv.org/pdf/1807.04261.pdf


Best,

-Vlad

Thanks Vlad ! Here is the paper:

Phase Retrieval Under a Generative Prior by Paul Hand, Oscar Leong, Vladislav Voroninski
The phase retrieval problem asks to recover a natural signal y0 ∈ R^n from m quadratic observations, where m is to be minimized. As is common in many imaging problems, natural signals are considered sparse with respect to a known basis, and the generic sparsity prior is enforced via ℓ1 regularization. While successful in the realm of linear inverse problems, such ℓ1 methods have encountered possibly fundamental limitations, as no computationally efficient algorithm for phase retrieval of a k-sparse signal has been proven to succeed with fewer than O(k^2 log n) generic measurements, exceeding the theoretical optimum of O(k log n). In this paper, we propose a novel framework for phase retrieval by 1) modeling natural signals as being in the range of a deep generative neural network G : R^k → R^n and 2) enforcing this prior directly by optimizing an empirical risk objective over the domain of the generator. Our formulation has provably favorable global geometry for gradient methods, as soon as m = O(k d^2 log n), where d is the depth of the network. Specifically, when suitable deterministic conditions on the generator and measurement matrix are met, we construct a descent direction for any point outside of a small neighborhood around the unique global minimizer and its negative multiple, and show that such conditions hold with high probability under Gaussian ensembles of multilayer fully-connected generator networks and measurement matrices. This formulation for structured phase retrieval thus has two advantages over sparsity based methods: 1) deep generative priors can more tightly represent natural signals and 2) information theoretically optimal sample complexity. We corroborate these results with experiments showing that exploiting generative models in phase retrieval tasks outperforms sparse phase retrieval methods.




Monday, July 09, 2018

Nuit Blanche in Review (February - June 2018)


It's been five months already; here is what we featured on Nuit Blanche since the last Nuit Blanche in Review (January 2018). During that time, Mila Nikolova left us. Here are some other things that happened.

Implementations


Meetings:


Posters:

In-depth

Book:

Video:

Job:

Around the blogs:
Conferences:
Paris Machine Learning meetups:
AI in France
Other