Monday, December 19, 2016

Sparse Label Propagation



Alex just sent me the following:

Hi Igor, 
I just wanted to draw your attention to our recent manuscript “Sparse Label Propagation” https://arxiv.org/abs/1612.01414 which formulates the good old null-space property of sparse signal recovery in terms of cuts. On a higher-level it is another effort to combine compressed sensing and complex network techniques for mastering big data over networks. I thought it could be interesting for your CS blog. 
Thanks, 
Alex
Thanks, Alex! Here is the paper: Sparse Label Propagation by Alexander Jung
We consider massive heterogeneous datasets with intrinsic network structure, i.e., big data over networks. These datasets can be modelled by graph signals, which are defined over large-scale irregular graphs representing complex networks. We show that (semi-supervised) learning of the entire underlying graph signal, based on the incomplete information provided by a few initial labels, can be reduced to a compressed sensing recovery problem within the cosparse analysis model. This reduction provides two things: First, it allows us to apply highly developed compressed sensing methods to the learning problem. In particular, by implementing a recent primal-dual method for convex optimization, we obtain a sparse label propagation algorithm. Moreover, by casting the learning problem within compressed sensing, we are able to derive sufficient conditions on the graph structure and the available label information such that sparse label propagation is accurate.
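To make the recovery step concrete, here is a minimal numerical sketch of the idea described in the abstract: treat the few known labels as hard constraints and minimize the total variation of the graph signal with a Chambolle-Pock style primal-dual iteration. The toy graph, the hard label constraints, and all names (incidence_matrix, sparse_label_propagation, labeled_idx, n_iter) are my own assumptions for illustration, not the author's implementation.

```python
# Sketch of sparse label propagation as graph total-variation minimization,
# solved with a Chambolle-Pock style primal-dual method. Toy setup only;
# not the author's code.
import numpy as np

def incidence_matrix(n_nodes, edges, weights=None):
    """Weighted edge-node incidence matrix D, with (D x)_e = w_e (x_i - x_j)."""
    weights = np.ones(len(edges)) if weights is None else np.asarray(weights)
    D = np.zeros((len(edges), n_nodes))
    for e, (i, j) in enumerate(edges):
        D[e, i], D[e, j] = weights[e], -weights[e]
    return D

def sparse_label_propagation(D, labeled_idx, labels, n_iter=1000):
    """Approximately solve  min_x ||D x||_1  s.t.  x[labeled_idx] = labels."""
    n_edges, n_nodes = D.shape
    L = np.linalg.norm(D, 2)                  # operator norm of D
    tau = sigma = 1.0 / L                     # step sizes with tau * sigma * L^2 <= 1
    x = np.zeros(n_nodes)
    x[labeled_idx] = labels
    x_bar, u = x.copy(), np.zeros(n_edges)
    for _ in range(n_iter):
        u = np.clip(u + sigma * (D @ x_bar), -1.0, 1.0)   # dual step: project onto l_inf ball
        x_new = x - tau * (D.T @ u)                        # primal gradient step
        x_new[labeled_idx] = labels                        # enforce the known labels
        x_bar = 2 * x_new - x                              # extrapolation
        x = x_new
    return x

# Toy graph: two tightly connected 4-node clusters joined by one bridge edge,
# with a single labeled node in each cluster.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),        # cluster A
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),        # cluster B
         (3, 4)]                                                # bridge ("cut" edge)
D = incidence_matrix(8, edges)
x_hat = sparse_label_propagation(D, labeled_idx=[0, 7], labels=[0.0, 1.0])
print(np.round(x_hat, 2))   # expected: roughly 0 on cluster A, roughly 1 on cluster B
```

In this toy example the recovered signal should be essentially piecewise constant, with the entire jump concentrated on the single bridge edge; that is the cut-based intuition the email and abstract allude to, here only as an illustration under the stated assumptions.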




