Monday, April 27, 2015

Compressed Sensing Petrov-Galerkin Approximations for Parametric PDEs

In response to a question I had on Twitter about the following paper and its connection with uncertainty quantification, Jean-Luc wrote:

Dear Igor,

Sorry for the late reply regarding your comment on Twitter. I preferred to reply by email as I'm guessing I'm going to go over the character limit :)

Our work is directly related to problems in uncertainty quantification. The reason this is not obvious in this small note is that we had a 5-page restriction (mandatory for SampTA 2015) and decided to focus on the sampling/approximation results.

Why does it relate to uncertainty quantification? Consider the case where the 'parameters' y are in fact realizations of a random field with a certain probability distribution (see any of the publications from Cohen, DeVore, and Schwab for more details); then you can recast the uncertainty quantification problem into a parametric one (in a sense): take a PCA / KL decomposition of the random field, and you get a parametric representation. Hence, yes, our results do apply to uncertainty quantification, even though they are phrased from another point of view right now.
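
To make that recasting concrete, here is a minimal numerical sketch (my own illustration, not from the paper): a random field with a squared-exponential covariance kernel is reduced to a finite parameter vector y via a truncated Karhunen-Loève (PCA) expansion. The grid size, correlation length, and truncation level are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch (not from the paper): reduce a random field to a
# finite parameter vector y via a truncated Karhunen-Loeve / PCA expansion.
# Grid size n, correlation length corr_len, and truncation d are arbitrary.
n, corr_len, d = 200, 0.2, 10
x = np.linspace(0.0, 1.0, n)

# squared-exponential covariance kernel on [0, 1]
C = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * corr_len**2))

# KL modes = eigenpairs of the covariance, sorted by decreasing eigenvalue
w, V = np.linalg.eigh(C)                 # eigh returns ascending order
order = np.argsort(w)[::-1]
w, V = np.clip(w[order], 0.0, None), V[:, order]

def kl_field(y, mean=1.0):
    """Parametric field a(x, y) = mean + sum_j sqrt(w_j) * y_j * V_j(x)."""
    return mean + V[:, :d] @ (np.sqrt(w[:d]) * y)
```

The fast decay of the eigenvalues `w` is what makes the truncation to d parameters accurate, and it is this parametric representation a(x, y) that the polynomial approximation results are then applied to.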

We have other manuscripts in preparation that should be ready (at least in preprint form) by late June - I'll let you know when that happens, if you're interested. I will try to write in more detail about the relation to uncertainty quantification and the relevance of our work to this topic.

Let me know if you have any questions,


Jean-Luc also added:

...I forgot to mention, it might be interesting to link to the following papers:
Starting papers:
* Cohen, DeVore, Schwab, Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDEs
* Cohen, DeVore, Schwab, Convergence rates of best N-term Galerkin approximations for a class of elliptic sPDEs
The previous two papers describe the general ideas and first results behind the compressibility of the polynomial chaos expansions of the solution map.

* Cohen, Chkifa, Schwab, Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs

Compressed sensing Petrov-Galerkin:
* Doostan et al., A non-adapted sparse approximation of PDEs with stochastic inputs (first numerical steps)
* Rauhut, Schwab, Compressive sensing Petrov-Galerkin approximation of high-dimensional parametric operator equations (theoretical analysis)
* Bouchot et al., Compressed sensing Petrov-Galerkin approximations for parametric PDEs (direct application of the previous paper)

There has also been quite a bit of research lately on L2 minimization and projections onto polynomial spaces, but I guess that gets a little out of scope here. I'll send you pointers if you're interested.


We consider the computation of parametric solution families of high-dimensional stochastic and parametric PDEs. We review recent theoretical results on sparsity of polynomial chaos expansions of parametric solutions, and on compressed sensing based collocation methods for their efficient numerical computation.
With high probability, these randomized approximations realize the best N-term approximation rates afforded by solution sparsity and are free from the curse of dimensionality, both in terms of accuracy and number of sample evaluations (i.e. PDE solves). Through various examples we illustrate the performance of Compressed Sensing Petrov-Galerkin (CSPG) approximations of parametric PDEs, for the computation of (functionals of) solutions of integral and differential operators on high-dimensional parameter spaces. The CSPG approximations reduce the number of PDE solves, as compared to Monte Carlo methods, while being likewise nonintrusive, and being "embarrassingly parallel", unlike dimension-adaptive collocation or Galerkin methods.
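
As a rough illustration of the compressed-sensing step (a toy stand-in, not the papers' actual method): the idea is to recover a sparse vector of polynomial chaos coefficients from far fewer samples than basis functions, where each sample corresponds to one PDE solve. In the sketch below, a synthetic sparse Chebyshev coefficient vector plays the role of the solution map, and greedy orthogonal matching pursuit stands in for the (weighted) l1 minimization used in the CSPG papers; all sizes are arbitrary.

```python
import numpy as np

# Toy sketch of the compressed-sensing recovery behind CSPG: a synthetic
# sparse coefficient vector and a greedy solver (OMP) stand in for real
# PDE solves and weighted l1 minimization. All sizes are illustrative.
rng = np.random.default_rng(1)
N, s, m = 128, 5, 60                 # basis size, sparsity, "PDE solves"

# Chebyshev system: A[i, j] = T_j(t_i), a bounded orthonormal system,
# sampled at m random parameter values t_i in [-1, 1]
t = rng.uniform(-1.0, 1.0, m)
A = np.cos(np.outer(np.arccos(t), np.arange(N)))

# synthetic sparse "polynomial chaos" coefficient vector
c = np.zeros(N)
support = rng.choice(N, s, replace=False)
c[support] = rng.uniform(1.0, 2.0, s) * rng.choice([-1.0, 1.0], s)
b = A @ c                            # "solution samples" u(t_i)

def omp(A, b, k):
    """Greedy sparse recovery: pick the best-correlated column k times."""
    norms = np.linalg.norm(A, axis=0)
    idx, r = [], b.copy()
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ r) / norms)))
        coef, *_ = np.linalg.lstsq(A[:, idx], b, rcond=None)
        r = b - A[:, idx] @ coef
    c_hat = np.zeros(A.shape[1])
    c_hat[idx] = coef
    return c_hat

c_hat = omp(A, b, s)                 # recover all 128 coefficients from 60 samples
```

The point mirrored here is the sample-count claim in the abstract: the number of "solves" m is tied to the sparsity s (up to log factors) rather than to the basis size N, which is what frees the method from the curse of dimensionality.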
