Wednesday, November 01, 2017

CfP: Workshop on Approximating high dimensional functions, Alan Turing Institute, London, 18-19 December 2017.

Hemant just sent me the following:

Dear Igor,

Aretha Teckentrup and I are organizing a workshop on "Approximating high dimensional functions" at the Alan Turing Institute, London, in December. We were wondering if it would be possible for you to post the announcement for this workshop (below) on your blog Nuit Blanche? We would very much appreciate it, thanks a lot in advance!
Many thanks,
Hemant Tyagi

Sure Hemant, here it is:

================= Announcement ==============================

A workshop on "Approximating high dimensional functions" will be held at the Alan Turing Institute, London, on 18-19 December 2017.
The workshop will focus on problems centered on approximating a high dimensional function from limited information, featuring talks by eminent researchers in the fields of multivariate approximation theory, ridge functions, stochastic PDEs and non-parametric regression.

Confirmed speakers are:
Pierre Alquier (ENSAE, Universite Paris-Saclay, France),
Albert Cohen (Université Pierre et Marie Curie, France),
Sergey Dolgov (University of Bath, UK),
Arthur Gretton / Dougal Sutherland (UCL, UK),
Sandra Keiper (Technische Universität Berlin, Germany),
Sebastian Mayer (Universität Bonn, Germany),
Richard Samworth (University of Cambridge, UK),
Jan Vybiral (Czech Technical University, Czech Republic),
Sören Wolfers (KAUST, Saudi Arabia)

The complete program can be found here:
Registration for the workshop is free, but mandatory, and can be done here:
Best regards,
Aretha Teckentrup,
Hemant Tyagi

Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.

1 comment:

Anonymous said...

Off-topic: chaos-cancelling neural network ensembles.
The idea is that the non-linearities in deep neural networks compound (exponentially) layer after layer. A recent paper shows single-pixel attacks on deep networks, which would support the idea of bifurcations along one or several dimensions, implying that chaos theory applies.
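The compounding claim is easy to illustrate numerically: propagate two inputs that differ by a tiny perturbation through a deep random tanh network and watch their distance grow with depth. This is only a sketch with untrained random weights, not anything from the paper; the weight scale of 2/sqrt(d) is an assumption chosen to put the map in the chaotic regime (mean-field analyses place the order-to-chaos transition near scale 1/sqrt(d) for tanh networks with no bias).

```python
import numpy as np

rng = np.random.default_rng(1)
d, depth = 64, 20

x = rng.normal(size=d)
x_pert = x + 1e-6 * rng.normal(size=d)  # tiny perturbation, like a single pixel

a, b = np.tanh(x), np.tanh(x_pert)
dist0 = np.linalg.norm(b - a)
for _ in range(depth):
    # Same random weight matrix applied to both trajectories each layer;
    # scale 2/sqrt(d) sits well inside the chaotic regime for tanh.
    W = rng.normal(scale=2.0 / np.sqrt(d), size=(d, d))
    a, b = np.tanh(W @ a), np.tanh(W @ b)
dist_final = np.linalg.norm(b - a)

print(f"distance at layer 0:  {dist0:.2e}")
print(f"distance at layer {depth}: {dist_final:.2e}")
```

The initial separation of order 1e-6 grows by orders of magnitude over twenty layers, the signature of a positive Lyapunov exponent.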
By using ensembles of diverse neural networks, you should be able to cancel out chaotic responses to low-level Gaussian noise, per the central limit theorem.

It shouldn't add too much extra computational burden, because if you train the ensemble collectively you still get a chaos-cancelling effect using individual networks with fewer weight parameters each.
!topic/artificial-general-intelligence/itUghRNZWN8
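The averaging argument above can be sketched numerically. In this illustration, untrained random two-layer networks stand in for independently trained ensemble members (an assumption, not the commenter's setup): because their sensitivities to input noise point in roughly independent directions, averaging 25 of them shrinks the variance of the response to Gaussian input perturbations by roughly a factor of 25.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(d_in=10, d_hidden=64, seed=0):
    # A tiny random two-layer tanh network standing in for one
    # (hypothetical) independently trained ensemble member.
    r = np.random.default_rng(seed)
    w1 = r.normal(scale=1.0 / np.sqrt(d_in), size=(d_in, d_hidden))
    w2 = r.normal(scale=1.0 / np.sqrt(d_hidden), size=d_hidden)
    return lambda x: np.tanh(x @ w1) @ w2  # scalar output

nets = [make_net(seed=s) for s in range(25)]
x = rng.normal(size=10)

# Responses to many small Gaussian perturbations of the same input.
noise = rng.normal(scale=0.01, size=(1000, 10))
single = np.array([nets[0](x + n) for n in noise])
ensemble = np.array([np.mean([f(x + n) for f in nets]) for n in noise])

print("single-net response variance:    ", single.var())
print("25-net ensemble response variance:", ensemble.var())
```

The reduction depends on the members' noise sensitivities being decorrelated; perfectly correlated members would average to no benefit at all, which is why the "diverse" qualifier in the comment matters.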