Tuesday, August 20, 2019

Transfer Learning as a Tool for Reducing Simulation Bias: Application to Inertial Confinement Fusion

** Nuit Blanche is now on Twitter: @NuitBlog **

Using transfer learning to guide exploration in expensive Inertial Confinement Fusion experiments may be one of the few practical ways to speed up the search for the right parameters in this nuclear fusion quest.

Transfer Learning as a Tool for Reducing Simulation Bias: Application to Inertial Confinement Fusion, by B. Kustowski, Jim A. Gaffney, Brian K. Spears, Gemma J. Anderson, Jayaraman J. Thiagarajan, and Rushil Anirudh

We adapt a technique, known in the machine learning community as transfer learning, to reduce the bias of a computer simulation using very sparse experimental data. Unlike Bayesian calibration, which is commonly used to estimate the simulation bias, transfer learning involves training an artificial neural network surrogate model of the simulations. Assuming that the simulation code correctly predicts trends in the experimental data but is subject to unknown biases, we then partially retrain, or transfer learn, the initial surrogate model to match the experimental data. This process eliminates the bias while still taking advantage of the physics relations learned from the simulation. Transfer learning can be easily adapted to a wide range of problems in science and engineering. In this paper, we carry out numerical tests to investigate the applicability of this technique to predicting inertial confinement fusion experiments under new conditions. Using our synthetic validation data set, we demonstrate that an accurate predictive model can be built by retraining an initial surrogate model with experimental data volumes so small that they are relevant to the inertial confinement fusion problem. This opens up new opportunities for knowledge transfer and building predictive models in physics. After implementing transfer learning in a standard neural network, we successfully extended the method to a more complex, generative adversarial network architecture, which will be needed for predicting not only scalars but also diagnostic images in our future work.
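The core recipe in the abstract can be sketched in a few lines: fit a neural-network surrogate to abundant (but biased) simulation data, then freeze the hidden layers and retrain only the output layer on a handful of experimental points. The toy "simulation" and "experiment" functions, network sizes, and learning rates below are all made-up stand-ins for illustration, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in physics (not from the paper): the "simulation"
# captures the right trend but carries an unknown scale/offset bias
# relative to the "experiment".
def simulate(x):
    return np.sin(3 * x)

def experiment(x):
    return 0.8 * np.sin(3 * x) + 0.3

def forward(X, W1, b1, W2, b2):
    """Two-layer tanh network: returns hidden features and prediction."""
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

# Step 1: train a surrogate on abundant simulation data.
X_sim = rng.uniform(-1, 1, size=(2000, 1))
Y_sim = simulate(X_sim)

n_hidden = 16
W1 = rng.normal(0.0, 2.0, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.1, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(4000):
    H, Y_hat = forward(X_sim, W1, b1, W2, b2)
    G = 2.0 * (Y_hat - Y_sim) / len(X_sim)          # dMSE/dY_hat
    GH = (G @ W2.T) * (1.0 - H ** 2)                # backprop through tanh
    W2 -= lr * (H.T @ G);      b2 -= lr * G.sum(axis=0)
    W1 -= lr * (X_sim.T @ GH); b1 -= lr * GH.sum(axis=0)

# Baseline: how badly does the biased surrogate miss the experiment?
X_test = np.linspace(-1, 1, 200).reshape(-1, 1)
_, Y0 = forward(X_test, W1, b1, W2, b2)
err_before = float(np.sqrt(np.mean((Y0 - experiment(X_test)) ** 2)))

# Step 2: transfer learn on ~10 sparse experimental points. Freeze W1, b1
# (the learned "physics relations"); retrain only the output layer.
X_exp = rng.uniform(-1, 1, size=(10, 1))
Y_exp = experiment(X_exp)
for _ in range(4000):
    H, Y_hat = forward(X_exp, W1, b1, W2, b2)
    G = 2.0 * (Y_hat - Y_exp) / len(X_exp)
    W2 -= lr * (H.T @ G); b2 -= lr * G.sum(axis=0)  # W1, b1 stay frozen

_, Y1 = forward(X_test, W1, b1, W2, b2)
err_after = float(np.sqrt(np.mean((Y1 - experiment(X_test)) ** 2)))
print(f"RMSE vs experiment before/after transfer: {err_before:.3f} / {err_after:.3f}")
```

Because only the last layer is retrained, ten experimental points are enough to correct the bias, while the frozen hidden layer keeps the trend learned from thousands of cheap simulations.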

Follow @NuitBlog or join the CompressiveSensing Reddit, the Facebook page, the Compressive Sensing group on LinkedIn  or the Advanced Matrix Factorization group on LinkedIn

Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email.

Other links:
Paris Machine Learning: Meetup.com || @Archives || LinkedIn || Facebook || @ParisMLGroup
About LightOn: Newsletter || @LightOnIO || on LinkedIn || on CrunchBase || our Blog
About myself: LightOn || Google Scholar || LinkedIn || @IgorCarron || Homepage || ArXiv
