Online Optimization with Costly and Noisy Measurements using Random Fourier Expansions by Laurens Bliek, Hans R. G. W. Verstraete, Michel Verhaegen, Sander Wahls
This paper analyzes DONE, an online optimization algorithm that iteratively minimizes an unknown function using costly and noisy measurements. The algorithm maintains a surrogate of the unknown function in the form of a random Fourier expansion (RFE). The surrogate is updated whenever a new measurement is available, and is then used to determine the next measurement point. The algorithm is comparable to Bayesian optimization algorithms, but its computational complexity per iteration does not depend on the number of measurements. We derive several theoretical results that provide insight into how the hyperparameters of the algorithm should be chosen. The algorithm is compared to a Bayesian optimization algorithm on a benchmark problem and two optics applications, namely, optical coherence tomography and optical beam-forming network tuning. It is found that the DONE algorithm is significantly faster than Bayesian optimization in all three problems, while achieving similar or better performance.
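The RFE surrogate the abstract describes can be sketched in a few lines: the unknown function is approximated by a sum of cosines with randomly drawn frequencies and phases, and only the linear output weights are fitted to the measurements. The sketch below uses a batch ridge least-squares fit for simplicity (the paper's algorithm updates the weights recursively online); the feature count `D`, frequency scale `sigma`, regularization `lam`, and the test function are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not from the paper)
D = 200          # number of random Fourier features
sigma = 1.0      # scale of the random frequencies
lam = 1e-3       # ridge regularization

def rfe_features(X, W, b):
    """Map inputs X of shape (n, d) to RFE features cos(X W^T + b), shape (n, D)."""
    return np.cos(X @ W.T + b)

# Stand-in for a costly, noisy measurement of an unknown function
f = lambda x: np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2
X = rng.uniform(-2, 2, size=(50, 1))
y = f(X) + 0.1 * rng.standard_normal(50)

# Draw random frequencies and phases once; only the weights c are fitted
W = sigma * rng.standard_normal((D, 1))
b = rng.uniform(0, 2 * np.pi, D)
Phi = rfe_features(X, W, b)
c = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ y)

# The fitted surrogate is cheap to evaluate at any candidate point,
# which is what makes it usable for choosing the next measurement.
x_test = np.array([[0.5]])
err = abs(rfe_features(x_test, W, b) @ c - f(x_test))[0]
print(err)
```

Because the surrogate is linear in the weights `c`, each new measurement can update `c` with recursive least squares at a cost independent of the number of past measurements, which is the source of the per-iteration complexity advantage over Bayesian optimization noted in the abstract.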