Handling Expensive Optimization with Large Noise
We present lower and upper bounds on runtimes for expensive noisy optimization problems, where runtimes are expressed in terms of the number of fitness evaluations. The fitnesses considered are monotonic transformations of the sphere function. The analysis focuses on the common case of fitness functions that are quadratic in the distance to the optimum in the neighbourhood of this optimum; it nonetheless also holds for any monotonic polynomial of degree p > 2. Upper bounds are derived via a bandit-based estimation of distribution algorithm, called R-EDA, that relies on Bernstein races. It is an evolutionary algorithm in the sense that it relies on selection and random mutations, with a distribution (updated at each iteration) for generating new individuals. The algorithm is known to be consistent (i.e. it converges to the optimum asymptotically in the number of examples) even in non-differentiable cases. Here we show that: (i) if the variance of the noise decreases to 0 around the optimum, R-EDA performs well for quadratic transformations of the distance to the optimum; (ii) otherwise, its convergence rate is slower than the one exhibited empirically by Quadratic Logistic Regression (QLR), an algorithm based on surrogate models, although QLR requires a probabilistic prior on the fitness class.
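To illustrate the race mechanism underlying the upper bounds, here is a minimal sketch of a Bernstein race for selecting the best of a finite set of candidate points under noisy fitness evaluations. This is not the authors' R-EDA itself (which embeds such races inside a distribution-update loop); the function names, the confidence-bound constants, and the stopping rule are illustrative assumptions, using an empirical-Bernstein-style radius:

```python
import math
import random

def bernstein_radius(samples, delta, value_range):
    # Empirical Bernstein-style confidence radius: variance term that
    # shrinks like 1/sqrt(n) plus a range term that shrinks like 1/n.
    # Constants here are illustrative, not those of the paper.
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    log_term = math.log(3.0 / delta)
    return math.sqrt(2.0 * var * log_term / n) + 3.0 * value_range * log_term / n

def bernstein_race(noisy_fitness, candidates, delta=0.05,
                   value_range=1.0, max_evals=10_000):
    """Race candidate points (minimization): resample every survivor,
    then eliminate any candidate whose lower confidence bound exceeds
    the smallest upper confidence bound among the survivors."""
    samples = {c: [] for c in candidates}
    alive = list(candidates)
    evals = 0
    while len(alive) > 1 and evals < max_evals:
        for c in alive:
            samples[c].append(noisy_fitness(c))
            evals += 1
        if len(samples[alive[0]]) < 2:
            continue  # need at least two samples for a variance estimate
        bounds = {}
        for c in alive:
            mean = sum(samples[c]) / len(samples[c])
            radius = bernstein_radius(samples[c], delta, value_range)
            bounds[c] = (mean - radius, mean + radius)
        best_upper = min(ub for (_, ub) in bounds.values())
        alive = [c for c in alive if bounds[c][0] <= best_upper]
    # Return the survivor with the best empirical mean.
    return min(alive, key=lambda c: sum(samples[c]) / len(samples[c]))
```

For instance, racing the points 0.0, 1.0 and 2.0 on a noisy sphere `x**2 + gauss(0, 0.1)` selects 0.0 once the confidence intervals separate. Because the range term of the Bernstein bound decays like 1/n, candidates with low-variance observations are eliminated quickly, which is the property exploited in case (i) above, where the noise variance vanishes near the optimum.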