Speeding up black-box optimization algorithms by learning and using a surrogate model is a heavily studied topic. This paper evaluates two different surrogate models, Gaussian processes and random forests, coupled with the state-of-the-art optimization algorithm CMA-ES.
Results on the BBOB benchmark set show that a considerable number of fitness evaluations can be saved, especially during the initial phase of the algorithm's progress.
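To illustrate the general idea (not the exact method evaluated in this paper), the following minimal Python sketch couples CMA-ES with a random-forest surrogate: in each generation the surrogate ranks the candidate solutions and only the most promising fraction is evaluated on the true objective, while the remaining candidates keep their surrogate estimates. The toy objective, the warm-up size, and the ratio `true_eval_ratio` are illustrative assumptions.

```python
# Minimal sketch of surrogate-assisted CMA-ES (illustrative, not the paper's algorithm).
import numpy as np
import cma                                        # pip install cma
from sklearn.ensemble import RandomForestRegressor

def sphere(x):                                    # toy objective standing in for a BBOB function
    return float(np.sum(np.asarray(x) ** 2))

dim, true_eval_ratio = 10, 0.3                    # illustrative settings
es = cma.CMAEvolutionStrategy(dim * [1.0], 0.5)
archive_X, archive_y = [], []                     # all points evaluated on the true objective
surrogate = RandomForestRegressor(n_estimators=100)

while not es.stop() and es.countiter < 200:
    X = es.ask()                                  # candidate solutions from CMA-ES
    if len(archive_y) >= 2 * dim:                 # enough data: rank candidates by surrogate prediction
        surrogate.fit(np.array(archive_X), np.array(archive_y))
        fitness = list(surrogate.predict(np.array(X)))
        order = np.argsort(fitness)[: max(1, int(true_eval_ratio * len(X)))]
    else:                                         # warm-up: evaluate every candidate truly
        fitness = [None] * len(X)
        order = range(len(X))
    for i in order:                               # true evaluations only for the selected candidates
        fitness[i] = sphere(X[i])
        archive_X.append(list(X[i]))
        archive_y.append(fitness[i])
    es.tell(X, fitness)                           # CMA-ES update uses mixed true/surrogate values
    es.disp()
```

A Gaussian-process surrogate can be substituted by replacing the regressor with `sklearn.gaussian_process.GaussianProcessRegressor`; the saved evaluations come from the `true_eval_ratio` fraction of candidates that bypass the true objective once the surrogate has been warmed up.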