Optimization of noisy blackboxes with adaptive precision
Pierre-Yves Bouchet, Charles Audet, Sébastien Le Digabel, Stéphane Alarie
In derivative-free optimization, the objective may be evaluated through a computer program which returns a stochastic output with tunable variance. Since a low variance entails a high cost per evaluation, we propose an approach which allows the variance to remain high. Promising results on an industrial problem are presented.
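The trade-off described above can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the oracle, the quadratic objective, and the cost model (cost inversely proportional to variance) are all assumptions made for the example.

```python
import random

def noisy_blackbox(x, variance):
    # Hypothetical stochastic oracle: an illustrative true objective
    # (here a quadratic) plus Gaussian noise with the requested variance.
    return (x - 1.0) ** 2 + random.gauss(0.0, variance ** 0.5)

def evaluate(x, variance, cost_per_unit=1.0):
    # Assumed cost model: halving the variance doubles the cost, so
    # keeping the variance high keeps each evaluation cheap.
    cost = cost_per_unit / variance
    return noisy_blackbox(x, variance), cost

estimate, cost = evaluate(0.5, variance=1.0)  # cheap but noisy estimate
```

An adaptive-precision method would call `evaluate` with a high variance by default and request a lower variance only when two candidate points become too close to distinguish.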
Derivative-free methods for mixed-integer nonsmooth constrained optimization problems
Stefano Lucidi, Tommaso Giovannelli, Giampaolo Liuzzi, Francesco Rinaldi
In this paper, we propose new linesearch-based methods for mixed-integer nonsmooth constrained optimization problems when first-order information on the problem functions is not available. First, we describe a general framework for mixed-integer bound-constrained problems. Then we use an exact penalty approach to tackle the presence of nonlinear (possibly nonsmooth) constraints. We analyze the global convergence properties of all the proposed algorithms toward Clarke stationary points, and we report results of preliminary numerical experiments on a set of mixed-integer problems.
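The exact penalty idea mentioned above can be illustrated with a minimal sketch, not the paper's actual method: violations of the nonlinear constraints are folded into the objective, scaled by a penalty parameter, so that only the bound constraints remain. The objective, constraint, and parameter value below are assumptions chosen for the example.

```python
def exact_penalty(f, constraints, eps):
    # Hypothetical exact-penalty reformulation: the violation of each
    # nonlinear constraint g_i(x) <= 0 is added to the objective,
    # scaled by 1/eps, yielding a bound-constrained subproblem.
    def penalized(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + violation / eps
    return penalized

f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0] - x[1]      # feasible when x[0] + x[1] >= 1
p = exact_penalty(f, [g], eps=0.1)

p([1.0, 0.0])  # feasible point: penalty vanishes, value equals f
p([0.0, 0.0])  # infeasible point: f plus violation / eps
```

For a sufficiently small `eps`, minimizers of the penalized problem coincide with minimizers of the original constrained problem, which is what makes the penalty "exact".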
A Merit Function Approach for Evolution Strategies
In this talk, we present a class of globally convergent evolution strategies for solving relaxable constrained optimization problems. The proposed framework handles relaxable constraints using a merit function approach combined with a specific restoration procedure. The introduced extension guarantees global convergence of the considered class of evolution strategies to first-order stationary points. Comparison tests are carried out using two sets of known constrained optimization problems.
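A merit function in this setting ranks candidate points by combining objective value and constraint violation. The sketch below is an assumption-laden illustration, not the presented framework: a generic merit function used to select offspring in a simple (1, lambda) evolution strategy step, with an invented objective, constraint, and weight `mu`.

```python
import random

def merit(x, f, constraints, mu):
    # Hypothetical merit function: objective plus a weighted sum of
    # constraint violations (g_i(x) <= 0 holds at feasible points).
    violation = sum(max(0.0, g(x)) for g in constraints)
    return f(x) + mu * violation

def es_step(parent, sigma, f, constraints, mu=10.0, offspring=8):
    # One (1, lambda)-ES iteration: sample offspring around the parent
    # and return the one with the best (lowest) merit value.
    candidates = [parent + sigma * random.gauss(0.0, 1.0)
                  for _ in range(offspring)]
    return min(candidates, key=lambda x: merit(x, f, constraints, mu))

f = lambda x: (x - 2.0) ** 2
g = lambda x: x - 1.5                # feasible when x <= 1.5
best = es_step(parent=0.0, sigma=0.5, f=f, constraints=[g])
```

A restoration procedure, as mentioned in the abstract, would kick in when all offspring are badly infeasible, steering the search back toward the feasible region before normal selection resumes.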