FA-05: Optimization and Artificial Intelligence II
Stream: Optimization and Artificial Intelligence
Room: Pontryagin
Chair(s): Sébastien Gerchinovitz

K-Quant: a non-uniform post-training quantization algorithm
Enrico Civitelli, Leonardo Taccari, Fabio Schoen
Quantization is a simple yet effective way to deploy deep neural networks on resource-limited hardware. Post-training quantization algorithms are particularly attractive because they do not require access to the full training dataset. In this work we explore non-uniform post-training quantization, using an optimization algorithm to minimize the output difference between each compressed layer and the original one. The proposed method significantly reduces the memory required by the neural network without affecting its accuracy.
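The per-layer idea can be illustrated with a minimal, generic sketch (not the authors' K-Quant method): place a small non-uniform codebook over a layer's weights via one-dimensional k-means, then measure the output difference on calibration data. The codebook size `k`, the Lloyd-style clustering, and the random calibration matrices below are all illustrative assumptions.

```python
import numpy as np

def kmeans_1d(values, k, iters=50, seed=0):
    """Cluster scalar weights into k levels (Lloyd's algorithm).

    Non-uniform quantization places codebook levels where the weights
    are dense, rather than on an evenly spaced grid.
    """
    rng = np.random.default_rng(seed)
    levels = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # Assign each weight to its nearest codebook level.
        assign = np.argmin(np.abs(values[:, None] - levels[None, :]), axis=1)
        for j in range(k):
            members = values[assign == j]
            if members.size:
                levels[j] = members.mean()
    assign = np.argmin(np.abs(values[:, None] - levels[None, :]), axis=1)
    return levels, assign

def quantize_layer(W, k=16):
    """Replace each weight by its nearest of k learned levels."""
    levels, assign = kmeans_1d(W.ravel(), k)
    return levels[assign].reshape(W.shape)

# Calibration check: compare layer outputs before and after quantization.
rng = np.random.default_rng(1)
W = rng.normal(size=(32, 64))     # weights of a toy linear layer
X = rng.normal(size=(100, 64))    # calibration inputs
Wq = quantize_layer(W, k=16)
err = np.linalg.norm(X @ W.T - X @ Wq.T) / np.linalg.norm(X @ W.T)
```

With 16 levels, each weight is stored as a 4-bit index plus a tiny shared codebook, while the layer's outputs on the calibration inputs stay close to the original.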

An Adaptive ML-Based Discretization Method for Computing Optimal Experimental Designs
Philipp Seufert, Jan Schwientek, Tobias Seidel, Michael Bortz, Karl-Heinz Küfer
Standard algorithms for the computation of optimal experimental designs (OED) consist of an inner point acquisition and an outer weight optimization. Whereas the weight optimization is a convex problem, the point acquisition is a general non-convex nonlinear program with an implicitly given objective. We present a modification of the common OED solution approach that uses Bayesian optimization to adaptively build a grid of candidate points for determining the optimal design. We prove convergence of the algorithm to a locally optimal continuous design and obtain promising numerical results on real-world problems.
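The convex outer weight optimization can be sketched, for D-optimality on a fixed candidate grid, with the classical multiplicative (Titterington-type) update; this is a standard baseline, not the authors' adaptive Bayesian-optimization scheme, and the grid and regression model below are illustrative.

```python
import numpy as np

def d_optimal_weights(X, iters=1000):
    """Multiplicative update for D-optimal design weights on a fixed grid.

    X: (n, p) matrix whose rows are the candidate design points.
    Returns weights w >= 0 summing to 1 that maximize
    log det M(w), where M(w) = sum_i w_i x_i x_i^T.
    """
    n, p = X.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        M = X.T @ (w[:, None] * X)
        # Variance function d_i = x_i^T M^{-1} x_i; at optimality d_i <= p.
        d = np.einsum('ij,jk,ik->i', X, np.linalg.inv(M), X)
        w = w * d / p            # boosts points where the bound is violated
        w = w / w.sum()
    return w

# Quadratic regression on [-1, 1]: the D-optimal design is known to put
# weight 1/3 on each of the points -1, 0, 1.
t = np.linspace(-1, 1, 21)
X = np.stack([np.ones_like(t), t, t**2], axis=1)
w = d_optimal_weights(X)
```

The update is monotone for the D-criterion, so it serves as a simple reference for the outer problem once the candidate grid is fixed.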

Sparse RBF Regression for the Optimization of Noisy Expensive Functions
Alessio Sortino, Matteo Lapucci, Fabio Schoen
Global optimization problems for black-box functions are usually addressed by building a surrogate model over the data and an acquisition function to decide where to place the next observation. When observations are noisy, the surrogate should not fit them too closely; this typically introduces an extra hyperparameter into the model, corresponding to the variance of the noise. In this work we present a novel approach in which a robust RBF-based surrogate model is built from the solution of a particular mixed-integer quadratic programming (MIQP) problem. Experimental results show the effectiveness of our approach with respect to existing methods.
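The noise-variance hyperparameter mentioned above appears explicitly in the standard regularized RBF baseline, sketched below (the authors' MIQP-based construction is not reproduced here); the Gaussian kernel width `gamma`, the regularization strength `lam`, and the test function are illustrative choices.

```python
import numpy as np

def rbf_fit(X, y, gamma=10.0, lam=1e-2):
    """Fit a Gaussian-RBF surrogate with ridge regularization.

    lam is the extra hyperparameter tied to the noise variance: it keeps
    the surrogate from interpolating noisy observations exactly.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def rbf_predict(X_train, coef, X_new, gamma=10.0):
    """Evaluate the fitted surrogate at new points."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ coef

# Noisy observations of a smooth 1-D function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=40)
coef = rbf_fit(X, y)
Xg = np.linspace(-1, 1, 50)[:, None]
pred = rbf_predict(X, coef, Xg)
```

Larger `lam` smooths more aggressively; choosing it well is exactly the tuning burden that a hyperparameter-free robust construction aims to remove.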

On the optimality of the Piyavskii-Shubert algorithm for global Lipschitz optimization: a bandit perspective
Clément Bouttier, Sébastien Gerchinovitz, Tommaso Cesari
We consider the problem of maximizing a non-concave Lipschitz multivariate function over a compact domain by sequentially querying its (possibly perturbed) values. We study a natural algorithm originally designed by Piyavskii and Shubert in 1972, for which we prove new bounds on the number of evaluations of the function needed to reach or certify a given optimization accuracy. Our analysis uses a bandit-optimization viewpoint and solves an open problem from Hansen et al. (1991), by bounding the number of evaluations to certify a given accuracy with a simple and optimal integral.
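In one dimension the Piyavskii-Shubert algorithm admits a short sketch: maintain the Lipschitz upper envelope of the samples, evaluate the function at the envelope's peak, and stop once the gap between the peak and the best observed value certifies the accuracy. The tolerance, budget, and test function below are illustrative.

```python
import math

def envelope_peak(pts, L):
    """Largest value of the Lipschitz upper envelope and its location.

    For sorted samples, the envelope min_i (f(x_i) + L|x - x_i|) peaks at
    the intersection of the two cones between consecutive sample points.
    """
    best_x, best_u = None, -math.inf
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        u = (y0 + y1) / 2 + L * (x1 - x0) / 2
        if u > best_u:
            best_u = u
            best_x = (x0 + x1) / 2 + (y1 - y0) / (2 * L)
    return best_x, best_u

def piyavskii_shubert(f, a, b, L, tol=1e-3, max_evals=200):
    """Maximize an L-Lipschitz f on [a, b] to certified accuracy tol."""
    pts = sorted([(a, f(a)), (b, f(b))])
    while len(pts) < max_evals:
        x_next, u = envelope_peak(pts, L)
        y_best = max(y for _, y in pts)
        if u - y_best <= tol:      # certificate: max f <= y_best + tol
            break
        pts.append((x_next, f(x_next)))
        pts.sort()
    return max(pts, key=lambda p: p[1])

# Example: maximize a 1-Lipschitz function with peak at x = 0.3.
x_star, y_star = piyavskii_shubert(lambda t: -abs(t - 0.3), 0.0, 1.0, L=1.0)
```

The number of evaluations spent before the certificate fires is exactly the quantity the bounds in the talk control.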