CORS / Journées de l'optimisation (CORS-JOPT 2023)
HEC Montréal, May 29 – 31, 2023
BOII: Blackbox Optimization II
May 31, 2023, 13h30 – 15h10
Room: Transat (yellow)
Chaired by Christophe Tribes
4 presentations

13h30 – 13h55
Scalable Bayesian optimization methods
Bayesian optimization (BO) is a class of global optimization methods that aims to solve expensive blackbox optimization problems efficiently. BO iteratively constructs conditional Gaussian processes and uses them to guide the optimization process.
In this talk, we first present a scalable BO approach that, when available, efficiently exploits additional information on the composite structure of the blackbox optimization problem. Our scalable BO approach shows excellent performance compared to classical BO approaches, and a sensitivity analysis with respect to the dimension and the structure of the blackbox problem confirms the good potential of the proposed approach.
The second part of this talk describes a second improved BO method, called TREGO (trust-region-like efficient global optimization). TREGO alternates between regular BO steps and local steps within a trust region. A set of experiments reveals that TREGO consistently outperforms the regular BO method and is competitive with other state-of-the-art blackbox optimization methods.
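The alternation at the heart of TREGO can be sketched as below. This is only an illustration of the control flow: plain random sampling stands in for the GP-based acquisition optimization of the real method, and the phase-switching and radius-update rules are simplified assumptions, not the published ones.

```python
import random

def trego_sketch(f, bounds, n_iter=60, rho0=0.2, seed=0):
    """Minimal sketch of TREGO's alternation between global and local steps.
    Random sampling stands in for the GP-based acquisition optimization of
    the real method, and the phase-switching rule is simplified."""
    rng = random.Random(seed)
    x_best = [rng.uniform(lo, hi) for lo, hi in bounds]
    f_best = f(x_best)
    rho = rho0                       # relative trust-region radius
    local_phase = False
    for _ in range(n_iter):
        if local_phase:
            # local step: sample inside a trust region around the incumbent
            cand = [min(hi, max(lo, xi + rng.uniform(-rho, rho) * (hi - lo)))
                    for xi, (lo, hi) in zip(x_best, bounds)]
        else:
            # global step: a regular BO-like step over the whole domain
            cand = [rng.uniform(lo, hi) for lo, hi in bounds]
        f_cand = f(cand)
        if f_cand < f_best:
            x_best, f_best = cand, f_cand
            if local_phase:
                rho *= 2.0           # expand the trust region on success
        else:
            if local_phase:
                rho *= 0.5           # shrink it on failure
            local_phase = not local_phase  # switch phase after a failure
    return x_best, f_best

# usage: minimise a shifted sphere on [-1, 1]^2
x_best, f_best = trego_sketch(lambda x: sum((xi - 0.3) ** 2 for xi in x),
                              [(-1.0, 1.0)] * 2)
```

The local phase inherits its candidate points from a region whose radius grows on success and shrinks on failure, which is what lets the method exploit a promising basin without giving up global exploration.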

13h55 – 14h20
Sequential stochastic blackbox optimization
This work considers stochastic optimization problems in which the objective function values can only be computed by a blackbox corrupted by random noise following an unknown distribution. The method developed is based on sequential stochastic optimization (SSO): the original problem is decomposed into a sequence of subproblems. Each of these subproblems is solved using a zeroth-order version of a signed stochastic gradient descent with momentum algorithm (ZO-Signum), with an increasingly fine precision. This decomposition allows a good exploration of the space while maintaining the efficiency of the algorithm once it gets close to the solution. Under a Lipschitz continuity assumption on the blackbox, a convergence rate in expectation is derived for the ZO-Signum algorithm. If, moreover, the blackbox is locally convex around its minima, an almost sure convergence rate can be given for the SSO algorithm. Finally, numerical experiments are conducted to compare the SSO algorithm with other state-of-the-art algorithms and to demonstrate its competitiveness.
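A zeroth-order signed SGD with momentum of the kind described above can be sketched as follows. The two-point finite-difference estimator, the fixed step size, and the smoothing radius `mu` are illustrative assumptions for this sketch, not the schedules analysed in the work:

```python
import math
import random

def zo_signum_sketch(f, x0, n_iter=300, lr=0.02, mu=1e-4, beta=0.9, seed=0):
    """Sketch of a zeroth-order signed SGD with momentum: the gradient is
    estimated by two-point finite differences along a random Gaussian
    direction, and only the sign of the momentum buffer drives the update."""
    rng = random.Random(seed)
    x = list(x0)
    m = [0.0] * len(x)                         # momentum buffer
    for _ in range(n_iter):
        u = [rng.gauss(0.0, 1.0) for _ in x]   # random direction
        # two-point estimate of the directional derivative of f at x
        fp = f([xi + mu * ui for xi, ui in zip(x, u)])
        fm = f([xi - mu * ui for xi, ui in zip(x, u)])
        d = (fp - fm) / (2.0 * mu)
        # momentum update with the gradient estimate d * u
        m = [beta * mi + (1.0 - beta) * d * ui for mi, ui in zip(m, u)]
        # signed step: only the sign of the momentum is used
        x = [xi - lr * math.copysign(1.0, mi) for xi, mi in zip(x, m)]
    return x

# usage: minimise a smooth blackbox with optimum at (1, 1)
x = zo_signum_sketch(lambda v: sum((vi - 1.0) ** 2 for vi in v), [3.0, -2.0])
```

Using only the sign of the (noisy) momentum makes each step's magnitude independent of the scale of the gradient estimate, which is what makes the scheme robust to noise in the blackbox evaluations.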

14h20 – 14h45
Tabu Search Hyperparameter Tuning with Blackbox Optimization
The performance and behaviour of any metaheuristic algorithm largely depend on its hyperparameters. These parameters need to be chosen carefully to achieve the best results, and their impact is even greater on large, real-world-size problems. The best values for these hyperparameters cannot be identified by hand or through trial and error: research shows that hyperparameter tuning is a non-trivial task and that efficient methods are required to obtain the best possible results. We present how blackbox optimization can help choose the tabu search parameters efficiently in a physician scheduling problem. We solve this problem with the Mesh Adaptive Direct Search (MADS) algorithm, which uses no derivative information. Experiments have been conducted and the results will be presented.
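The tuning setup can be pictured as follows: the metaheuristic's cost, as a function of its hyperparameters, is a blackbox that a derivative-free method polls. This sketch uses a plain compass search, a simplified stand-in for MADS, and a synthetic quadratic cost standing in for the physician-scheduling run; both are illustrative assumptions.

```python
def direct_search_tune(blackbox, x0, step=1.0, n_iter=40):
    """Tune parameters by a plain compass search: poll coordinate
    directions and halve the step when no poll point improves.  This is a
    simplified stand-in for MADS, not the actual Mads algorithm."""
    x = list(x0)
    fx = blackbox(x)
    for _ in range(n_iter):
        improved = False
        for i in range(len(x)):              # poll step
            for s in (step, -step):
                cand = list(x)
                cand[i] += s
                fc = blackbox(cand)
                if fc < fx:                  # accept any improving point
                    x, fx, improved = cand, fc, True
        if not improved:
            step *= 0.5                      # refine the mesh on failure
    return x, fx

# synthetic stand-in for "tabu search cost as a function of its parameters"
# (e.g. tenure and neighbourhood size); a real study would run the scheduler
cost = lambda p: (p[0] - 7.0) ** 2 + 0.5 * (p[1] - 20.0) ** 2
best, val = direct_search_tune(cost, [1.0, 1.0])
```

Each evaluation of `blackbox` is expensive in practice (a full metaheuristic run), which is why a derivative-free method that is frugal with evaluations, such as MADS, is the natural tool here.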

14h45 – 15h10
Improving NOMAD with a quadratic programming solver
Our presentation will discuss improvements in the resolution of constrained blackbox optimization (BBO) problems using the mesh adaptive direct search (Mads) algorithm implemented in the Nomad software package. Mads alternates between Search and Poll steps to generate trial points. A quadratic model search uses a subset of all previously obtained blackbox evaluations to build function models. This search requires solving an optimization subproblem, which is currently done with the Mads algorithm itself, but on some specific problems the results are not satisfactory. In particular, the COCO platform bbob-constrained Sphere problem in 10 dimensions with one linear inequality constraint has a single known optimum, yet Mads converges very slowly to the subproblem optimum.
This work focuses on replacing Mads with a quadratic programming (QP) solver for the subproblem. The presentation will describe what happens when solving the Sphere problem. We will then present one candidate QP solver to replace Mads in the search step and its implementation in Nomad. Some preliminary results showing the benefits of using a QP solver will also be given.
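For intuition, a quadratic model subproblem with a single linear inequality can be solved in closed form via the KKT conditions when the model Hessian is diagonal and positive. The sketch below illustrates that idea on a setting mirroring the abstract's Sphere example; it is an assumption-laden toy, not the candidate QP solver discussed in the talk.

```python
def qp_single_ineq_diag(h, g, a, b):
    """Minimise sum_i (0.5 * h[i] * x[i]**2 + g[i] * x[i]) subject to the
    single linear inequality a.x <= b, with every h[i] > 0.  A KKT-based
    sketch of such a subproblem solve -- not Nomad's actual QP solver."""
    x_unc = [-gi / hi for gi, hi in zip(g, h)]       # unconstrained minimiser
    axu = sum(ai * xi for ai, xi in zip(a, x_unc))
    if axu <= b:
        return x_unc                                 # constraint inactive
    hinv_a = [ai / hi for ai, hi in zip(a, h)]       # H^-1 a (diagonal H)
    lam = (axu - b) / sum(ai * vi for ai, vi in zip(a, hinv_a))
    # KKT solution with the constraint active: x = x_unc - lam * H^-1 a
    return [xi - lam * vi for xi, vi in zip(x_unc, hinv_a)]

# usage, mirroring the abstract's setting: a sphere model in 10 dimensions
# with one linear inequality constraint (sum(x) <= 5)
dim = 10
x = qp_single_ineq_diag([1.0] * dim, [-1.0] * dim, [1.0] * dim, 5.0)
```

A dedicated QP solver reaches this kind of solution in one linear-algebra step, whereas a general direct-search method must approach the active constraint through many poll evaluations, which is the slow convergence the abstract describes.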