SCRO / Journées de l'optimisation

HEC Montréal, May 29–31, 2023



BOII Blackbox Optimization II

May 31, 2023, 13:30 – 15:10

Room: Transat (yellow)

Chaired by Christophe Tribes

4 presentations

  • 13:30 – 13:55

    Scalable Bayesian optimization methods

    • Youssef Diouane, presenter, Polytechnique Montréal
    • Jie Gao, HEC
    • Carolina Osorio, HEC Montréal & GERAD

    Bayesian optimization (BO) is a class of global optimization methods that aims to efficiently solve expensive blackbox optimization problems. BO iteratively constructs conditional Gaussian processes and uses them to guide the optimization process.

    In this talk, we first present a scalable BO approach that efficiently exploits, when available, additional information about the composite structure of the blackbox optimization problem. Our scalable BO approach shows excellent performance compared to classical BO approaches, and a sensitivity analysis with respect to the dimension and structure of the blackbox problem confirms its potential.
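
To make the GP machinery behind BO concrete, here is a heavily simplified, self-contained sketch of a BO loop. This is not the authors' scalable method: the toy blackbox `f`, the RBF kernel, the lengthscale, the grid-based acquisition, and all constants are illustrative assumptions. A GP posterior is computed directly with NumPy and a lower-confidence-bound acquisition picks the next evaluation.

```python
import numpy as np

def f(x):
    # toy 1-D "expensive" blackbox (illustrative only)
    return np.sin(3.0 * x) + 0.5 * x

def rbf(a, b, ls=0.3):
    # squared-exponential kernel with unit prior variance
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

rng = np.random.default_rng(3)
X = rng.uniform(-2.0, 2.0, 4)       # initial design
y = f(X)
grid = np.linspace(-2.0, 2.0, 201)  # candidates for the acquisition step

for _ in range(15):
    K = rbf(X, X) + 1e-8 * np.eye(len(X))   # jitter for stability
    Ks = rbf(grid, X)
    mu = Ks @ np.linalg.solve(K, y)         # GP posterior mean
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    lcb = mu - 2.0 * np.sqrt(np.maximum(var, 0.0))
    x_next = grid[np.argmin(lcb)]           # minimize the lower bound
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

best = float(np.min(y))
```

The acquisition trades off low posterior mean against high posterior uncertainty, which is what lets BO spend few blackbox evaluations.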

    The second part of this talk describes a second improved BO method, TREGO (trust-region-like efficient global optimization), which alternates between regular global BO steps and local steps within a trust region. A set of experiments reveals that TREGO consistently outperforms the regular BO method and is competitive with other state-of-the-art blackbox optimization methods.
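
The global/local alternation described above can be caricatured in a few lines. This is only a structural sketch under strong simplifications: the global BO step (acquisition maximization over a fitted GP) is replaced here by uniform random sampling, so only the alternation and the trust-region radius update survive; the blackbox and all constants are invented.

```python
import numpy as np

def f(x):
    # toy expensive blackbox: a shifted sphere (illustrative only)
    return float(np.sum((x - 0.3) ** 2))

rng = np.random.default_rng(0)
dim = 2
best_x = rng.uniform(-1.0, 1.0, dim)
best_f = f(best_x)
f0 = best_f        # initial incumbent value
radius = 0.5       # trust-region radius for local steps

for k in range(60):
    if k % 2 == 0:
        # "global" step: a real TREGO iteration would maximize an
        # acquisition function over the whole domain; we just sample
        cand = rng.uniform(-1.0, 1.0, dim)
    else:
        # "local" step: search only in a trust region around the
        # incumbent, expanding on success and shrinking on failure
        cand = best_x + rng.uniform(-radius, radius, dim)
    fc = f(cand)
    if fc < best_f:
        best_x, best_f = cand, fc
        radius = min(2.0 * radius, 1.0)
    elif k % 2 == 1:
        radius *= 0.5
```

The local steps give fast refinement near the incumbent while the global steps retain BO's exploration guarantees.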

  • 13:55 – 14:20

    Sequential stochastic blackbox optimization

    • Romain Couderc, presenter
    • Charles Audet, GERAD - Polytechnique Montréal
    • Jean Bigeon, Univ. Grenoble Alpes, CNRS, Grenoble INP, G-SCOP, F-38000 Grenoble, France
    • Michael Kokkolaras, Université McGill

    This work considers stochastic optimization problems in which the objective function values can only be computed by a blackbox corrupted by random noise following an unknown distribution. The method developed is based on sequential stochastic optimization (SSO): the original problem is decomposed into a sequence of subproblems, each of which is solved with increasingly fine precision using a zeroth-order version of a sign stochastic gradient descent with momentum algorithm (ZO-Signum). This decomposition allows a good exploration of the space while maintaining the efficiency of the algorithm once it gets close to the solution. Under a Lipschitz continuity assumption on the blackbox, a convergence rate in expectation is derived for the ZO-Signum algorithm. If, moreover, the blackbox is locally convex around its minima, an almost-sure convergence rate may be given for the SSO algorithm. Finally, numerical experiments are conducted to compare the SSO algorithm with other state-of-the-art algorithms and to demonstrate its competitiveness.
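
A bare-bones sketch of the kind of update the abstract describes (our reading, not the authors' code): a two-point zeroth-order gradient estimate on a noisy toy blackbox, an exponential-momentum average, and a sign step. The function, noise level, and all constants are invented, and the single fixed precision here ignores the SSO decomposition into subproblems.

```python
import numpy as np

def noisy_blackbox(x, rng):
    # toy noisy blackbox: a sphere function plus Gaussian noise
    return float(np.sum(x ** 2)) + 0.01 * rng.normal()

rng = np.random.default_rng(1)
dim = 5
x = rng.uniform(-1.0, 1.0, dim)
m = np.zeros(dim)                # momentum accumulator
lr, beta, mu = 0.05, 0.9, 0.1    # step size, momentum, smoothing radius

for _ in range(500):
    u = rng.standard_normal(dim)
    # two-point (zeroth-order) directional gradient estimate
    g = (noisy_blackbox(x + mu * u, rng)
         - noisy_blackbox(x - mu * u, rng)) / (2.0 * mu) * u
    m = beta * m + (1.0 - beta) * g
    x = x - lr * np.sign(m)      # sign step, as in signSGD
```

Taking only the sign of the momentum makes the step size independent of the (noisy) gradient magnitude, which is what makes the scheme robust to an unknown noise distribution.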

  • 14:20 – 14:45

    Tabu Search Hyperparameter Tuning with Blackbox Optimization

    • Nazgol Niroumandrad, presenter, Polytechnique Montréal
    • Nadia Lahrichi, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal

    The performance and behaviour of any metaheuristic algorithm depends largely on its hyperparameters, which must be chosen carefully to achieve the best results. Their impact is even greater on large, real-world problems. The best values for these hyperparameters cannot be identified by hand or by trial and error; research shows that hyperparameter tuning is a nontrivial task and that efficient methods are required to obtain the best possible results. We present how blackbox optimization can help choose tabu search parameters efficiently for a physician scheduling problem. We solve this tuning problem with the Mesh Adaptive Direct Search (MADS) algorithm, which uses no derivative information. Experiments have been conducted and the results will be presented.
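
To illustrate the overall setup, the tuning "blackbox" is "run the tabu search with given hyperparameters and report the resulting schedule cost". The sketch below is a toy: the smooth stand-in objective and the parameter names `tenure` and `neigh` are invented, and the crude coordinate poll is only in the spirit of direct search; the actual work uses the far more sophisticated MADS algorithm.

```python
import numpy as np

def tuner_blackbox(p):
    # stand-in for "run tabu search with hyperparameters p and return
    # the best schedule cost found" -- here a smooth toy surrogate
    tenure, neigh = p
    return (tenure - 7.0) ** 2 + 0.5 * (neigh - 20.0) ** 2

# a crude coordinate/pattern search: poll +/- step along each axis,
# accept strict improvements, halve the step after a failed poll
x = np.array([1.0, 5.0])    # initial (tabu tenure, neighbourhood size)
fx = tuner_blackbox(x)
step = 4.0
while step > 1e-3:
    improved = False
    for d in np.vstack([np.eye(2), -np.eye(2)]):
        cand = x + step * d
        fc = tuner_blackbox(cand)
        if fc < fx:
            x, fx, improved = cand, fc, True
            break
    if not improved:
        step /= 2.0
```

Because the tuner only ever sees function values, the same loop works when each evaluation is a full (expensive, noisy) tabu search run.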

  • 14:45 – 15:10

    Improving NOMAD with a quadratic programming solver

    • Christophe Tribes, presenter, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Tangi Migot, Polytechnique Montréal

    Our presentation will discuss improvements in the resolution of constrained blackbox optimization (BBO) problems using the mesh adaptive direct search (Mads) algorithm implemented in the NOMAD software package. Mads alternates between search and poll steps to generate trial points. A quadratic model search uses a subset of the previously obtained blackbox evaluations to build models of the functions. This search requires solving an optimization subproblem, which is currently done with the Mads algorithm itself, but on some problems the results are not satisfactory. In particular, the COCO platform's BBOB-constrained Sphere problem in 10 dimensions with one linear inequality constraint has a single known optimum, yet Mads converges very slowly to the subproblem optimum.

    This work focuses on replacing Mads with a quadratic programming (QP) solver for the subproblem. The presentation will describe what happens when solving the Sphere problem, then present one candidate QP solver to replace Mads in the search method and its implementation in NOMAD. Some preliminary results showing the benefits of using a QP solver will also be given.
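
The quadratic model search can be mimicked in miniature: fit a full quadratic model to previously evaluated points by least squares, then minimize the model directly, which for an unconstrained convex quadratic reduces to one linear solve (the job a dedicated QP solver does in place of running Mads on the subproblem). The toy blackbox and sample size below are invented, and constraints are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def blackbox(x):
    # toy smooth blackbox standing in for an expensive simulation
    return float((x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2)

# fit a quadratic model m(x) = c + g.x + 0.5 x'Hx to evaluated points
X = rng.uniform(-2.0, 2.0, size=(30, 2))
y = np.array([blackbox(p) for p in X])
# monomial basis: 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
c, g1, g2, h11, h22, h12 = np.linalg.lstsq(A, y, rcond=None)[0]

# minimizing the convex model is a single linear solve: H x = -g,
# where H is the model Hessian recovered from the fitted coefficients
H = np.array([[2.0 * h11, h12], [h12, 2.0 * h22]])
g = np.array([g1, g2])
x_star = np.linalg.solve(H, -g)
```

On a truly quadratic objective the fit is exact and the solve lands on the minimizer immediately, which is why a direct QP solve can be dramatically faster than iterating Mads on the same subproblem.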