SCRO / Journées de l'optimisation

HEC Montréal, May 29–31, 2023



BOI Blackbox Optimization I

May 31, 2023, 10:30 – 12:10

Room: Transat (yellow)

Chaired by Xavier Lebeuf

4 presentations

  • 10:30 – 10:55

    A General Blackbox Optimization Framework for Hyperparameter Optimization in Deep Learning

    • Edward Hallé-Hannan, presenter, Polytechnique
    • Charles Audet, GERAD, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal

    The rise of deep learning has brought new challenges to the blackbox optimization (BBO) community. Notably, tuning the hyperparameters of a deep model is a mixed-variable BBO problem with an unfixed structure. For instance, the number of hidden layers, itself a hyperparameter, affects how many unit-related hyperparameters the architecture of the model contains. Moreover, the hyperparameter optimization (HPO) problem may simultaneously contain categorical, integer and continuous variables. Together, the diversity of variable types and the unfixed structure of the HPO problem create a substantial challenge in a BBO context. To tackle the HPO problem, we carefully developed a mathematical framework that properly models the class of problems of interest. Meta variables are introduced to model variables that determine which other variables are included or excluded. In essence, meta variables model the unfixed structure of the problem, and they may affect the dimension of the problem. The mathematical framework outlines many algorithmic subtleties and implications, which eases the development of optimization methods. A simple HPO problem is illustrated throughout the presentation, and solution strategies for meta and categorical variables are discussed through different approaches, such as direct search, heuristics and Bayesian optimization.
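
As a rough illustration of the kind of search space described above (all names, ranges and choices here are invented for this sketch, not taken from the talk), a meta variable such as the number of hidden layers decides which other hyperparameters exist, so the dimension of a configuration varies:

```python
# Hypothetical sketch of a mixed-variable HPO search space with a meta
# variable: the number of hidden layers ("n_layers") decides how many
# per-layer hyperparameters exist, so the problem dimension is unfixed.
import random

def sample_configuration(rng):
    """Draw one configuration; names and ranges are illustrative only."""
    n_layers = rng.randint(1, 4)  # meta variable (integer)
    return {
        "n_layers": n_layers,
        # one unit count per layer: included/excluded by the meta variable
        "units": [rng.choice([32, 64, 128]) for _ in range(n_layers)],
        "optimizer": rng.choice(["sgd", "adam"]),    # categorical variable
        "learning_rate": 10 ** rng.uniform(-4, -1),  # continuous variable
    }

rng = random.Random(0)
cfg = sample_configuration(rng)
print(cfg)  # the length of "units" depends on the meta variable "n_layers"
```

The point of the sketch is that two sampled configurations need not share the same variables, which is exactly what makes standard fixed-dimension BBO methods inapplicable without the kind of framework the talk proposes.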

  • 10:55 – 11:20

    Erratum, counterexample and an additional revealing poll step for a result of "Analysis of direct searches for discontinuous functions"

    • Pierre-Yves Bouchet, Polytechnique Montréal
    • Charles Audet, presenter, GERAD, Polytechnique Montréal
    • Loïc Bourdin, XLIM - Université de Limoges

    This talk provides a counterexample to a theorem announced in the last part of the paper "Analysis of direct searches for discontinuous functions", Mathematical Programming, Vol. 133, pp. 299–325, 2012. The counterexample involves a one-dimensional objective function f that satisfies all the assumptions required by the theorem but contradicts some of its conclusions. A corollary of this theorem is also affected by the counterexample. The talk also discusses the exactness of each statement in the theorem in general. Finally, it introduces a modification of the directional direct search method (dDSM) that allows one to recover the properties broken by the counterexample. The main flaw revealed by the counterexample is the possibility that a dDSM generates a sequence of trial points (xk) converging to a point x* at which f is discontinuous and whose objective function value f(x*) is strictly less than the limit of the sequence (f(xk)). Moreover, the dDSM generates no trial point in one of the branches of f near x*.
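
To make the phenomenon concrete, here is a generic one-dimensional illustration (not the actual counterexample from the talk): a function f discontinuous at x* = 0 whose value f(x*) lies strictly below the limit of f(xk) along a sequence of trial points that approaches x* from one side only.

```python
# Generic illustration of the phenomenon: f is discontinuous at x* = 0,
# and f(x*) is strictly less than the limit of f(xk) for a sequence of
# trial points converging to x* from the right-hand branch only.
def f(x):
    return -1.0 if x == 0.0 else abs(x)

trial_points = [2.0 ** (-k) for k in range(1, 20)]  # xk -> 0 from the right
values = [f(x) for x in trial_points]
print(values[-1], f(0.0))  # f(xk) tends to 0, yet f(0) = -1
```

A method that only ever samples the right-hand branch can never observe the lower value at x* itself, which is the behaviour the talk's modified poll step is designed to rule out.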

  • 11:20 – 11:45

    Hidden constraint surrogates using Gaussian processes in blackbox optimization

    • Stéphane Jacquet, presenter

    In blackbox optimization, the values of the objective function and the constraints are not given by analytical expressions but by computer simulations or industrial experiments. This leads to a lack of gradients, long computation times for each evaluation, and evaluations that can crash. Such crashes are due to hidden constraints, which are prevalent in real-world problems. In this work, we study the impact of Gaussian process surrogate functions in helping blackbox optimization solvers. We suggest several ways of using such surrogates and present numerical results.
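
A minimal sketch of the idea, assuming a plain Gaussian-process regression fit on binary crash labels (an illustrative setup, not the speaker's implementation): points with a high predicted crash probability can be screened out before spending a full blackbox evaluation on them.

```python
# Sketch: a Gaussian-process surrogate of the crash indicator, used to
# flag points likely to violate hidden constraints. All data are toy.
import numpy as np

def rbf_kernel(A, B, length_scale=0.5):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_crash_probability(X_train, crashed, X_query, noise=1e-6):
    """GP posterior mean on 0/1 crash labels, clipped to [0, 1]."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    k_star = rbf_kernel(X_query, X_train)
    mean = k_star @ np.linalg.solve(K, crashed.astype(float))
    return np.clip(mean, 0.0, 1.0)

# Toy data: evaluations crash whenever x0 > 0.5 (the hidden constraint).
X = np.array([[0.1, 0.2], [0.3, 0.9], [0.7, 0.1], [0.9, 0.8]])
y = np.array([0, 0, 1, 1])
p = gp_crash_probability(X, y, np.array([[0.8, 0.5], [0.2, 0.5]]))
print(p)  # higher crash probability for the first query point
```

In a solver loop, such a surrogate could either veto candidate points outright or reorder them so that likely-crashing evaluations are attempted last.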

  • 11:45 – 12:10

    Hierarchically constrained multi-fidelity blackbox optimization

    • Xavier Lebeuf, presenter

    We propose a multi-fidelity blackbox optimization algorithm that addresses the problem of spending large computational resources on infeasible points when using direct search algorithms. The proposed method is coupled with an existing solver, decreasing the expected time per evaluation while keeping the efficiency and convergence properties of the existing method. This is achieved by estimating the feasibility of candidate points with low-fidelity, hence low-cost, evaluations before deciding whether a point is worth the full time investment. These estimates are given by a hierarchy of constraints that are affected by the multi-fidelity aspect of the blackbox. Numerical tests are conducted on the SOLAR family of blackbox problems, with the NOMAD software as the existing solver: NOMAD with default parameters is compared against NOMAD coupled with the proposed algorithm. These tests show that, given the same time budget, the coupling with the proposed method yields a substantial improvement in solution quality when a feasible starting point is known prior to optimization. Without this condition, the results are mixed and depend largely on properties of the optimized blackbox.
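
The feasibility-gating idea can be sketched as follows; all names here are hypothetical, and the hierarchy is reduced to a single cheap check for brevity:

```python
# Illustrative sketch (names hypothetical): gate each candidate point
# through cheap low-fidelity constraint checks, ordered as a hierarchy,
# before paying for the expensive full-fidelity blackbox evaluation.
def gated_evaluate(x, cheap_constraints, full_blackbox, threshold=0.0):
    """Return the full evaluation of x, or None if a cheap check rejects it.

    cheap_constraints: low-fidelity functions c with c(x) <= threshold
    meaning "estimated feasible", ordered cheapest first (the hierarchy).
    """
    for c in cheap_constraints:
        if c(x) > threshold:  # low-fidelity infeasibility estimate
            return None       # skip the costly full-fidelity evaluation
    return full_blackbox(x)   # point deemed worth the full time investment

# Toy usage: minimize x^2 subject to x >= 1, with a cheap proxy check.
cheap = [lambda x: 1.0 - x]  # c(x) <= 0  <=>  x >= 1
expensive = lambda x: x * x  # stands in for a long simulation
print(gated_evaluate(0.2, cheap, expensive))  # rejected by the cheap check
print(gated_evaluate(1.5, cheap, expensive))  # fully evaluated: 2.25
```

Because rejected points never reach the expensive blackbox, the expected cost per trial point drops, which is the effect the proposed coupling exploits.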