Journées de l'optimisation 2019 (JOPT2019)

HEC Montréal, May 13-15, 2019

WA5 Derivative-Free Optimization I

May 15, 2019, 09:00 – 10:15

Room: Marie-Husny

Chaired by Sébastien Le Digabel

3 presentations

  • 09:00 - 09:25

    MADMS: Mesh adaptive direct multisearch for constrained blackbox multiobjective optimization

    • Ludovic Salomon, presenter, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal
    • Jean Bigeon, Univ. Grenoble Alpes, CNRS, Grenoble INP, G-SCOP, F-38000 Grenoble, France

    The context of this work is derivative-free multiobjective optimization in the presence of two or more conflicting objective functions, treated as blackboxes for which no derivative information is available. A new extension of the Mesh Adaptive Direct Search (MADS) algorithm, called MADMS, is proposed. This algorithm keeps a list of non-dominated points that converges to the Pareto front. As with the single-objective MADS algorithm, the method is built around an optional search step and a poll step. Under classical direct search assumptions, it is proved that the algorithm generates multiple subsequences of iterates converging to locally Pareto-stationary points. Computational experiments show that this new approach is promising with respect to other state-of-the-art algorithms.

    Keywords: derivative-free, multiobjective, direct search
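
    A minimal Python sketch of the non-dominated list mechanism described above. This is an illustration, not the authors' implementation: it polls only along coordinate directions (MADS generates an asymptotically dense set of directions), it ignores constraints, and every function and parameter name here is an assumption.

    import random

    def dominates(fa, fb):
        """True if objective vector fa Pareto-dominates fb (minimization)."""
        return all(a <= b for a, b in zip(fa, fb)) and \
               any(a < b for a, b in zip(fa, fb))

    def update_front(front, x, fx):
        """Insert (x, fx) into the list of non-dominated points.
        Returns the updated list and whether the point was kept."""
        if any(dominates(fy, fx) for _, fy in front):
            return front, False                  # dominated: reject
        front = [(y, fy) for y, fy in front if not dominates(fx, fy)]
        front.append((x, fx))
        return front, True

    def multiobjective_poll(F, x0, budget=500, delta=1.0):
        """Keep a list of non-dominated points, poll around one of them on
        the current mesh, and refine the mesh when polling fails."""
        front, _ = update_front([], list(x0), F(x0))
        evals = 1
        while evals < budget and delta > 1e-6:
            x, _ = random.choice(front)          # pick an iterate on the front
            success = False
            for i in range(len(x)):              # poll step: +/- delta moves
                for step in (delta, -delta):
                    y = x[:]
                    y[i] += step
                    front, added = update_front(front, y, F(y))
                    evals += 1
                    success = success or added
            delta = min(2.0 * delta, 1.0) if success else 0.5 * delta
        return front

    # Toy bi-objective problem: two quadratic bowls with distinct minimizers.
    F = lambda x: (x[0]**2 + x[1]**2, (x[0] - 1.0)**2 + x[1]**2)
    pareto_points = multiobjective_poll(F, [2.0, 2.0])

    On this toy problem the list should fill in points between the two minimizers at (0, 0) and (1, 0), a discrete approximation of the Pareto front.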

  • 09:25 - 09:50

    Mesh-based constrained stochastic blackbox optimization using probabilistic estimates

    • Kwassi Joseph Dzahini, presenter, Polytechnique Montréal
    • Charles Audet, GERAD, Polytechnique Montréal
    • Sébastien Le Digabel, GERAD, Polytechnique Montréal

    This work introduces STOMADS, a stochastic variant of the Mesh Adaptive Direct Search (MADS) algorithm, which was originally designed for deterministic blackbox optimization. STOMADS considers the constrained optimization of an unknown objective function f whose values can only be computed with some random noise following an unknown distribution. The proposed algorithm follows an algorithmic framework similar to that of MADS and uses random estimates of the true function values, obtained from their stochastic observations, to certify improvements, since an exact deterministic version of f is not available. These estimates are required to be accurate with a sufficiently large, fixed probability and to satisfy a variance condition. The ability of the proposed algorithm to generate an asymptotically dense set of search directions is then exploited to show, with the help of martingale theory, that it converges to a Clarke stationary point with probability one.
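
    A small Python sketch of the estimate-based acceptance idea from the abstract: average repeated noisy observations so the estimate's variance shrinks, then accept a candidate point only on sufficient estimated decrease tied to the frame size. The sample-averaging scheme, the gamma * delta**2 threshold, and all names are illustrative assumptions, not the paper's exact accuracy and variance conditions.

    import random

    def estimate(noisy_f, x, n_samples=30):
        """Estimate f(x) by averaging i.i.d. noisy observations; averaging
        n samples divides the noise variance by n, one simple way to get
        estimates that are accurate with high probability."""
        return sum(noisy_f(x) for _ in range(n_samples)) / n_samples

    def accept_candidate(noisy_f, incumbent, candidate, delta, gamma=1.0):
        """Accept the candidate only on sufficient *estimated* decrease,
        proportional to the squared frame size delta; comparing estimates
        rather than single observations guards against lucky noise draws."""
        f_inc = estimate(noisy_f, incumbent)
        f_can = estimate(noisy_f, candidate)
        return f_can <= f_inc - gamma * delta**2

    # Toy example: a smooth objective observed through Gaussian noise.
    f_true = lambda x: (x - 1.0)**2
    noisy_f = lambda x: f_true(x) + random.gauss(0.0, 0.1)
    print(accept_candidate(noisy_f, incumbent=2.0, candidate=1.1, delta=0.5))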

  • 09:50 - 10:15

    HYPERNOMAD: Hyper-parameter optimization of deep neural networks using mesh adaptive direct search

    • Sébastien Le Digabel, presenter, GERAD, Polytechnique Montréal
    • Dounia Lakhmiri, Polytechnique Montréal
    • Christophe Tribes, Polytechnique Montréal

    The performance of deep neural networks is highly sensitive to the choice of the hyper-parameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process often described as a "dark art". This work introduces the HYPERNOMAD package, which applies the MADS derivative-free algorithm to this hyper-parameter optimization problem, including the handling of categorical variables. The new approach is tested on the MNIST and CIFAR-10 datasets and achieves results comparable to the current state of the art.
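
    To make the blackbox view concrete, the Python sketch below treats a handful of hyper-parameters, one of them categorical, as the inputs of an expensive error function and runs a budgeted search over them. Everything here is an illustrative stand-in: the mock error surface replaces an actual training run, the random proposals replace the structured search and poll steps a MADS-based tuner performs, and none of this is HYPERNOMAD's actual interface.

    import math
    import random

    def blackbox(num_layers, learning_rate, batch_size, optimizer):
        """Mock validation error standing in for 'train the network and
        evaluate it'; the real evaluation is expensive, which is why a
        derivative-free method with a small budget fits this problem."""
        err = 0.02 * abs(num_layers - 3)                      # depth term
        err += 0.01 * (math.log10(learning_rate) + 2.5) ** 2  # learning rate
        err += 0.005 * abs(math.log2(batch_size) - 6)         # batch size
        err += {"sgd": 0.02, "adam": 0.0, "rmsprop": 0.01}[optimizer]
        return err

    def tune(budget=50):
        """Budgeted blackbox loop: propose a configuration, evaluate it,
        keep the best; a MADS-based tuner would replace the random
        proposals with mesh-structured ones and handle the categorical
        variable through dedicated neighborhood moves."""
        best_err, best_cfg = float("inf"), None
        for _ in range(budget):
            cfg = {
                "num_layers":    random.choice([1, 2, 3, 4, 5]),
                "learning_rate": 10 ** random.uniform(-5.0, -1.0),
                "batch_size":    random.choice([32, 64, 128, 256]),
                "optimizer":     random.choice(["sgd", "adam", "rmsprop"]),
            }
            err = blackbox(**cfg)
            if err < best_err:
                best_err, best_cfg = err, cfg
        return best_err, best_cfg

    print(tune())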
