JOPT2025
HEC Montreal, 12–14 May 2025

Derivative-free optimization I
May 13, 2025 03:30 PM – 05:10 PM
Location: Accra (Yellow)
Chaired by Théo Denorme
3 Presentations
-
03:30 PM - 03:55 PM
A Trust-Region-Based Bayesian Optimization Method for Mixed-Variable Spaces
Hyper-parameter tuning is a critical yet computationally demanding Black-Box Optimization (BBO) problem, particularly for deep learning tasks. Existing BBO techniques, including Bayesian Optimization (BO), often struggle with mixed-variable search spaces that combine continuous, integer, and categorical variables. In this work, we introduce a mixed-variable BO framework that integrates a Trust-Region (TR) mechanism to alternate between local BO steps (within the vicinity of the trust region) and global BO steps (exploring regions beyond it). The framework not only enhances performance but also ensures asymptotic convergence. It constructs a Gaussian Process tailored to mixed variables and optimizes the acquisition function over the mixed-variable domain to ensure efficient exploration of the search space. The potential of the framework is first illustrated on a set of analytical test cases. It is then applied to a deep learning task: tuning the hyper-parameters of MobileNetV2, a lightweight convolutional neural network, on a garbage classification dataset. Our framework demonstrates improved computational efficiency and model performance compared to classical BO methods.
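To make the local/global alternation concrete, here is a minimal Python/NumPy sketch of trust-region Bayesian optimization over a mixed space. It is an illustration under stated assumptions, not the authors' implementation: the toy objective, the product kernel (RBF on continuous variables, Hamming-based on the categorical one), the random-sampling acquisition proxy, and all names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CAT = 3  # categories of the single categorical variable (an assumption)

def objective(x_cont, x_cat):
    # Toy mixed-variable objective: a quadratic whose optimum shifts per category.
    shift = {0: 0.0, 1: 0.5, 2: -0.5}[int(x_cat)]
    return float(np.sum((x_cont - shift) ** 2))

def kernel(Xc1, Xk1, Xc2, Xk2, ls=0.5):
    # Product kernel: RBF on the continuous part, Hamming-based on the categorical part.
    d2 = ((Xc1[:, None, :] - Xc2[None, :, :]) ** 2).sum(-1)
    ham = (Xk1[:, None] != Xk2[None, :]).astype(float)
    return np.exp(-d2 / (2 * ls ** 2)) * np.exp(-ham)

def gp_posterior(Xc, Xk, y, Zc, Zk, noise=1e-6):
    # Standard GP regression equations with unit prior variance.
    K = kernel(Xc, Xk, Xc, Xk) + noise * np.eye(len(y))
    Ks = kernel(Zc, Zk, Xc, Xk)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def propose(center, radius, local, n=256):
    # Local step: candidates inside the trust region; global step: whole domain.
    if local:
        Zc = rng.uniform(center - radius, center + radius, size=(n, center.size))
    else:
        Zc = rng.uniform(-1.0, 1.0, size=(n, center.size))
    return np.clip(Zc, -1.0, 1.0), rng.integers(0, N_CAT, size=n)

# Initial design, then alternate local and global BO steps.
Xc = rng.uniform(-1, 1, size=(5, 2))
Xk = rng.integers(0, N_CAT, size=5)
y = np.array([objective(c, k) for c, k in zip(Xc, Xk)])
radius = 0.5
for it in range(30):
    local = it % 2 == 0
    Zc, Zk = propose(Xc[np.argmin(y)], radius, local)
    mu, sd = gp_posterior(Xc, Xk, y, Zc, Zk)
    j = np.argmax((y.min() - mu) / sd)  # cheap stand-in for expected improvement
    f_new = objective(Zc[j], Zk[j])
    # Classic TR update: expand on success, shrink on failure.
    radius = min(1.0, 2 * radius) if f_new < y.min() else max(1e-3, radius / 2)
    Xc = np.vstack([Xc, Zc[j]])
    Xk = np.append(Xk, Zk[j])
    y = np.append(y, f_new)
print(f"best value found: {y.min():.4f}")
```

In this toy loop, the trust-region radius doubles after an improving evaluation and halves otherwise; that update, combined with the strict alternation of local and global proposals, is what balances exploitation and exploration.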
-
03:55 PM - 04:20 PM
CatMADS: categorical variables with the MADS algorithm
Solving optimization problems where functions lack explicit expressions and variables involve different types poses significant theoretical and algorithmic challenges. Nevertheless, such settings often occur in simulation-based engineering design and machine learning. In this talk, the mesh adaptive direct search (MADS) algorithm is extended to mixed-variable problems. MADS is a robust derivative-free optimization framework with a well-established convergence analysis for constrained quantitative problems. CatMADS generalizes MADS by incorporating categorical variables, handled via distance-induced neighborhoods. Different types of local minima are introduced for this class of problems, enabling the development of theoretical results. An exhaustive convergence analysis of CatMADS is provided, with flexible choices balancing computational cost and the strength of local optimality. CatMADS integrates the progressive barrier strategy for handling constraints with guarantees. An instance of CatMADS employs cross-validation to construct problem-specific categorical distances. This instance is benchmarked against state-of-the-art solvers on 32 new mixed-variable problems, half of which are constrained. Data profiles show that CatMADS achieves the best results, demonstrating that the framework is empirically efficient in addition to having strong theoretical foundations.
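The flavor of a poll step extended with categorical neighborhoods can be sketched as follows. This is a simplified, assumed illustration rather than CatMADS itself: the poll uses a fixed coordinate basis (the MADS poll uses a richer, asymptotically dense set of directions), the toy objective and category set are invented, and a plain Hamming neighborhood stands in for the learned, cross-validated distances described in the talk.

```python
import numpy as np

CATS = ["a", "b", "c"]

def f(x, c):
    # Toy constrained mixed-variable objective (illustrative only).
    if x.sum() > 1.5:          # crude stand-in for a blackbox constraint
        return np.inf
    bonus = {"a": 0.0, "b": 0.3, "c": -0.2}[c]
    return float(np.sum((x - 0.5) ** 2) + bonus)

def cat_neighbors(c):
    # Distance-induced neighborhood: with one categorical variable, every other
    # category sits at Hamming distance 1; a learned, problem-specific distance
    # (as in the cross-validated instance) would keep only the closest ones.
    return [k for k in CATS if k != c]

def poll(x, c, delta):
    # Continuous poll along a maximal positive basis (a coordinate-search
    # simplification of the MADS poll), plus moves to neighboring categories.
    dirs = np.vstack([np.eye(len(x)), -np.eye(len(x))])
    return [(x + delta * d, c) for d in dirs] + [(x.copy(), k) for k in cat_neighbors(c)]

x, c, delta = np.zeros(2), "a", 1.0
fx = f(x, c)
for _ in range(40):
    improved = False
    for xt, ct in poll(x, c, delta):
        if f(xt, ct) < fx:
            x, c, fx = xt, ct, f(xt, ct)
            improved = True
            break                      # opportunistic poll: stop at first success
    delta = min(1.0, 2 * delta) if improved else delta / 2  # mesh-style update
print(f"x = {x}, category = {c}, f = {fx:.4f}")
```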
-
04:20 PM - 04:45 PM
Adaptive Direct Search algorithms for constrained optimization
This talk addresses the minimization of a nonsmooth objective function subject to general nonsmooth constraints, where the analytical form of the problem may be unavailable, leading to a blackbox optimization setting. The proposed method is a derivative-free optimization approach, suited for situations where derivative information is either inaccessible or costly to obtain. Convergence analyses of directional direct search methods generally require either a sufficient decrease in the objective function value or that trial points belong to a space discretization known as the mesh. Both approaches have their advantages: the sufficient decrease condition allows more flexibility in the placement of trial points, since they need not be projected onto the mesh, while the mesh-based approach ensures that no trial point improving the current solution is wasted, even if the improvement is only marginal. Both criteria are illustrated in the code sketch following this abstract.
In this talk, we present a new direct search algorithm named Adaptive Direct Search (ADS). To accept a trial point, ADS introduces a new criterion based neither on a sufficient decrease of the objective function nor on a mesh, but rather on a new set called the "esh". This new set allows flexible placement of trial points while retaining the theoretical benefits of existing direct search algorithms. Moreover, ADS generalizes practical instances of MADS, a well-established mesh-based direct search method. Numerical results will be presented to illustrate the performance of the method.
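As a point of reference, the two classical acceptance tests discussed above can be sketched in a few lines of Python; the esh-based criterion of ADS is specific to the talk and is not reproduced here. The forcing function rho and all names are illustrative assumptions.

```python
import numpy as np

def accept_sufficient(f_trial, f_best, delta, rho=lambda t: 1e-4 * t ** 2):
    # Sufficient decrease: the trial point may lie anywhere, but it must
    # improve the incumbent by more than a forcing function of the step size.
    return f_trial <= f_best - rho(delta)

def accept_mesh(f_trial, f_best, x_trial, x_anchor, delta):
    # Mesh-based (simple decrease): any strict improvement is kept, however
    # marginal, provided the trial point lies on the mesh anchored at x_anchor.
    steps = (x_trial - x_anchor) / delta
    return bool(np.allclose(steps, np.round(steps)) and f_trial < f_best)

f = lambda x: float(np.sum(x ** 2))
x, delta = np.array([1.0, 1.0]), 0.25
x_trial = x - np.array([delta, 0.0])                      # one mesh step along -e1
print(accept_sufficient(f(x_trial), f(x), delta))         # True: margin beats rho(delta)
print(accept_mesh(f(x_trial), f(x), x_trial, x, delta))   # True: on-mesh strict improvement
```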