03:30 PM - 03:55 PM
Optimization methods for neural network training
Given a set of labeled data points, the optimization problem associated with training a neural network seeks the parameters, e.g., synaptic weights, that minimize the empirical loss between the predicted output and the true output for each given input. The (regularized) problem is nonconvex even when the loss (and the regularization function) is convex. We analyze and compare extended bundle and trust-region methods for a nonconvex loss combined with a possibly nonconvex, possibly nonsmooth regularization term.
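As a minimal illustration of the problem class (not the speakers' bundle or trust-region methods), the following sketch minimizes a regularized empirical loss for a tiny one-hidden-layer network by proximal gradient descent: a gradient step on the smooth nonconvex loss, followed by soft-thresholding for a convex nonsmooth L1 regularizer. The data, network width, step size, and regularization weight are all hypothetical choices for this toy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled data: inputs x, true outputs y = sin(x).
X = np.linspace(-2, 2, 40).reshape(-1, 1)
y = np.sin(X)

h = 8                                   # hidden width (assumption)
W1 = rng.normal(0, 0.5, (1, h))
b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1))
b2 = np.zeros(1)
lam, lr = 1e-3, 0.1                     # L1 weight and step size (assumptions)

def soft_threshold(w, t):
    """Proximal operator of t*||.||_1, handling the nonsmooth term."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def empirical_loss(W1, b1, W2, b2):
    """Mean squared error between predicted and true outputs."""
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((pred - y) ** 2)

initial = empirical_loss(W1, b1, W2, b2)

for _ in range(500):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)
    pred = H @ W2 + b2
    r = 2 * (pred - y) / len(X)         # gradient of the loss w.r.t. pred
    # Manual backward pass for the smooth empirical loss.
    gW2 = H.T @ r
    gb2 = r.sum(0)
    gZ = (r @ W2.T) * (1 - H ** 2)      # through tanh
    gW1 = X.T @ gZ
    gb1 = gZ.sum(0)
    # Gradient step on the smooth part, prox step on the L1 part.
    W1 = soft_threshold(W1 - lr * gW1, lr * lam)
    W2 = soft_threshold(W2 - lr * gW2, lr * lam)
    b1 -= lr * gb1
    b2 -= lr * gb2

final = empirical_loss(W1, b1, W2, b2)
print(initial, final)
```

The composition of layers makes the objective nonconvex even though the squared loss and L1 penalty are each convex, which is exactly the structure the abstract highlights.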
03:55 PM - 04:20 PM
A hybrid decision diagram approach for the job shop scheduling problem
We propose an optimization framework that integrates mixed-integer programming (MIP) and multivalued decision diagrams (MDDs). An MDD representation of the problem identifies parts of the search space that can be explored efficiently by MIP technology, while the MIP results are used iteratively to refine the MDD.
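To fix ideas, here is a minimal sketch of the MDD side alone (the MIP integration and iterative refinement are beyond a toy example): an exact MDD for sequencing three jobs on one machine to minimize total completion time. Each layer fixes the next job, a node's state is the set of jobs already scheduled, and a shortest path through the diagram is an optimal sequence. The job set and durations are hypothetical.

```python
# Exact decision-diagram sketch for a one-machine sequencing toy.
durations = {"A": 2, "B": 1, "C": 3}    # hypothetical processing times

def solve_by_mdd(durations):
    jobs = frozenset(durations)
    # Nodes keyed by (layer, state); value = best cost to reach the node.
    best = {(0, frozenset()): 0}
    parent = {}
    for layer in range(len(jobs)):
        for (l, state), cost in list(best.items()):
            if l != layer:
                continue
            elapsed = sum(durations[j] for j in state)
            for j in jobs - state:
                node = (layer + 1, state | {j})
                new_cost = cost + elapsed + durations[j]  # completion time of j
                if new_cost < best.get(node, float("inf")):
                    best[node] = new_cost
                    parent[node] = ((l, state), j)
    terminal = (len(jobs), jobs)
    # Recover an optimal sequence by walking parents back from the terminal.
    seq, node = [], terminal
    while node in parent:
        node, j = parent[node]
        seq.append(j)
    return best[terminal], seq[::-1]

opt, seq = solve_by_mdd(durations)
print(opt, seq)
```

In the hybrid framework of the abstract, such a diagram (typically relaxed by merging states to keep it small) would delimit which parts of the search space the MIP explores, and MIP solutions would in turn refine the diagram.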
04:20 PM - 04:45 PM
Decompositions based on Decision Diagrams
This talk describes a new decomposition approach where small-sized decision diagrams exactly represent different portions of a discrete optimization problem, all of which are linked through special constraints. We discuss potential techniques to solve the underlying decomposition problem and show a number of applications of this method.
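The following toy sketch (an assumption about the setup, not the speaker's exact scheme) shows the decomposition idea in miniature: two portions of a problem are each summarized by the best cost their decision diagram achieves for every value of a shared variable x, and a linking constraint requires both portions to agree on x, so the decomposition is solved by minimizing the combined profile.

```python
def dd_profile(path_costs, domain):
    """Best cost per value of the shared variable, as would be computed
    by a shortest path in that portion's decision diagram."""
    return {x: min(c[x] for c in path_costs) for x in domain}

domain = [0, 1, 2]

# Hypothetical per-portion data: each inner dict maps x -> cost of one
# path family in that portion's diagram.
portion1 = [{0: 4, 1: 2, 2: 5}, {0: 3, 1: 6, 2: 1}]
portion2 = [{0: 1, 1: 5, 2: 4}]

p1 = dd_profile(portion1, domain)
p2 = dd_profile(portion2, domain)

# Linking constraint: both portions must take the same value of x.
best_x = min(domain, key=lambda x: p1[x] + p2[x])
best_cost = p1[best_x] + p2[best_x]
print(best_x, best_cost)
```

With more portions or richer linking constraints the coordination step becomes the "underlying decomposition problem" the abstract refers to, and is no longer a simple minimization over one shared value.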