A Derivative-Free Local Optimizer for Multi-Objective Problems

07/30/2021, 8:00 PM to 8:30 PM UTC
JuMP Track

Abstract:

In real-world applications, optimization problems often involve more than one objective. Additionally, some objectives can be computationally expensive to evaluate, with no gradient information available. I present a derivative-free local optimizer (written in Julia) aimed at such problems. It employs a trust-region strategy and local surrogate models (e.g., polynomials or radial basis function models) to save function evaluations.
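To illustrate the surrogate idea, here is a minimal sketch of fitting a radial basis function model in Julia. The cubic kernel, the linear polynomial tail, and the names `fit_rbf` and `φ` are assumptions made for this example, not the talk's actual implementation:

```julia
using LinearAlgebra

# Cubic radial basis function φ(r) = r³ (an illustrative kernel choice).
φ(r) = r^3

# Fit an RBF interpolant through sample sites X (columns) with values y.
# Solves the standard augmented linear system for the RBF weights λ and
# the linear-tail coefficients β, so the model interpolates all data.
function fit_rbf(X::Matrix{Float64}, y::Vector{Float64})
    n, m = size(X)                       # n variables, m sample sites
    Φ = [φ(norm(X[:, i] - X[:, j])) for i in 1:m, j in 1:m]
    P = [ones(m) X']                     # linear polynomial tail [1 x₁ … xₙ]
    A = [Φ P; P' zeros(n + 1, n + 1)]
    coeffs = A \ [y; zeros(n + 1)]
    λ, β = coeffs[1:m], coeffs[m+1:end]
    x -> sum(λ[j] * φ(norm(x - X[:, j])) for j in 1:m) + β[1] + dot(β[2:end], x)
end

# Example: a surrogate for an "expensive" function from only 5 evaluations.
f(x) = sum(x .^ 2) + sin(x[1])
X = rand(2, 5)                           # 5 sample sites in 2D
s = fit_rbf(X, [f(X[:, i]) for i in 1:5])
@show s(X[:, 1]) ≈ f(X[:, 1])            # interpolates the training data
```

Once fitted, a model like `s` can stand in for the expensive objective inside the trust region, so the true function is only evaluated at the sample sites.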

Description:

I will revisit the basic concepts of multi-objective optimization and introduce the notions of Pareto optimality and Pareto criticality. From these notions, the steepest descent direction for multi-objective problems (MOPs) is derived; used in conjunction with a trust-region strategy, it generates iterates that converge to first-order critical points. Besides the mathematical background, I will describe how the local surrogate models are constructed and how our implementation uses other available packages (JuMP, NLopt, DynamicPolynomials, etc.). Moreover, I will show the results of a few numerical experiments demonstrating the efficiency of the approach and talk a bit about how the local solver could be embedded in a global(ish) framework.
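To make the descent notion precise: one classical construction (due to Fliege and Svaiter; the talk may use a variant) obtains the multi-objective steepest descent direction from a small auxiliary problem over all objective gradients.

```latex
% Multi-objective steepest descent direction at x for objectives f_1, ..., f_k:
d(x) \in \operatorname*{arg\,min}_{d \in \mathbb{R}^n}
         \max_{1 \le i \le k} \nabla f_i(x)^{\top} d
         + \tfrac{1}{2} \lVert d \rVert^{2}
% x is Pareto critical if and only if the optimal value is 0, i.e. d(x) = 0.
```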
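This subproblem is a small quadratic program and can be posed with JuMP, one of the packages mentioned above. Below is a minimal sketch under stated assumptions: the epigraph reformulation, the solver choice (Ipopt), and the name `steepest_descent_direction` are illustrative, and in the derivative-free setting the gradients would come from the surrogate models rather than the true objectives.

```julia
using JuMP, Ipopt, LinearAlgebra

# Multi-objective steepest descent direction via the epigraph reformulation
#   min_{d,t}  t + ½‖d‖²   s.t.   gᵢᵀ d ≤ t  for every objective gradient gᵢ.
# In a derivative-free method the gᵢ would be gradients of the surrogate
# models. The solver choice (Ipopt) is an assumption of this example.
function steepest_descent_direction(grads::Vector{Vector{Float64}})
    n = length(first(grads))
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, d[1:n])
    @variable(model, t)
    @constraint(model, [i in eachindex(grads)], dot(grads[i], d) <= t)
    @objective(model, Min, t + 0.5 * sum(d .^ 2))
    optimize!(model)
    value.(d)          # d ≈ 0 exactly when the point is Pareto critical
end

# Two conflicting objectives, represented here only by their gradients:
d = steepest_descent_direction([[2.0, 0.0], [-1.0, 0.5]])
@show d
```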
