7–11 Apr 2025
Lecture and Conference Centre
Europe/Warsaw timezone

Neural network function approximation for solving parametric optimization problem via optimality condition penalties

10 Apr 2025, 18:10
20m
Room 9

Speakers

Matthias Hoffmann, Kathrin Flaßkamp

Description

Numerical optimization has long been a cornerstone of engineering disciplines, underpinning areas such as optimal control and design optimization. The multi-objective nature of design optimization problems motivates computing entire Pareto fronts of optimal compromises; with suitable scalarization techniques, this can be formulated as solving a family of parametric optimization problems. Model predictive control likewise requires iteratively solving optimal control problems, which are related to one another through a parametrization of the initial state. Here, numerical efficiency is of great interest for achieving real-time capability. In this work, we introduce Optimality-Informed Neural Networks (OptINNs), a combination of classical optimization and machine learning approaches. Drawing inspiration from Physics-Informed Neural Networks, OptINNs directly integrate domain-specific knowledge, here the optimality of solutions, into their architecture and training process. This addresses the common data-dependency bottleneck of neural networks by providing an objective performance metric that does not rely solely on validation datasets. We propose Karush-Kuhn-Tucker (KKT)-type OptINNs designed specifically for parametric optimization problems. To validate our approach, we apply KKT-type OptINNs to optimization challenges ranging from simple linearly constrained problems to complex nonlinear optimal control scenarios. Our results highlight the effectiveness of OptINNs in enhancing optimization performance while reducing data requirements.
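To illustrate the kind of KKT-based penalty the abstract alludes to, the sketch below evaluates the squared KKT residual for a hypothetical toy parametric problem, min_x (x − p)² subject to x ≥ 0. This is not the authors' formulation; the problem, the penalty weighting, and all names are illustrative assumptions. In an OptINN-style setup, a residual like this could serve as a training loss for a network mapping the parameter p to a candidate solution (x, μ), with no labeled optimal solutions required.

```python
import numpy as np

def kkt_penalty(p, x, mu):
    # Toy parametric QP: min_x (x - p)^2  s.t.  x >= 0,
    # written with constraint g(x) = -x <= 0 and multiplier mu.
    # KKT conditions:
    #   stationarity:       2*(x - p) - mu = 0
    #   primal feasibility: x >= 0   (violation: max(-x, 0))
    #   dual feasibility:   mu >= 0  (violation: max(-mu, 0))
    #   complementarity:    mu * x = 0
    stat = 2.0 * (x - p) - mu
    primal = np.maximum(-x, 0.0)
    dual = np.maximum(-mu, 0.0)
    comp = mu * x
    # Sum of squared residuals; zero iff (x, mu) satisfies the KKT system.
    return stat**2 + primal**2 + dual**2 + comp**2

# Closed-form optimum of the toy problem: x*(p) = max(p, 0), mu*(p) = 2*max(-p, 0).
p = np.linspace(-2.0, 2.0, 9)
x_star = np.maximum(p, 0.0)
mu_star = 2.0 * np.maximum(-p, 0.0)

print(kkt_penalty(p, x_star, mu_star).max())                      # ~0: optima incur no penalty
print(kkt_penalty(p, np.zeros_like(p), np.zeros_like(p)).max())   # positive away from the optimum
```

Because the penalty vanishes exactly at KKT points, it doubles as the objective performance metric mentioned in the abstract: a trained network's outputs can be scored on fresh parameter values without a validation set of precomputed solutions.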
