Introduction to optimal control and Hamilton-Jacobi equations
(Summer school in PDE and applications 2024)
Khai T. Nguyen
NCSU Mathematics Department, USA
Course Objectives/Goals: The goal of this course is to provide an introduction to optimal control and the basic theory of viscosity solutions for first-order Hamilton–Jacobi equations.
Several examples will be given in order to motivate and introduce the main problems, and to illustrate and explain the critical parts of the proofs.
Specific goals include:
- Elementary introduction to the basic problems in dynamic optimization, in both finite and infinite time horizons
- Necessary and sufficient conditions in optimal control
- Viscosity solutions: definition, existence, stability properties, and comparison principle
Student Learning Outcomes: By the end of this course, students are expected to know the standard problems in optimal control, the Hopf–Lax formula, both the Dynamic Programming Principle and the Pontryagin maximum principle, and to understand the concept, basic ideas, and theory of viscosity solutions for first-order Hamilton–Jacobi equations.
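For orientation, the Hopf–Lax formula mentioned above gives an explicit solution formula in a model case: for the Cauchy problem with a convex, superlinear Hamiltonian H depending only on the gradient, and Lipschitz initial data g, the value function can be written as

```latex
% Model problem:
%   u_t + H(Du) = 0   in  \mathbb{R}^n \times (0,\infty),
%   u(x,0) = g(x)     on  \mathbb{R}^n.
% Hopf–Lax formula (H convex, superlinear; g Lipschitz):
u(x,t) \;=\; \min_{y \in \mathbb{R}^n}
  \left\{\, t\, L\!\left(\frac{x-y}{t}\right) + g(y) \,\right\},
\qquad
L(q) \;=\; \sup_{p \in \mathbb{R}^n} \big( p \cdot q - H(p) \big),
```

where L is the Legendre transform of H. This formula will be derived and studied in Part I of the course.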
Course outline:
I. Introduction
– Ordinary differential equations and control dynamics
– Standard optimal control problems
– Existence of optimal open-loop control
– Hopf–Lax formula
II. Necessary and sufficient conditions
– Bang-Bang principle
– The Pontryagin maximum principle
– Dynamic programming principle and HJ equations
– Recovering the optimal control problem from the value function
III. Viscosity solutions
– The method of characteristics
– Generalized differentials
– Stability properties
– Comparison principle
– Perron’s method
Recommended notes or books:
- Introduction to the mathematical theory of control, Alberto Bressan
- Lecture notes on viscosity solutions and optimal control problems, Khai T. Nguyen
- Hamilton–Jacobi Equations: Theory and Applications, Hung V. Tran