
Optimization, gradient flow, and their asymptotic analysis in geometric problems

MSI Colloquium, where the school comes together for afternoon tea before one speaker gives an accessible talk on their subject

Date & time
13 Jun 2024, 4:00pm – 5:00pm
Speaker

Beomjun Choi (Pohang University of Science and Technology)


Description

Abstract

Finding minimizers of a given function has always been a fundamental task in mathematics. One successful method, from both practical and theoretical perspectives, is the gradient flow. A gradient flow is simply the trajectory of a point moving in the direction of the negative gradient of a given function, so that the function decreases as efficiently as possible.
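Concretely (a standard formulation, not taken from the abstract itself), for a differentiable function $f$ on $\mathbb{R}^n$, the gradient flow starting at $x_0$ is the solution of

\[
\dot{x}(t) = -\nabla f\big(x(t)\big), \qquad x(0) = x_0,
\]

along which $\tfrac{d}{dt} f(x(t)) = -\lvert \nabla f(x(t)) \rvert^2 \le 0$, so $f$ is non-increasing along the trajectory.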

In this talk, we will discuss the convergence of gradient flow, an important subject in optimization (including the theory of machine learning) and the calculus of variations, which studies the optimization of functions, shapes, and spaces. In the 1960s, Stanisław Łojasiewicz made a significant contribution by introducing gradient inequalities from real algebraic geometry, and in the 1980s, Leon Simon extended Łojasiewicz’s theory to infinite-dimensional problems in the calculus of variations. After an overview of these developments, we will discuss the gradient conjecture of René Thom concerning the next-order asymptotics of convergence and our resolution of this conjecture for infinite-dimensional problems. This is joint work with Pei-Ken Hung at UIUC.
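For reference, a standard form of the Łojasiewicz gradient inequality (as it is usually stated; the precise setting treated in the talk may differ) reads: if $f$ is real analytic near a critical point $p$, then there exist $C > 0$ and $\theta \in [1/2, 1)$ such that, for all $x$ near $p$,

\[
\lvert \nabla f(x) \rvert \ge C\, \lvert f(x) - f(p) \rvert^{\theta}.
\]

This inequality rules out a bounded trajectory oscillating forever near its critical set and forces convergence to a single limit point. Thom's gradient conjecture concerns the next order of this convergence: if a trajectory $x(t)$ of the gradient flow of an analytic function converges to $p$, the normalized secants $(x(t) - p)/\lvert x(t) - p \rvert$ should also converge, so the trajectory approaches $p$ along a well-defined limiting direction.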

Location

Room 1.33 (Seminar Room), Building 145, Science Road, ANU
