Abstract: The problem of optimization - maximizing or minimizing a real-valued objective function - is a major topic in numerical analysis with diverse applications. Most standard techniques for optimization rely on the availability of accurate estimates of the derivative of the objective. However, if the objective is computationally expensive or noisy, this is often not possible, and derivative-free optimization (DFO) techniques are necessary. In this talk, I will introduce a class of DFO algorithms for large-scale optimization based on iterative minimization within random subspaces. This technique improves on existing methods, achieving dimension-independent convergence rates. I will also outline a practical variant of this idea for solving nonlinear least-squares problems, which offers better scalability than existing software and strong performance.
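The core idea of the talk - derivative-free minimization restricted to random low-dimensional subspaces - can be illustrated with a simple sketch. The following is not the speaker's algorithm, just a minimal direct-search illustration of the general approach: at each iteration, draw a random orthonormal subspace basis, sample the objective along its directions, keep any improving point, and shrink the sampling radius when no progress is made. All function and parameter names here are illustrative.

```python
import numpy as np

def random_subspace_dfo(f, x0, subspace_dim=2, iters=100, step=0.5, seed=0):
    """Illustrative sketch of derivative-free search within random subspaces.

    At each iteration, a random low-dimensional subspace is drawn and the
    objective f is sampled along its basis directions; improving trial
    points are accepted, and the step size shrinks when none improve.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        # Random orthonormal basis for a subspace of dimension subspace_dim.
        A = rng.standard_normal((x.size, subspace_dim))
        Q, _ = np.linalg.qr(A)
        improved = False
        for j in range(subspace_dim):
            for s in (step, -step):
                trial = x + s * Q[:, j]
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5  # no progress in this subspace: shrink the radius
    return x, fx

# Example: minimize a simple quadratic in 10 dimensions, derivative-free.
x_best, f_best = random_subspace_dfo(lambda v: float(np.dot(v, v)), np.ones(10))
```

Because each iteration only works in a `subspace_dim`-dimensional subspace, the per-iteration cost of the search is independent of the ambient dimension, which is the scalability property the abstract refers to.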
This talk is a version of Lindon's recent Leslie Fox Prize talk; the prize was awarded to Lindon by the Institute of Mathematics and its Applications (IMA).
In person attendance is available in HN 1.33 for up to 52 people.
All attendees will be asked to check in using the CBR Covid-safe Check-In app or sign in on arrival.
Zoom attendance is also available.
To join this seminar via Zoom please click here.
If you would like to join the seminar online and are not currently affiliated with ANU, please contact Martin Helmer at email@example.com.