I taught Riemannian optimization at EPFL in the spring of 2023 as course MATH-512. The recorded lectures are available below. The semester runs for 14 weeks. Most weeks have a lecture and an exercise session, each lasting 90–95 minutes (week 5 had two lectures). The students further learned the material by completing two extensive projects.
Video links bring you to an external hosting service. There, you can change the playback speed of videos (play faster, slower): see controls at the bottom-right of the video panel. You can also download the videos.
I gratefully acknowledge EPFL's CEDE (Matthew Goodman in particular) for video post-production!
Exercises and solutions prepared with Christopher Criscitiello and Timon Miehling in 2023.
Click a question to display a sketch of the answer if available. Click a sketch to display a detailed answer if available. If no sketch is available but a detailed answer is, then clicking the question displays the detailed answer.
To load exercises below, go to the Lectures tab, and click "Exercises" next to a week's header.
You may find the Manopt toolboxes helpful to use Riemannian optimization in your projects:
For example, in Matlab, to minimize the function $f(x) = x^\top A x$ on the sphere in $\mathbb{R}^n$, you can write:
n = 42;
A = randn(n);
A = .5*(A + A');              % random symmetric matrix
problem.M = spherefactory(n); % unit sphere in R^n
problem.cost  = @(x) x'*A*x;
problem.egrad = @(x) 2*A*x;   % Euclidean gradient; or use autodiff: problem = manoptAD(problem);
x = trustregions(problem);    % call an overly fancy algorithm
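A note on the `egrad` line: since $A$ is symmetric, the Euclidean gradient of $f(x) = x^\top A x$ is $\nabla f(x) = (A + A^\top)x = 2Ax$, which is what the code supplies; Manopt then converts this Euclidean gradient into the Riemannian gradient on the sphere automatically.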
In Python, you could use JAX (among other backends) and write:
import pymanopt
import jax

n = 42
key = jax.random.key(0)
A = jax.random.normal(key, [n, n])
A = A + A.T  # make A symmetric

manifold = pymanopt.manifolds.Sphere(n)  # unit sphere in R^n

@pymanopt.function.jax(manifold)
def cost(x):
    return x.T @ A @ x

problem = pymanopt.Problem(manifold, cost)
optimizer = pymanopt.optimizers.TrustRegions()
result = optimizer.run(problem)
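As a quick sanity check (a sketch, assuming the result object exposes `point` and `cost` attributes, as in recent pymanopt versions): the minimum of $x^\top A x$ over the unit sphere is the smallest eigenvalue of $A$, attained at a corresponding unit eigenvector, so you can compare against an eigenvalue decomposition:

import jax.numpy as jnp

eigenvalues, eigenvectors = jnp.linalg.eigh(A)  # eigenvalues in ascending order
print(result.cost - eigenvalues[0])             # should be close to zero
v = eigenvectors[:, 0]                          # eigenvector for the smallest eigenvalue
print(min(jnp.linalg.norm(result.point - v),
          jnp.linalg.norm(result.point + v)))   # small, up to a sign ambiguity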
In Julia, you might write:
using Manopt, Manifolds, LinearAlgebra
n = 42
A = Symmetric(randn(n, n))
M = Sphere(n - 1)  # unit sphere in R^n
f(E, x) = x'*A*x   # cost; with objective_type=:Euclidean, functions are defined in the embedding E
∇f(E, x) = 2*A*x   # Euclidean gradient
x = trust_regions(M, f, ∇f; objective_type=:Euclidean)
Contact: nicolas.boumal@epfl.ch.
For general questions about geometry and optimization, please use the Manopt forum, MathOverflow or Math StackExchange.
I gave a minitutorial on Riemannian optimization at the SIAM Conference on Optimization 2023, in the format of two 90-minute lectures. The slides used during the tutorial summarize the basic geometric tools and algorithms useful for optimization. You can also download the Manopt example codes for Max-Cut in Matlab. When/if videos become available, I will link to them here.
Here are an (older) one-hour video and a two-hour video covering some of these concepts. The former was a boot camp tutorial at the Simons Institute; the slides for that video are available at the same link. The two videos cover mostly the same content.
With high probability, you will enjoy the classic book Optimization Algorithms on Matrix Manifolds by Absil, Mahony and Sepulchre (Princeton University Press, 2008), also freely available online.
This book about Riemannian optimization by Nicolas Boumal was published by Cambridge University Press in 2023.
You can also download the pre-publication PDF.
This website further offers recorded lectures (videos + slides) and exercises, as a companion to the book.
Feel free to e-mail me about any mistakes you spot or suspect, be they typos or more serious issues. Reported mistakes are added as sticky notes in the PDF above (last updated on Sep. 15, 2023). I always appreciate your input.
Optimization on manifolds is the result of smooth geometry and optimization merging into one elegant modern framework. This text introduces the concepts of differential and Riemannian geometry needed to help students and researchers in applied mathematics, computer science and engineering gain a firm mathematical grounding, so that they can use these tools confidently in their research.
All definitions and theorems are motivated to build time-tested optimization algorithms. Starting from first principles, the text goes on to cover current research on topics including iteration complexity and geodesic convexity. Readers will appreciate the tricks of the trade sprinkled throughout the book, to guide research and numerical implementations.
This book has no prerequisites in geometry or optimization. Chapters 3 and 5 can serve as a standalone introduction to differential and Riemannian geometry, focused on embedded submanifolds of linear spaces, with proofs and an eye towards computability. The book builds from there to cover algorithms and equip the reader for modern research challenges.
Chapter 8 provides the general theory so that we can build quotient manifolds in Chapter 9. The optimization algorithms in Chapters 4 and 6 apply to the general case, but can already be understood after reading Chapters 3 and 5. Chapter 7 details examples of submanifolds that come up in practice. Chapter 10 covers more advanced Riemannian tools, and Chapter 11 introduces geodesic convexity.
As a course, this material is popular with applied mathematicians, computer scientists and mathematically inclined engineering students, at the graduate and advanced undergraduate levels.
In a one-semester graduate course of the mathematics department at Princeton University in 2019 and 2020 (24 lectures of 80 minutes each, two projects, no exercises), I covered much of (what became) Chapters 1–6 and select parts of Chapter 7 before the midterm break, then much of Chapters 8–9 and select parts of Chapters 10–11 after the break. Those chapters were shorter at the time, but it still made for a sustained pace.
At EPFL in 2021, I discussed mostly Chapters 1–8 in 13 lectures of 90 minutes, plus as many exercise sessions and two projects. In 2023, the lectures were recorded: see Lectures tab above.
If you (will) teach this topic, feel free to e-mail me: I can share more resources.