Beyond Minimax Optimality: A Subgame Perfect Gradient Method

Speaker
Dr. Ben Grimmer, Department of Applied Mathematics and Statistics, Johns Hopkins University
Date
Location
University of Houston
Abstract

This talk takes up the task of designing the provably best possible gradient method for smooth convex optimization. Methods with big-O optimal worst-case guarantees were (famously) discovered in the 1980s by Nesterov. Methods with exactly minimax optimal worst-case guarantees were developed in the last decade. We will present a “subgame perfect” method that is not only optimal against a worst-case problem instance but also optimally leverages all gradient information revealed at each step. This corresponds to being dynamically minimax optimal, or in game theory terms, provides us with a subgame perfect strategy for optimization. An adaptive version of our subgame perfect gradient method will be presented, achieving state-of-the-art performance that rivals BFGS.

Biography

Ben Grimmer is an assistant professor of applied mathematics and statistics at Johns Hopkins University, supported by AFOSR and as a Sloan Fellow. His work primarily focuses on the design and analysis of novel first-order methods, and has recently received best paper prizes, including the most recent INFORMS Optimization Society Young Researchers Prize. Some of his recent computer-assisted works have received substantial interest, being featured in popular mathematics venues such as Quanta.