Duke Computer Science Colloquium

New Algorithms for Convex Optimization and Sparsification

Thursday, March 27
Speaker: Arun Jambulapati

Lunch

Lunch will be served at 11:45 AM.

Abstract 

The computational challenges posed by modern massive datasets have driven the need for more efficient algorithms. My research addresses problems inspired by contemporary machine learning and develops theoretical tools to answer two central questions: 1) how can we learn from data faster, and 2) how can we select representative subsets of data (to potentially speed up downstream processing)?

I will first present several results on faster convex optimization that leverage a new theoretical primitive called a “ball optimization oracle”. I will give near-optimal algorithms for minimizing convex functions given access to this oracle, and show how this framework yields faster algorithms for important problems in computer science such as regression, differentially private optimization, and robust optimization.
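To make the oracle abstraction concrete, here is a minimal Python sketch (not the speaker's near-optimal algorithms): a toy "ball optimization oracle" that approximately minimizes a function over a small ball, and a naive outer loop that chains oracle calls. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def ball_oracle(grad, center, radius, lr, steps=200):
    """Toy ball optimization oracle: approximately minimize a convex f over
    the Euclidean ball of radius `radius` around `center`, via projected
    gradient descent (any local solver could play this role)."""
    x = center.copy()
    for _ in range(steps):
        x = x - lr * grad(x)
        offset = x - center
        dist = np.linalg.norm(offset)
        if dist > radius:  # project back onto the ball around `center`
            x = center + offset * (radius / dist)
    return x

def minimize_with_ball_oracle(grad, x0, radius, lr, iters=50):
    """Naive outer loop: repeatedly jump to the approximate minimizer of f
    over a small ball around the current iterate. The framework in the talk
    achieves near-optimal rates by combining such calls with acceleration."""
    x = x0
    for _ in range(iters):
        x = ball_oracle(grad, x, radius, lr)
    return x

# Example: least-squares regression, f(x) = 0.5 * ||Ax - b||^2.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(30, 5)), rng.normal(size=30)
grad = lambda x: A.T @ (A @ x - b)
lr = 1.0 / np.linalg.norm(A, ord=2) ** 2  # step size from smoothness of f
x_hat = minimize_with_ball_oracle(grad, np.zeros(5), radius=0.5, lr=lr)
print(0.5 * np.sum((A @ x_hat - b) ** 2))  # near the least-squares optimum
```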
 
I will follow with results on problems motivated by dataset compression. Leveraging techniques from high-dimensional geometry, I will give near-optimal constructions of sparsifiers for sums of norms and for generalized linear models. These directly imply new sparsifiers for hypergraphs and for sums of symmetric submodular functions, and yield faster algorithms for p-norm regression.
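As a rough illustration of the sparsification idea (again, not the talk's construction, which needs more careful scores such as Lewis weights to get uniform guarantees over all x), here is a hypothetical importance-sampling sketch: keep a small weighted subsample of the terms of a sum of norms so that the sparsified sum is an unbiased estimator of the full one.

```python
import numpy as np

def sparsify_sum_of_norms(A, m, rng):
    """Keep m importance-sampled terms of F(x) = sum_i |<a_i, x>|, sampling
    row i with probability proportional to ||a_i|| and reweighting so the
    sparsified sum is unbiased. (Illustrative only; near-optimal sparsifiers
    use sharper sampling scores.)"""
    norms = np.linalg.norm(A, axis=1)
    p = norms / norms.sum()
    idx = rng.choice(A.shape[0], size=m, p=p)
    w = 1.0 / (m * p[idx])
    return A[idx], w

def F(A, x, w=None):
    """Evaluate the (optionally weighted) sum of norms at x."""
    terms = np.abs(A @ x)
    return terms.sum() if w is None else (w * terms).sum()

rng = np.random.default_rng(1)
A = rng.normal(size=(10000, 20))   # 10,000 terms in dimension 20
x = rng.normal(size=20)
A_s, w = sparsify_sum_of_norms(A, 500, rng)
print(F(A, x), F(A_s, x, w))       # the 500-term sum approximates the full one
```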

Speaker Bio

Arun Jambulapati is a visiting researcher at the University of Texas at Austin, working with Kevin Tian. Previously, he was a postdoc at the University of Michigan, a Simons Research Fellow, and a postdoc at the University of Washington; he completed his Ph.D. at Stanford in 2022. His primary research interests are the design of fast algorithms and sparsification. His work has been supported by an NSF Graduate Research Fellowship and a Stanford Graduate Research Fellowship.

Contact

Pankaj Agarwal