
Tuesday, October 04, 2016

Thesis: Faster algorithms for convex and combinatorial optimization by Yin Tat Lee.

Congratulations, Dr. Lee!

Here is some amazing work: Faster algorithms for convex and combinatorial optimization by Yin Tat Lee.
In this thesis, we revisit three algorithmic techniques: sparsification, cutting and collapsing. We use them to obtain the following results on convex and combinatorial optimization: 
  • Linear Programming: We obtain the first improvement to the running time for linear programming in 25 years. The convergence rate of this randomized algorithm nearly matches the universal barrier for interior point methods. As a corollary, we obtain the first ... time randomized algorithm for solving the maximum flow problem on directed graphs with m edges and n vertices. This improves upon the previous fastest running time, achieved over 15 years ago by Goldberg and Rao (a toy sketch of the LP-to-max-flow connection follows this list). 
  • Maximum Flow Problem: We obtain one of the first almost-linear time randomized algorithms for approximating the maximum flow in undirected graphs. As a corollary, we improve the running time of a wide range of algorithms that use the computation of maximum flows as a subroutine. 
  • Non-Smooth Convex Optimization: We obtain the first nearly-cubic time randomized algorithm for solving convex problems under the black box model. As a corollary, this yields polynomially faster algorithms for three fundamental problems in computer science: submodular function minimization, matroid intersection, and semidefinite programming.
  • Graph Sparsification: We obtain the first almost-linear time randomized algorithm for spectrally approximating any graph by one with just a linear number of edges. This sparse graph approximately preserves all cut values of the original graph and is useful for solving a wide range of combinatorial problems. This algorithm improves upon all previous linear-sized constructions, which required at least quadratic time (a toy sampling sketch follows this list). 
  • Numerical Linear Algebra: Multigrid is an efficient method for solving the large-scale linear systems that arise from graphs in low dimensions, and it has been used extensively in scientific computing for 30 years. Unlike previous approaches, which make assumptions about the graph, we give the first generalization of multigrid that provably solves the Laplacian system of any graph in nearly-linear expected time (a toy Laplacian-solve sketch follows this list).
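
For a concrete feel of the first bullet's corollary, here is a toy sketch of the reduction it rests on: maximum flow written as a linear program, so that a faster LP solver immediately gives a max-flow algorithm. The small directed graph, the capacities and the use of scipy's generic linprog routine are illustrative choices, not the interior point method developed in the thesis.

import numpy as np
from scipy.optimize import linprog

# Toy directed graph: edge list (u, v, capacity); source 0, sink 3.
edges = [(0, 1, 3.0), (0, 2, 2.0), (1, 2, 1.0), (1, 3, 2.0), (2, 3, 3.0)]
n, m = 4, len(edges)
s, t = 0, 3

# One variable per edge: 0 <= f_e <= capacity_e.
# Flow conservation at every vertex other than source and sink: A_eq f = 0.
interior = [v for v in range(n) if v not in (s, t)]
row = {v: i for i, v in enumerate(interior)}
A_eq = np.zeros((len(interior), m))
for e, (u, v, _) in enumerate(edges):
    if u in row:
        A_eq[row[u], e] -= 1.0   # flow leaving u
    if v in row:
        A_eq[row[v], e] += 1.0   # flow entering v
b_eq = np.zeros(len(interior))

# Objective: maximize net flow out of the source (linprog minimizes, so negate).
c = np.zeros(m)
for e, (u, v, _) in enumerate(edges):
    if u == s:
        c[e] -= 1.0
    if v == s:
        c[e] += 1.0

bounds = [(0.0, cap) for _, _, cap in edges]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("max flow value:", -res.fun)   # 5.0 on this toy graph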
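
For the Graph Sparsification bullet, a minimal sketch of the underlying notion: sample edges with probability proportional to their effective resistance (in the spirit of Spielman and Srivastava) and reweight them, so that the Laplacian quadratic form, and hence the cut values, of the smaller graph approximate those of the original. The naive pseudoinverse computation and the function names below are simplifications for small graphs; this is not the almost-linear time construction of the thesis.

import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian L = D - A for an edge list of (u, v, w) triples."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

def sparsify(n, edges, num_samples, seed=0):
    """Sample edges with probability ~ w_e * R_e (their leverage scores) and
    reweight so that the sparsifier's Laplacian is unbiased."""
    rng = np.random.default_rng(seed)
    Lpinv = np.linalg.pinv(laplacian(n, edges))      # naive; small graphs only
    # Effective resistance of edge (u, v): (e_u - e_v)^T L^+ (e_u - e_v).
    reff = np.array([Lpinv[u, u] + Lpinv[v, v] - 2.0 * Lpinv[u, v]
                     for u, v, _ in edges])
    w = np.array([we for _, _, we in edges])
    p = w * reff
    p /= p.sum()
    counts = rng.multinomial(num_samples, p)
    return [(edges[i][0], edges[i][1], edges[i][2] * c / (num_samples * p[i]))
            for i, c in enumerate(counts) if c > 0]

# Tiny demo: sparsify a dense random graph and compare quadratic forms x^T L x.
n = 40
rng = np.random.default_rng(0)
edges = [(i, j, 1.0) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.5]
sparse = sparsify(n, edges, num_samples=6 * n)
x = rng.standard_normal(n)
print(len(edges), "edges ->", len(sparse), "edges")
print(x @ laplacian(n, edges) @ x, "vs", x @ laplacian(n, sparse) @ x)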
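
Finally, for the Numerical Linear Algebra bullet, a minimal sketch of what "solving a Laplacian system" means: plain conjugate gradient on L x = b for the Laplacian of a path graph, with b centered so that it lies in the range of the singular matrix L. This is just the textbook iterative method, with none of the multigrid-style guarantees established in the thesis; the graph and the function name are made up for illustration.

import numpy as np

def solve_laplacian_cg(L, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for a symmetric positive semidefinite L."""
    x = np.zeros_like(b)
    r = b - L @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Lp = L @ p
        alpha = rs / (p @ Lp)
        x += alpha * p
        r -= alpha * Lp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Demo on a path graph: L is tridiagonal; b must be orthogonal to the all-ones
# vector (the nullspace of L) for a solution to exist.
n = 100
L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1.0
b = np.random.default_rng(1).standard_normal(n)
b -= b.mean()
x = solve_laplacian_cg(L, b)
print("residual norm:", np.linalg.norm(L @ x - b))
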
Yin Tat's publications are here.



 
