A Semismooth Newton Method for Fast, Generic Convex Programming by Alnur Ali, Eric Wong, J. Zico Kolter
Due to their generality, conic optimization problems, which can represent most convex optimization problems encountered in practice, have been the focus of much recent work, and additionally form the basis of many convex optimization modeling frameworks. In this paper, we introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by SCS, a state-of-the-art iterative conic solver, as a fixed-point iteration, and then use a nonsmooth Newton method to find a fixed point. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility when appropriate, and can be specialized to solve specific convex problems.
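To make the "nonsmooth Newton on a fixed-point residual" idea concrete, here is a minimal, self-contained sketch on a toy problem. This is not the paper's Newton-ADMM (which applies the idea to the iterate map of SCS); it simply finds a fixed point of the nonsmooth map F(x) = max(0, 1 − x) by Newton steps on the residual R(x) = x − F(x), using an element of the Clarke generalized Jacobian in place of the (nonexistent) classical derivative. The maps F, residual, and gen_jacobian are illustrative choices, not anything from the paper.

```python
import numpy as np

def F(x):
    # Toy nonsmooth fixed-point map, standing in for one solver sweep.
    return np.maximum(0.0, 1.0 - x)

def residual(x):
    # A fixed point of F is a zero of this residual.
    return x - F(x)

def gen_jacobian(x):
    # An element of the Clarke generalized Jacobian of the residual:
    # d/dx max(0, 1 - x) = -1 where 1 - x > 0, and 0 elsewhere.
    d = np.where(1.0 - x > 0.0, -1.0, 0.0)
    return np.eye(x.size) - np.diag(d)

def semismooth_newton(x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        # Newton step with a generalized Jacobian element.
        x = x - np.linalg.solve(gen_jacobian(x), r)
    return x

x_star = semismooth_newton(np.array([5.0]))
print(x_star)  # → [0.5], the unique fixed point x = max(0, 1 - x)
```

Starting from x = 5, the method lands on the fixed point x = 0.5 in two steps; the rapid local convergence of such generalized-Jacobian Newton steps is the behavior the paper establishes for Newton-ADMM on conic problems.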