WF-04: Constrained optimization methods and solvers in Julia
Stream: Constrained optimization methods and solvers in Julia
Chair(s): Miguel F. Anjos
Krylov methods in interior-point algorithms: a computational survey
Mathieu Tanneau, Alexis Montoison
Following growing interest in the community, we present a computational survey of the use of Krylov methods for solving the linear systems that arise in interior-point algorithms. First, we review the various linear systems that can be formulated from the so-called Newton system, and establish a nomenclature matching Krylov methods to the linear system formulations they are suited to. Then, we review a number of generic preconditioners from both mathematical and practical perspectives. Finally, we compare the numerical behavior of each approach through extensive computational experiments.
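One of the linear systems mentioned above, the normal-equations form A D Aᵀ Δy = r, is symmetric positive definite and can be attacked with a textbook conjugate-gradient loop. The sketch below is illustrative only (the matrix, the scaling D, and the function name `cg` are assumptions, not material from the talk):

```julia
using LinearAlgebra

# Hedged sketch: a textbook conjugate-gradient iteration applied to the
# normal-equations system A*D*A' * dy = r that arises at each
# interior-point iteration. All data below is illustrative.
function cg(K, b; tol = 1e-10, maxiter = 100)
    x = zeros(length(b))
    r = copy(b)              # residual b - K*x with x = 0
    p = copy(r)              # search direction
    rs = dot(r, r)
    for _ in 1:maxiter
        Kp = K * p
        α = rs / dot(p, Kp)
        x .+= α .* p
        r .-= α .* Kp
        rs_new = dot(r, r)
        sqrt(rs_new) < tol && break
        p .= r .+ (rs_new / rs) .* p
        rs = rs_new
    end
    return x
end

A = [1.0 2.0 0.0; 0.0 1.0 3.0]   # constraint matrix (full row rank)
D = Diagonal([0.5, 1.0, 2.0])    # positive scaling from the current iterate
K = A * D * A'                   # normal-equations matrix (SPD)
r = [1.0, -1.0]
dy = cg(K, r)                    # Newton step for the dual variables
```

In practice the survey's point is precisely that one would use a mature Krylov implementation with a preconditioner rather than this bare loop, but the structure of the iteration is the same.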
Hypatia.jl: A Generic Nonsymmetric Conic Optimization Solver in Julia
Chris Coey, Lea Kapelevich, Juan Pablo Vielma
Hypatia is an open-source conic optimization solver written in Julia, accessible through a native interface and through JuMP/MathOptInterface. Hypatia makes it easy to model and solve primal-dual conic problems over general convex cones for which an appropriate primal or dual barrier function is known. We introduce Hypatia’s interfaces and algorithms and demonstrate the computational advantages of compact “natural” conic formulations over extended formulations that use only “classical” cones. We also describe some algorithmic advances that have helped make Hypatia competitive.
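The JuMP/MathOptInterface route might look like the following sketch, which assumes Hypatia and JuMP are installed; the problem data is illustrative and not taken from the talk:

```julia
# Hedged sketch: a small second-order-cone problem passed to Hypatia
# through JuMP, which communicates with the solver via MathOptInterface.
using JuMP, Hypatia

model = Model(Hypatia.Optimizer)
set_silent(model)
@variable(model, t)
@variable(model, x[1:2])
@constraint(model, x .== [3.0, 4.0])
@constraint(model, [t; x] in SecondOrderCone())  # enforces t ≥ ‖x‖₂
@objective(model, Min, t)
optimize!(model)
value(t)  # the optimum is ‖(3, 4)‖₂ = 5
```

Swapping `Hypatia.Optimizer` for another conic solver's optimizer is all that changes on the modeling side, which is the point of the MathOptInterface layer.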
Conditional gradient methods for large-scale constrained optimization
Mathieu Besançon, Sebastian Pokutta, Alejandro Carderera
Conditional gradient algorithms allow the integration of convex constraints into a first-order optimization method. We present a new Julia toolbox implementing several variants of the conditional gradient method and detail the design choices that enable large-scale and atypical applications. In particular, we will present the linear minimization oracle interface, which makes the library extensible and lets users exploit closed-form solutions of the linear minimization subproblems when these are known.
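The oracle idea can be illustrated with a self-contained, textbook conditional-gradient loop over the probability simplex, whose linear minimization subproblem has a closed-form solution (the vertex with the smallest gradient entry). This is a sketch of the algorithm, not the toolbox's API; all names are assumptions:

```julia
using LinearAlgebra

# Closed-form linear minimization oracle over the probability simplex:
# argmin over the simplex of g'v is the vertex e_i with i = argmin(g).
lmo_simplex(g) = (e = zeros(length(g)); e[argmin(g)] = 1.0; e)

# Textbook conditional-gradient (Frank-Wolfe) loop with the standard
# open-loop step size γ_t = 2 / (t + 2).
function conditional_gradient(grad, x0; iters = 5000)
    x = copy(x0)
    for t in 0:iters-1
        v = lmo_simplex(grad(x))     # solve the linear subproblem
        γ = 2.0 / (t + 2.0)
        x = (1 - γ) .* x .+ γ .* v   # convex combination stays feasible
    end
    return x
end

# Illustrative problem: minimize ‖x - c‖² over the simplex; since c is
# itself in the simplex, the minimizer is x* = c.
c = [0.2, 0.3, 0.5]
grad(x) = 2 .* (x .- c)
x = conditional_gradient(grad, fill(1/3, 3))
```

The library's extensibility comes from making `lmo_simplex` a pluggable component: any feasible set with a cheap linear minimization routine can be substituted without touching the main loop.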
DiffOpt.jl: differentiating your favorite optimization problems
Joaquim Dias Garcia, Mathieu Besançon, Benoît Legat, Akshay Sharma
DiffOpt aims at differentiating optimization problems written in MathOptInterface (MOI). Because MOI is JuMP’s lower-level interface to solvers, this will “just work” with JuMP problems. The current framework builds on existing techniques for differentiating the solution of an optimization problem with respect to its input parameters. We will show the current state of the package, which supports quadratic programs and conic programs. Moreover, we will highlight how other packages are used to keep the library generic and efficient.
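The core technique for the quadratic-program case can be sketched in plain linear algebra: implicitly differentiate the KKT conditions of an equality-constrained QP, so that solution sensitivities come from solves with the same KKT matrix. This is an illustration of the underlying idea, not DiffOpt's API; all data is assumed:

```julia
using LinearAlgebra

# Illustrative QP:  min ½ x'Qx - p'x  s.t.  A x = b
Q = [2.0 0.0; 0.0 2.0]
A = [1.0 1.0]
p = [1.0, 0.0]
b = [1.0]

# Stationarity + feasibility give the KKT system K * [x; λ] = [p; b].
K = [Q A'; A zeros(1, 1)]
sol = K \ vcat(p, b)
x = sol[1:2]                     # primal solution (here [0.75, 0.25])

# Differentiating the KKT conditions w.r.t. p yields K * [dx; dλ] = [dp; 0],
# so the Jacobian dx/dp is obtained from solves with the same matrix K.
dxdp = (K \ vcat(Matrix(I, 2, 2), zeros(1, 2)))[1:2, :]
```

Conic programs require a different derivation (differentiating through the projection onto the cone), but the pattern of reusing the factorization from the solve is the same.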