Software


Large-Scale Nonlinear Optimization

  • Our interior-point algorithm for large-scale nonlinear optimization with inexact step computations has been implemented in Ipopt and can be used along with the iterative linear system solver in Pardiso. To enable it, set the Ipopt option inexact_algorithm yes (see the sketch below). Please e-mail me with any questions, bug reports, comments, or suggestions; they would be greatly appreciated!
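
    A minimal usage sketch, assuming the third-party cyipopt Python interface and an Ipopt build that includes the inexact algorithm and the Pardiso iterative linear solver; the toy problem and all parameter values are hypothetical, and only the option inexact_algorithm yes comes from this page.

      # Toy equality-constrained NLP solved through cyipopt with the inexact
      # algorithm requested; requires a suitably compiled Ipopt with Pardiso.
      import numpy as np
      import cyipopt

      class ToyNLP:
          # minimize x0^2 + x1^2  subject to  x0 + x1 = 1
          def objective(self, x):
              return x[0] ** 2 + x[1] ** 2

          def gradient(self, x):
              return np.array([2.0 * x[0], 2.0 * x[1]])

          def constraints(self, x):
              return np.array([x[0] + x[1]])

          def jacobian(self, x):
              return np.array([1.0, 1.0])  # dense Jacobian of the single linear constraint

          def hessianstructure(self):
              return (np.array([0, 1]), np.array([0, 1]))  # diagonal entries only

          def hessian(self, x, lagrange, obj_factor):
              return obj_factor * np.array([2.0, 2.0])  # constraint is linear

      nlp = cyipopt.Problem(n=2, m=1, problem_obj=ToyNLP(),
                            lb=[-10.0, -10.0], ub=[10.0, 10.0],
                            cl=[1.0], cu=[1.0])
      nlp.add_option("linear_solver", "pardiso")  # iterative linear solver in Pardiso
      nlp.add_option("inexact_algorithm", "yes")  # option named above
      x_opt, info = nlp.solve(np.array([2.0, 0.0]))
      print(x_opt, info["obj_val"])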

Nonlinear Optimization Algorithms for Potentially Infeasible Problems

  • PIPAL 1.2 (zip): Penalty-Interior-Point Algorithm.
    Prototype code for smooth constrained optimization. The optimization problem is reformulated as a penalty-interior-point subproblem (with only equality constraints), to which a Newton method is applied. The slack variables are effectively eliminated during the solution process, and a line search is employed for global convergence. The penalty and interior-point parameters are updated by an adaptive strategy (a conservative strategy is also implemented as an alternative) to achieve rapid convergence to an optimal solution or, if no feasible solution can be found, to an infeasible stationary point. The code accepts AMPL input; see the illustrative sketch below. Note that this is only a prototype implementation. Please e-mail me with any bug reports, comments, or suggestions; they would be greatly appreciated!
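
    As a point of reference, here is a minimal sketch of one common penalty-interior-point reformulation of  min f(x) s.t. c(x) = 0: elastic slack variables r, s > 0 split the constraint residual, rho is the penalty parameter, mu is the interior-point (barrier) parameter, and only the equality c(x) - r + s = 0 remains as a constraint. PIPAL's actual subproblem, slack elimination, line search, and parameter updates differ in detail, and the toy problem data here are hypothetical.

      import numpy as np

      def f(x):                  # toy objective
          return 0.5 * np.dot(x, x)

      def grad_f(x):
          return x

      def c(x):                  # toy equality constraint c(x) = 0
          return np.array([x[0] + x[1] - 1.0])

      def merit(x, r, s, rho, mu):
          # rho * f(x) + 1^T (r + s) - mu * sum(log r + log s)
          return rho * f(x) + np.sum(r + s) - mu * np.sum(np.log(r) + np.log(s))

      def residual(x, r, s):
          # the only remaining (equality) constraint of the subproblem
          return c(x) - r + s

      def merit_gradient(x, r, s, rho, mu):
          # derivative blocks a Newton method would use
          return rho * grad_f(x), 1.0 - mu / r, 1.0 - mu / s

      x, r, s = np.array([2.0, 0.0]), np.array([1.0]), np.array([1.0])
      # driving rho toward 0 emphasizes feasibility (infeasible stationary points);
      # driving mu toward 0 tightens the barrier on the slacks
      for rho, mu in [(1.0, 1.0), (1.0, 0.1), (0.1, 0.1)]:
          print(rho, mu, merit(x, r, s, rho, mu), residual(x, r, s))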

Nonconvex, Nonsmooth Optimization

  • GRANSO (html): GRadient-based Algorithm for Nonsmooth Optimization.
  • SLQP-GS 1.2 (zip): Sequential Linear or Quadratic Programming with Gradient Sampling.
    Prototype code for nonconvex, nonsmooth constrained optimization. The search direction is computed by minimizing a local linear or quadratic model of the objective subject to a linearization of the constraints, and gradients of each problem function are sampled to keep the search direction computation effective in nonsmooth regions (see the sketch below). The user can choose between SLP-GS and SQP-GS modes and can tune various input parameters for each application. Code for a sample problem is provided to illustrate how other problems can be formulated and solved. Note that this is only a prototype implementation. Please e-mail me if you use the code or with any bug reports, comments, or suggestions; they would be greatly appreciated!
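
    A minimal sketch of the gradient-sampling ingredient for an unconstrained objective: the search direction is the negative of the minimum-norm element of the convex hull of gradients sampled near the current point. SLQP-GS itself solves LP/QP subproblems that also linearize the constraints; the objective, sampling radius, and solver choice below are hypothetical.

      import numpy as np
      from scipy.optimize import minimize

      def grad_f(x):
          # (sub)gradient of the nonsmooth toy objective f(x) = |x[0]| + 0.5 * x[1]**2
          return np.array([np.sign(x[0]) if x[0] != 0.0 else 1.0, x[1]])

      def gs_direction(x, radius=1e-4, num_samples=8, rng=np.random.default_rng(0)):
          # gradients at x and at points sampled in a ball of the given radius
          pts = [x] + [x + radius * rng.uniform(-1.0, 1.0, size=x.size)
                       for _ in range(num_samples)]
          G = np.array([grad_f(p) for p in pts])        # (m, n) sampled gradients
          m = G.shape[0]
          # convex-combination weights w minimizing ||G^T w||^2
          obj = lambda w: float(w @ (G @ G.T) @ w)
          cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
          res = minimize(obj, np.full(m, 1.0 / m), bounds=[(0.0, 1.0)] * m,
                         constraints=cons, method="SLSQP")
          g_star = G.T @ res.x                          # minimum-norm hull element
          return -g_star                                # search direction

      x = np.array([0.05, 1.0])
      print(gs_direction(x))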

Stochastic Optimization

  • scBFGS (zip): Self-Correcting BFGS Algorithm for Stochastic Optimization.
    Prototype code for stochastic optimization. The code offers various algorithmic options, each designed to exploit the self-correcting properties of BFGS-type updating (see the sketch below). A sample logistic regression minimization problem is provided to illustrate how other problems can be formulated and solved with the code. Note that this is only a prototype implementation. Please e-mail me if you use the code or with any bug reports, comments, or suggestions; they would be greatly appreciated!
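
    A minimal sketch of the underlying idea, not scBFGS itself: stochastic-gradient steps on a toy logistic regression problem with BFGS-style inverse-Hessian updating, where an update is simply skipped unless a crude curvature condition holds. scBFGS's self-correcting safeguards and algorithmic options are more elaborate; all data, batch sizes, and step sizes here are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      n_samples, n_features = 1000, 10
      A = rng.standard_normal((n_samples, n_features))
      w_true = rng.standard_normal(n_features)
      y = np.where(A @ w_true + 0.1 * rng.standard_normal(n_samples) > 0.0, 1.0, -1.0)

      def stochastic_grad(w, batch):
          # gradient of the average logistic loss log(1 + exp(-y_i a_i^T w)) over a batch
          z = y[batch] * (A[batch] @ w)
          coef = -y[batch] / (1.0 + np.exp(z))
          return A[batch].T @ coef / batch.size

      w = np.zeros(n_features)
      H = np.eye(n_features)                    # inverse-Hessian approximation
      alpha, eps = 0.1, 1e-4
      for k in range(200):
          batch = rng.choice(n_samples, size=64, replace=False)
          g = stochastic_grad(w, batch)
          d = -H @ g                            # quasi-Newton direction
          s = alpha * d
          w_new = w + s
          # curvature pair measured on the same batch to reduce noise in y
          y_vec = stochastic_grad(w_new, batch) - g
          if s @ y_vec >= eps * (s @ s):        # crude curvature guard; scBFGS uses
              rho = 1.0 / (s @ y_vec)           # self-correcting conditions instead
              V = np.eye(n_features) - rho * np.outer(s, y_vec)
              H = V @ H @ V.T + rho * np.outer(s, s)
          w = w_new
      print("full gradient norm:", np.linalg.norm(stochastic_grad(w, np.arange(n_samples))))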

Optimization Benchmarking

  • betaRMP (html): beta-Relative Minimization Profiles.
    Benchmarking visualization tool for creating plots that concisely compare optimization methods evaluated on large heterogeneous sets of test problems; written by Tim Mitchell.
  • PSARNOT (html): (Pseudo)Spectral Abscissa|Radius Nonsmooth Optimization Test.
    A test set for evaluating methods for nonsmooth optimization; written by Tim Mitchell.

Primal-Dual Active-Set Methods for Convex Quadratic Optimization

  • pypdas (html).
    Python software for a primal-dual active-set method for solving general convex quadratic optimization problems; written by Zheng Han. (A minimal sketch of a basic primal-dual active-set iteration appears after this list.)
  • ipdas (html).
    Python software for a primal-dual active-set method with inexact subproblem solves, for certain convex quadratic optimization problems that typically arise in optimal control; written by Zheng Han.
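
    Since both codes implement primal-dual active-set (PDAS) methods, the following is a minimal sketch of a classical PDAS iteration for the bound-constrained convex QP  min 0.5 x^T Q x + q^T x  s.t.  x >= l: estimate which bounds are active from the current primal-dual pair, fix those variables at their bounds, solve the reduced linear system, and repeat until the active set settles. This is not the pypdas/ipdas implementation, and the problem data below are hypothetical.

      import numpy as np

      def pdas_lower_bound_qp(Q, q, l, max_iter=50):
          n = q.size
          x, z = np.array(l, dtype=float), np.zeros(n)  # primal iterate and bound multipliers
          active = np.zeros(n, dtype=bool)
          for it in range(max_iter):
              new_active = (z - (x - l)) > 0.0          # active-set estimate (scaling constant c = 1)
              if it > 0 and np.array_equal(new_active, active):
                  break                                 # active set settled: KKT point found
              active, inact = new_active, ~new_active
              x, z = np.empty(n), np.zeros(n)
              x[active] = l[active]                     # fix active variables at their bounds
              if inact.any():                           # reduced system for the inactive block
                  rhs = -(q[inact] + Q[np.ix_(inact, active)] @ l[active])
                  x[inact] = np.linalg.solve(Q[np.ix_(inact, inact)], rhs)
              z[active] = (Q @ x + q)[active]           # multipliers from stationarity Qx + q - z = 0
          return x, z

      Q = np.array([[4.0, 1.0], [1.0, 3.0]])
      q = np.array([1.0, -2.0])
      l = np.array([0.0, 0.0])
      x, z = pdas_lower_bound_qp(Q, q, l)
      print("x =", x, "z =", z)   # expect x = [0, 2/3], z = [5/3, 0]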