In classical Greek mythology, the Minotaur was a ferocious monster that dwelt in a giant maze and devoured humans as an annual tribute. The Athenian hero Theseus successfully saved his countrymen from that gruesome fate.
A team of researchers at Argonne National Laboratory has similarly saved their colleagues from a monster – mixed-integer nonlinear programming (MINLP) problems. These problems arise in optimal decision-making applications across science, engineering, and finance, and they combine the combinatorial difficulty of optimizing over discrete variable sets with the challenges of handling nonlinear functions.
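In its generic textbook form (not specific to Minotaur), an MINLP asks for the best values of a mix of integer and continuous variables subject to nonlinear constraints:

```latex
\min_{x,\,y} \; f(x, y)
\quad \text{subject to} \quad
c(x, y) \le 0, \quad
x \in \mathbb{Z}^{n}, \quad
y \in \mathbb{R}^{m},
```

where $f$ and $c$ may be nonlinear, and the requirement $x \in \mathbb{Z}^{n}$ is what makes the problem combinatorial.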
But the researchers had a bit of fun in twisting the myth slightly: their solution to slaying the computational monster is a toolkit named Minotaur (https://wiki.mcs.anl.gov/minotaur). And instead of a maze, the researchers view the many problem classes within the generic MINLP formulation as a tree.
“Actually, like a maze, this tree involves choices of path and direction,” said Sven Leyffer, a senior computational mathematician in the Mathematics and Computer Science (MCS) division at Argonne National Laboratory. “It starts with convex and nonconvex problems and branches into function classes including quadratic programming, linear programming, second-order cone programming, and polynomial optimization.”
Minotaur provides a broad range of reliable and efficient tree-search algorithms for solving MINLP problems. Moreover, its flexible design enables developers to explore new algorithms without compromising computational efficiency. Typically, in handling tree searches, Minotaur reformulates the main problem into subproblems that are then “relaxed.” A relaxation is a simpler problem that can be solved by an available algorithm and whose optimal value provides a lower bound on the optimal value of the subproblem. Minotaur also calls external libraries for solving relaxations or other simpler problems.
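The interplay of relaxation, bounding, and branching can be illustrated on a toy problem. The sketch below (purely illustrative, not Minotaur's API) applies branch and bound to minimizing (x − 2.4)² over integers x in [0, 5]; the continuous relaxation of each subproblem has a closed-form solution, and its value is used to prune subtrees that cannot contain a better answer.

```python
import math

def solve_relaxation(lo, hi):
    """Continuous relaxation of min (x - 2.4)^2 over [lo, hi].

    Closed form: clamp the unconstrained minimizer 2.4 into the bounds.
    Returns the relaxed optimum and its value (a lower bound for the
    integer subproblem on the same interval).
    """
    x = min(max(2.4, lo), hi)
    return x, (x - 2.4) ** 2

def branch_and_bound(lo=0, hi=5):
    best_x, best_val = None, math.inf
    stack = [(lo, hi)]          # subproblems awaiting processing
    while stack:
        a, b = stack.pop()
        if a > b:               # empty interval: infeasible subproblem
            continue
        x, bound = solve_relaxation(a, b)
        if bound >= best_val:   # prune: lower bound can't beat incumbent
            continue
        if abs(x - round(x)) < 1e-9:
            # Relaxed optimum is already integral: new incumbent.
            best_x, best_val = round(x), bound
        else:
            # Branch on the fractional value: split the interval.
            f = math.floor(x)
            stack.append((a, f))
            stack.append((f + 1, b))
    return best_x, best_val
```

Running `branch_and_bound()` explores three subproblems and returns x = 2, the integer closest to 2.4. Real MINLP solvers branch on many variables and use LP or NLP solvers for the relaxations, but the prune-when-the-bound-is-worse logic is the same.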
In some cases, for example when the problem constraints are nonlinear but convex, a linear relaxation can be created by deriving linear underestimators from a first-order approximation.
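The key fact behind such underestimators is that a convex function lies above every one of its tangents, so the first-order Taylor expansion at any point is a valid global lower bound (a linear “cut”). A minimal sketch, with a hypothetical helper not drawn from Minotaur:

```python
def linear_underestimator(f, fprime, x0):
    """Tangent of f at x0: f(x0) + f'(x0) * (x - x0).

    If f is convex, this linear function underestimates f everywhere,
    so it can replace f in a linear relaxation of the constraint.
    """
    return lambda x: f(x0) + fprime(x0) * (x - x0)

f = lambda x: x * x                    # a convex constraint function
fprime = lambda x: 2 * x               # its derivative
cut = linear_underestimator(f, fprime, 3.0)   # cut(x) = 9 + 6*(x - 3)

# Convexity guarantees cut(x) <= f(x) at every point, with equality at x0:
for x in (-4.0, 0.0, 2.5, 3.0, 10.0):
    assert cut(x) <= f(x) + 1e-12
```

Accumulating such cuts at several points yields an increasingly tight polyhedral outer approximation of the convex feasible region, which is what makes linear relaxations of convex MINLPs effective.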
And hence the name: Minotaur is actually an acronym for Mixed-Integer Nonlinear Optimization Toolkit: Algorithms, Underestimators, and Relaxations.
The key to Minotaur’s success is its ability to take advantage of problem structure. “Structure” refers to properties of the constraints and variables that can be exploited – and, sometimes, must be exploited – to solve the MINLP problem.
“Algorithms over the past decade have often exploited special problem structure,” said Todd Munson, a computational mathematician in the MCS division at Argonne. “What distinguishes Minotaur is that it provides a general software framework that is not associated with a particular problem type or solver. Without this flexible framework, finding global solutions to difficult nonconvex MINLP problems would remain out of reach for many applications.”
Another challenge that the Minotaur team faced was accommodating new problem classes. Here again, Minotaur’s flexible software framework provided the solution. By following a modular approach, application developers can easily implement new functionality. Components remain interoperable as long as the interfaces they implement are compatible. Thus, developers can customize a few selected components and reuse the remaining ones to produce new solvers.
Munson emphasized that Minotaur’s flexibility does not come at the cost of speed and efficiency. In fact, exploiting the problem structure in Minotaur can result in more efficient solvers.
The ancient Minotaur was slain. But the new Minotaur remains alive and well – opening the door for research to further advance the state of the art in algorithms and software for MINLP.
For further information about the toolkit, see the report by Ashutosh Mahajan, Sven Leyffer, Jeff Linderoth, James Luedtke, and Todd Munson, Minotaur: A Mixed-Integer Nonlinear Optimization Toolkit, ANL/MCS-P8010-0817. http://www.optimization-online.org/DB_HTML/2017/10/6275.html