With the adoption of renewable energy resources, battery storage, and other modern grid elements, the computational performance of existing tools for power systems optimization has become a critical bottleneck, preventing the adoption of important model enhancements that could bring significant benefits to the system. The computational challenge typically comes from having to solve a large number of closely related optimization problems to optimality, which can be time-consuming. This is the case, for example, in power generation scheduling under uncertainty, where the system needs to be optimized under multiple scenarios to find a reliable and cost-effective stochastic solution. Existing optimization tools solve each subproblem from scratch, failing to leverage the fact that the subproblems share significant similarities.
Our goal in this project is to develop a general framework for the next generation of power system optimization tools, which uses Artificial Intelligence (AI) to automatically identify patterns in previously solved optimization subproblems and effectively uses this information to accelerate the solution of new ones. Unlike conventional tools, these AI-accelerated optimization tools will become increasingly efficient over time, as they learn about the particular characteristics of the system where they operate.
Instead of completely replacing state-of-the-art methods based on Mixed-Integer Programming (MIP), which are already widely used, well understood, and well accepted within the power industry, we propose to develop new AI components that will guide existing MIP solvers towards the optimal solution faster. This research, therefore, leverages both decades of MIP research and modern advancements in AI and Machine Learning (ML). To accelerate adoption within the industry, all methods developed in this project will be made available as part of MIPLearn, an open-source software package for Learning-Enhanced Mixed-Integer Optimization developed during this project. MIPLearn is compatible with existing commercial and open-source MIP solvers and is applicable to power system optimization problems in general.
Branch-and-cut (B&C) is the state-of-the-art method used by the power industry to solve large-scale Mixed-Integer Programming (MIP) problems to optimality. In B&C, the solution space is recursively partitioned into sub-regions and organized in a tree structure, which is then explored until an optimal solution is found. Candidate solutions found during the exploration, in conjunction with the linear relaxation of the problem and cutting planes, are used to identify and discard branches of the tree that cannot provably contain optimal solutions. During the solution process, MIP solvers typically make millions of heuristic decisions to efficiently explore the solution space. Examples include deciding how to partition the solution space when creating new tree branches; which branch to explore next; which cutting planes to generate; and how to generate initial candidate solutions. All these decisions have a dramatic impact on the performance of the method. Because off-the-shelf MIP solvers have no prior knowledge of the specific problems the user intends to solve, they typically include only generic strategies that make acceptable decisions for a diverse range of optimization problems. Off-the-shelf solvers also use hard-coded, non-adaptive strategies that do not take into account the characteristics of previously solved instances.
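To make the tree exploration and pruning described above concrete, here is a minimal branch-and-bound sketch for the 0-1 knapsack problem, a simple MIP. It illustrates the core B&C mechanics: branching on a free variable, bounding each node with the LP relaxation (here computable in closed form by greedy density packing), and pruning nodes whose bound cannot beat the incumbent. Real B&C additionally solves general LP relaxations and adds cutting planes; this toy omits both, and all names in it are illustrative.

```python
def lp_bound(values, weights, capacity, fixed):
    """Upper bound from the LP relaxation of a knapsack node.

    fixed[i] is 1 (item forced in), 0 (forced out), or None (free).
    Pack the forced-in items, then fill remaining capacity greedily by
    value density, taking at most one item fractionally.
    Assumes strictly positive weights.
    """
    used = sum(w for w, f in zip(weights, fixed) if f == 1)
    if used > capacity:
        return None  # branch is infeasible: prune it
    bound = float(sum(v for v, f in zip(values, fixed) if f == 1))
    free = sorted((i for i, f in enumerate(fixed) if f is None),
                  key=lambda i: values[i] / weights[i], reverse=True)
    room = capacity - used
    for i in free:
        if weights[i] <= room:
            bound += values[i]
            room -= weights[i]
        else:
            bound += values[i] * room / weights[i]  # fractional item
            break
    return bound


def branch_and_bound(values, weights, capacity):
    """Depth-first branch and bound for the 0-1 knapsack problem."""
    n = len(values)
    best_value, best_solution = 0, [0] * n
    stack = [[None] * n]  # root node: every variable still free
    while stack:
        fixed = stack.pop()
        bound = lp_bound(values, weights, capacity, fixed)
        if bound is None or bound <= best_value:
            continue  # prune: infeasible, or cannot beat the incumbent
        if None not in fixed:
            # leaf node: an integer-feasible candidate solution
            value = sum(v for v, f in zip(values, fixed) if f == 1)
            if value > best_value:
                best_value, best_solution = value, fixed[:]
            continue
        j = fixed.index(None)  # branching decision: first free variable
        for choice in (0, 1):
            child = fixed[:]
            child[j] = choice
            stack.append(child)
    return best_value, best_solution
```

The heuristic choices mentioned in the text appear here as hard-coded rules: branching picks the first free variable, and node selection is last-in-first-out. These are exactly the decisions that learned models could make adaptively.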
In this project, we will use AI to guide the branch-and-cut (B&C) method, focusing on the situation where a large number of subproblems need to be solved efficiently. In the proposed approach, after a small number of subproblems have been solved to optimality, the previously found optimal solutions, together with statistics collected during the execution of the B&C algorithm itself, will be used to automatically select, train, and cross-validate Machine Learning (ML) models that will guide future runs of the algorithm, with minimal user intervention. As more subproblems are solved, these ML models will be further trained and refined, improving the performance of the MIP solver as time progresses. We will specifically focus on four aspects of B&C that have high performance impact: (1) generation of high-quality initial solutions; (2) branching and node selection; (3) generation of cutting planes; and (4) parallelization.
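As a minimal illustration of aspect (1), the sketch below memorizes (instance features, optimal solution) pairs from previously solved subproblems and proposes the solution of the nearest past instance as a warm start for a new one. This is a deliberately simple nearest-neighbor stand-in for the richer, cross-validated ML models the project proposes; the class and method names are hypothetical, not part of any existing API.

```python
import math


class WarmStartPredictor:
    """Hypothetical sketch: learn warm starts from solved subproblems.

    Each solved subproblem contributes a feature vector (e.g. forecast
    demand values) and its optimal solution vector. For a new subproblem,
    the stored solution of the closest past instance is proposed as a
    warm start; the MIP solver must still verify its feasibility and
    prove optimality.
    """

    def __init__(self):
        self.memory = []  # list of (feature_vector, optimal_solution)

    def fit(self, features, solution):
        # Record one previously solved instance.
        self.memory.append((list(features), list(solution)))

    def predict(self, features):
        if not self.memory:
            return None  # nothing learned yet: solver starts cold
        # 1-nearest-neighbor lookup by Euclidean distance.
        _, solution = min(self.memory,
                          key=lambda pair: math.dist(features, pair[0]))
        return solution
```

Because the prediction is only an initial candidate solution handed to B&C, a wrong guess costs little (the solver falls back to its normal search), while a good guess immediately tightens the incumbent bound used for pruning.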
- Alinson Santos Xavier, Feng Qiu, Shabbir Ahmed. Learning to Solve Large-Scale Security-Constrained Unit Commitment Problems. INFORMS Journal on Computing, 2021. https://pubsonline.informs.org/doi/abs/10.1287/ijoc.2020.0976
- Alinson Santos Xavier, Feng Qiu, Fengyu Wang, Prakash R. Thimmapuram. Transmission Constraint Filtering in Large-Scale Security-Constrained Unit Commitment. IEEE Transactions on Power Systems, 2019. https://doi.org/10.1109/TPWRS.2019.2892620
Open-source framework for solving discrete optimization problems using a combination of Mixed-Integer Linear Programming (MIP) and Machine Learning (ML). MIPLearn uses ML methods to automatically identify patterns in previously solved instances of the problem, then uses these patterns to improve the performance of conventional state-of-the-art MIP solvers such as CPLEX, Gurobi, or XPRESS.
- Source code: https://github.com/ANL-CEEESA/MIPLearn
- Documentation: https://anl-ceeesa.github.io/MIPLearn/
Open-source optimization package for the Security-Constrained Unit Commitment Problem (SCUC), a fundamental optimization problem in power systems used, for example, to clear the day-ahead electricity markets. The package provides benchmark instances for the problem and JuMP implementations of state-of-the-art mixed-integer programming formulations.
- Source code: https://github.com/ANL-CEEESA/UnitCommitment.jl
- Documentation: https://anl-ceeesa.github.io/UnitCommitment.jl/