Brust, Johannes; Di, Zichao; Leyffer, Sven; Petra, Cosmin
For general large-scale optimization problems, compact representations exist in which recursive quasi-Newton update formulas are expressed as compact matrix factorizations. For problems in which the objective function contains additional structure, recent structured quasi-Newton methods exploit available second-derivative information and approximate unavailable second derivatives. This article develops the compact representations of two structured Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formulas. The compact representations enable efficient limited-memory and initialization strategies. Two limited-memory line-search algorithms are described, and extensive numerical results demonstrate their efficacy, including comparisons with IPOPT on large machine-learning problems and with L-BFGS on a real-world, large-scale ptychographic imaging application.
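As background for the structured representations developed in the article, the classical compact representation of the (unstructured) BFGS matrix (Byrd, Nocedal, and Schnabel) writes the result of k recursive updates as a low-rank correction of the initial matrix: B_k = B_0 - [B_0 S, Y] M^{-1} [B_0 S, Y]^T, where S and Y collect the step and gradient-difference pairs, and M is built from S^T B_0 S and the diagonal and strictly lower-triangular parts of S^T Y. The sketch below (not the article's structured variants; the helper names are illustrative) checks this identity numerically against the recursive update:

```python
import numpy as np

def bfgs_recursive(B0, S, Y):
    """Apply k recursive BFGS updates:
    B <- B - (B s)(B s)^T / (s^T B s) + y y^T / (y^T s)."""
    B = B0.copy()
    for s, y in zip(S.T, Y.T):
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return B

def bfgs_compact(B0, S, Y):
    """Compact representation of the BFGS matrix:
    B_k = B0 - [B0 S, Y] [[S^T B0 S, L], [L^T, -D]]^{-1} [B0 S, Y]^T,
    with L the strictly lower-triangular part and D the diagonal of S^T Y."""
    StY = S.T @ Y
    L = np.tril(StY, -1)
    D = np.diag(np.diag(StY))
    B0S = B0 @ S
    M = np.block([[S.T @ B0S, L], [L.T, -D]])
    Psi = np.hstack([B0S, Y])
    return B0 - Psi @ np.linalg.solve(M, Psi.T)

# Curvature pairs from a convex quadratic (y_i = H s_i ensures s_i^T y_i > 0).
rng = np.random.default_rng(0)
n, k = 8, 3
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)          # SPD "true" Hessian
S = rng.standard_normal((n, k))
Y = H @ S
B0 = np.eye(n)

assert np.allclose(bfgs_recursive(B0, S, Y), bfgs_compact(B0, S, Y))
```

In a limited-memory setting only the most recent k pairs are kept, so storage and update cost are O(nk) rather than O(n^2); the compact form also makes it cheap to change the initialization B_0 between iterations, which is one of the strategies the abstract alludes to.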