Faster Optimization on Sparse Graphs via Neural Reparametrization

Csaba Both, Nima Dehmamy, Jianzhi Long, Rose Yu
Proceedings of Machine Learning Research
(LoG 2024), PMLR 269
November 16, 2024

Optimization on sparse graphs underlies many scientific challenges, such as heat diffusion in solids and synchronization of nonlinear oscillators. Conventional methods often suffer from slow convergence, especially in complex, random graph structures typical of glassy systems. We introduce a novel approach using Graph Neural Networks (GNN) to reparametrize the state of each node as a network output, transforming the optimization variables into the GNN weights and high-dimensional node embeddings. This reparametrization effectively preconditions the optimization landscape, leveraging the approximate Hessian to accelerate convergence, akin to quasi-Newton methods. Our experiments demonstrate significant speed enhancements in solving the heat equation with boundary conditions on 2D and 3D graphs and in achieving synchronization in the Kuramoto model. Our work provides both a theoretical analysis of the observed performance gains and a framework for improving optimization techniques in graph-based problems.
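To illustrate the core idea, here is a minimal NumPy sketch, not the paper's actual architecture: a heat-diffusion-style objective (Dirichlet energy with a boundary penalty) on a path graph is minimized not over the node states `x` directly, but over the weights `w` of a toy one-layer linear "GNN" `x = A_hat @ E @ w`, where `E` are fixed random node embeddings and `A_hat` is a normalized adjacency. All names, sizes, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 8                          # nodes, embedding dimension (assumed)

# Path graph: adjacency A and Laplacian L = D - A
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(1)) - A

# Boundary conditions: fix the "temperature" at the two endpoints
b_idx = np.array([0, n - 1])
b_val = np.array([0.0, 1.0])
lam = 1.0                             # boundary penalty weight

def loss_and_grad(x):
    """Dirichlet (heat) energy plus quadratic boundary penalty."""
    r = x[b_idx] - b_val
    g = L @ x
    g[b_idx] += lam * r
    return 0.5 * x @ (L @ x) + 0.5 * lam * (r @ r), g

# Reparametrization: node states are the output of a one-layer linear
# "GNN" with fixed random embeddings E and normalized adjacency A_hat.
A_hat = (A + np.eye(n)) / (A.sum(1, keepdims=True) + 1.0)
E = rng.standard_normal((n, d))
M = A_hat @ E                         # x = M @ w, so dx/dw = M

w = np.zeros(d)
f0, _ = loss_and_grad(M @ w)          # loss before optimization
for _ in range(5000):
    x = M @ w
    f, g_x = loss_and_grad(x)
    w -= 0.002 * (M.T @ g_x)          # chain rule: grad_w = M.T @ grad_x

print(f"loss: {f0:.4f} -> {f:.4f}")
```

Gradient descent in `w` effectively preconditions the landscape in `x` through the matrix `M`, which is the mechanism the paper analyzes; this sketch only demonstrates the reparametrization itself, not the reported speedups.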
