Tristan Brugère
Zhengchao Wan
Yusu Wang
The 35th International Conference on Algorithmic Learning Theory (2024)
(Labeled) graphs are everywhere
Discrete or continuous features
We want a loss function to compare graphs
And we want this loss to be differentiable (for gradient descent).
No known polynomial-time algorithm for graph isomorphism
Graphs are combinatorial objects (how do we differentiate on the space of graphs?)
Fused Gromov-Wasserstein distance (FGW):
See graphs as metric spaces
Compute a matching (coupling) of the nodes minimizing a combined feature and structure cost, as in the objective below:
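In its standard form (notation assumed here rather than taken from the poster: μ_1, μ_2 are node-weight distributions, a_i, b_j node features, C_1, C_2 intra-graph structure matrices, α a trade-off parameter, q ≥ 1), the FGW objective over couplings π ∈ Π(μ_1, μ_2) reads:

\mathrm{FGW}_{q,\alpha}(G_1, G_2) = \min_{\pi \in \Pi(\mu_1, \mu_2)} \sum_{i,j,k,l} \Big[ (1-\alpha)\, d(a_i, b_j)^q + \alpha\, \big| C_1(i,k) - C_2(j,l) \big|^q \Big] \, \pi_{i,j}\, \pi_{k,l}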
Ingredients: Markov chains + optimal transport (Wasserstein distance / Earth mover's distance); a minimal OT example is sketched below.
[Diagram: couplings of Markov chains: regular, Markov, pointset law]
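A minimal illustration of the optimal transport ingredient, using the POT library (not this work's code; the histograms and ground costs below are made up):

import numpy as np
import ot  # POT: Python Optimal Transport

# Two discrete label distributions over 3 and 4 "bins" (e.g., node labels)
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.25, 0.25, 0.25, 0.25])
# Made-up ground cost between bins
M = np.abs(np.arange(3)[:, None] - np.arange(4)[None, :]).astype(float)

cost = ot.emd2(a, b, M)   # exact earth mover's (Wasserstein-1) cost
plan = ot.emd(a, b, M)    # the optimal transport plan itself
print(cost, plan.sum())   # the plan's total mass is 1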
Study of optimal transport methods for Markov chains
Theoretical framework: Optimal transport Markov (OTM) distances.
A particular case of interest:
Differentiable
Computationally tractable
Naturally adapted to directed graphs
Can work with non-stationary Markov chains
Similar to the Weisfeiler-Lehman isomorphism test
Or to a message-passing neural network (MPNN)
“Propagate” labels from neighbor to neighbor (random walk)
Determine “how different” the propagated label distributions are (a sketch follows below)
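A minimal sketch of this propagate-and-compare idea, in the spirit of a discounted WL-style recursion (illustrative code under assumed notation, not the ot_markov_distances API; delta, the label cost, and the number of iterations are assumptions):

import numpy as np
import ot  # POT: Python Optimal Transport

def wl_style_cost_matrix(P1, L1, P2, L2, delta=0.5, n_iter=20):
    # P1, P2: row-stochastic transition matrices (random walks on the two graphs)
    # L1, L2: node label/feature matrices (one row per node)
    # delta:  discount factor in (0, 1) trading off labels vs. propagated structure
    label_cost = ot.dist(L1, L2)   # pairwise squared-Euclidean label costs
    C = label_cost.copy()
    for _ in range(n_iter):
        C_new = np.empty_like(C)
        for u in range(P1.shape[0]):
            for v in range(P2.shape[0]):
                # OT cost between the one-step transition distributions,
                # using the current node-to-node cost C as ground cost
                w = ot.emd2(P1[u], P2[v], C)
                C_new[u, v] = (1 - delta) * label_cost[u, v] + delta * w
        C = C_new
    return C  # C[u, v]: "how different" node u of chain 1 is from node v of chain 2

A single scalar can then be read off, for instance as the OT cost between the two chains' stationary (or uniform) node distributions under this final cost matrix; the exact aggregation used in the paper may differ.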
Compute graph barycenters by gradient descent (toy sketch below):
Example output of the barycenter experiment on circle graphs with Erdős–Rényi noise
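A toy sketch of the barycenter-by-gradient-descent idea (assumptions: only node labels are matched, with a hand-written entropic/Sinkhorn OT loss; the actual experiment also accounts for the chains' transition structure):

import math
import torch

def sinkhorn_cost(x, y, eps=0.1, n_iter=200):
    # Differentiable entropic OT cost between two uniform point clouds x, y
    C = torch.cdist(x, y) ** 2                 # pairwise squared distances
    n, m = C.shape
    log_a, log_b = -math.log(n), -math.log(m)  # log of the uniform weights
    f, g = torch.zeros(n), torch.zeros(m)
    for _ in range(n_iter):                    # log-domain Sinkhorn updates
        f = -eps * torch.logsumexp((g[None, :] - C) / eps + log_b, dim=1)
        g = -eps * torch.logsumexp((f[:, None] - C) / eps + log_a, dim=0)
    P = torch.exp((f[:, None] + g[None, :] - C) / eps + log_a + log_b)  # transport plan
    return (P * C).sum()

# Made-up "reference" node-label clouds for two graphs
y1 = torch.randn(10, 2)
y2 = torch.randn(10, 2) + 1.0

# Barycenter candidate: free parameters optimized by gradient descent
x = torch.randn(10, 2, requires_grad=True)
opt = torch.optim.Adam([x], lr=0.05)
for step in range(300):
    opt.zero_grad()
    loss = sinkhorn_cost(x, y1) + sinkhorn_cost(x, y2)  # differentiable OT losses
    loss.backward()                                     # gradients flow through Sinkhorn
    opt.step()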
Computationally expensive
Ongoing work on generative models
$ pip install ot_markov_distances
This work is partially supported by NSF under grants CCF-2112665, CCF-2217058, and CCF-2310411.