Tristan Brugère’s personal website

Bio

I am a PhD student in the Halıcıoğlu Data Science Institute at the University of California, San Diego. I am advised by Professor Yusu Wang.

My main research interest is approaching machine learning from a mathematical perspective. I currently work on optimal transport and neural networks on graphs, with applications to chip design.

I obtained my Diplôme d’Ingénieur from École Polytechnique in 2021, majoring in Math and Computer Science, and my Master of Science in Data Science and Machine Learning from UCSD's ECE department in 2023.

Software

  • YABD: a daemon that automatically adjusts screen brightness based on the ambient light sensor on Linux.

Papers

2024

  • Tristan Brugère, Zhengchao Wan, and Yusu Wang. Distances for Markov chains, and their differentiation. In Proceedings of The 35th International Conference on Algorithmic Learning Theory. PMLR, 2024.
    (Directed) graphs with node attributes are a common type of data in various applications and there is a vast literature on developing metrics and efficient algorithms for comparing them. Recently, in the graph learning and optimization communities, a range of new approaches have been developed for comparing graphs with node attributes, leveraging ideas such as the Optimal Transport (OT) and the Weisfeiler-Lehman (WL) graph isomorphism test. Two state-of-the-art representatives are the OTC distance proposed in (O’Connor et al., 2022) and the WL distance in (Chen et al., 2022). Interestingly, while these two distances are developed based on different ideas, we observe that they both view graphs as Markov chains, and are deeply connected. Indeed, in this paper, we propose a unified framework to generate distances for Markov chains (thus including (directed) graphs with node attributes), which we call the Optimal Transport Markov (OTM) distances, that encompass both the OTC and the WL distances. We further introduce a special one-parameter family of distances within our OTM framework, called the discounted WL distance. We show that the discounted WL distance has nice theoretical properties and can address several limitations of the existing OTC and WL distances. Furthermore, contrary to the OTC and the WL distances, our new discounted WL distance can be differentiated after an entropy regularization similar to the Sinkhorn distance, making it suitable to use in learning frameworks, e.g., as the reconstruction loss in a graph generative model.
    @InProceedings{pmlr-v237-brugere24a, author = "Brugère, Tristan and Wan, Zhengchao and Wang, Yusu", title = "Distances for Markov Chains, and Their Differentiation", booktitle = "Proceedings of The 35th International Conference on Algorithmic Learning Theory", year = "2024", publisher = "PMLR", pages = "282--336", editor = "Vernade, Claire and Hsu, Daniel", volume = "237", series = "Proceedings of Machine Learning Research", month = "25--28 Feb" }
    Paper
    Code
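The entropy regularization mentioned at the end of the abstract is, in spirit, the same device used in the Sinkhorn distance. For intuition only, here is a minimal generic Sinkhorn sketch in NumPy; it is not the paper's OTM or discounted WL construction, and the function name and the parameters (eps, n_iter) are illustrative choices.

    # Minimal entropic-OT (Sinkhorn) sketch in NumPy.
    # NOT the paper's OTM / discounted WL distance; it only illustrates the
    # entropy regularization that makes OT-style objectives differentiable.
    import numpy as np

    def sinkhorn(a, b, C, eps=0.1, n_iter=200):
        """Entropy-regularized OT cost between histograms a, b with cost matrix C."""
        K = np.exp(-C / eps)              # Gibbs kernel
        u, v = np.ones_like(a), np.ones_like(b)
        for _ in range(n_iter):           # alternating scaling (Sinkhorn iterations)
            u = a / (K @ v)
            v = b / (K.T @ u)
        P = u[:, None] * K * v[None, :]   # regularized transport plan
        return np.sum(P * C)              # regularized transport cost

    # Toy example: two small point clouds with uniform weights.
    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(5, 2)), rng.normal(size=(6, 2))
    C = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1) ** 2
    a, b = np.full(5, 1 / 5), np.full(6, 1 / 6)
    print(sinkhorn(a, b, C))

Because every step above is smooth in the entries of C, the regularized cost can be differentiated end to end, which is the property the abstract refers to when it mentions use as a reconstruction loss.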

2023

  • Jesse He, Tristan Brugère, and Gal Mishne. Product manifold learning with independent coordinate selection. In Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML). PMLR, 2023.
    In many dimensionality reduction tasks, we wish to identify the constituent components that explain our observations. For manifold learning, this can be formalized as factoring a Riemannian product manifold. Recovering this factorization, however, may suffer from certain difficulties in practice, especially when data is sparse or noisy, or when one factor is distorted by the other. To address these limitations, we propose identifying non-redundant coordinates on the product manifold before applying product manifold learning to identify which coordinates correspond to different factor manifolds. We demonstrate our approach on both synthetic and real-world data.
    @InProceedings{He23Product, author = "He, Jesse and Brugère, Tristan and Mishne, Gal", title = "Product Manifold Learning with Independent Coordinate Selection", booktitle = "Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)", year = "2023", publisher = "PMLR", pages = "267--277", editor = "Doster, Timothy and Emerson, Tegan and Kvinge, Henry and Miolane, Nina and Papillon, Mathilde and Rieck, Bastian and Sanborn, Sophia", volume = "221", series = "Proceedings of Machine Learning Research", month = "28 Jul" }
    Paper
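To fix intuition for the product-manifold setting described in the abstract, here is a small self-contained sketch (not the paper's method): it samples a flat torus S¹ × S¹ and computes basic diffusion-map-style coordinates with a plain Gaussian kernel. The bandwidth, sample size, and number of eigenvectors are arbitrary illustrative choices.

    # Sketch of the product-manifold setting: points on S^1 x S^1 (a flat torus)
    # and a basic diffusion-map-style embedding. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    theta, phi = rng.uniform(0, 2 * np.pi, (2, 800))       # angles of the two circle factors
    X = np.column_stack([np.cos(theta), np.sin(theta),
                         np.cos(phi), np.sin(phi)])         # torus embedded in R^4

    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)     # squared pairwise distances
    W = np.exp(-D2 / 0.5)                                   # Gaussian affinities (bandwidth 0.5)
    d = W.sum(axis=1)
    M = W / np.sqrt(d[:, None] * d[None, :])                # symmetrically normalized kernel
    evals, evecs = np.linalg.eigh(M)                        # eigenvalues in ascending order
    coords = evecs[:, -5:-1][:, ::-1]                       # first nontrivial spectral coordinates

These coordinates roughly track the two angles theta and phi, but nothing in the embedding itself says which coordinate belongs to which circle factor, or which coordinates are redundant; that is the factorization and coordinate-selection problem the paper addresses.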