[G-Stats Seminar] Grégoire Sergeant-Perthuis

Grégoire Sergeant-Perthuis

Inria (Ouragan project-team)

19/01/2023 – 2 p.m.

Title: Optimization over presheaves and message passing algorithms, with applications


A recent trend in machine learning concerns how to use deep learning for data with a combinatorial structure that encodes some geometry (geometric deep learning [1]: Graph Neural Networks [2] and their generalizations, Sheaf Neural Networks [3]). On the other hand, these structures have long interested those working on the probabilistic aspects of machine learning [4,5] (belief networks, graphical models [6]). The structures of interest in geometric deep learning (graphs and their representations) exhibit properties that allow for an increase in modeling power. They are compositional: their description amounts to a blueprint of how their parts are patched together, regardless of how heterogeneous the different parts are. The description of the structure is local and does not require a global, homogeneous description. This contrasts with earlier probabilistic graphical models, which require a description of the whole probabilistic model of interest in an ambient space. Keeping the description of probabilistic structures local has motivated an active line of fundamental research in statistics, probability, and machine learning [7,8], similar to the one in quantum mechanics [9].

In this presentation, we ask what optimization on a compositional structure would look like and how to solve it [10]. The compositional structure we consider is a generalization of data structured on a hierarchy (a presheaf over a partially ordered set). It generalizes the models of data with combinatorial structure most commonly used in practice. We answer this question by introducing a combinatorial cost function for the structured data and giving message-passing algorithms to solve the resulting optimization problem. The problem we pose can be reformulated as follows: consider data structured in a hierarchy for which there is a clear cost function at each place of the hierarchy. What is a meaningful global cost function for the whole structure, and how can the associated optimization problem be solved?
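To make the question concrete, here is a minimal toy sketch, not the construction of [10]: data on a three-node hierarchy, a local quadratic cost at each node, and a global cost that sums the local costs plus consistency penalties along the order relation, minimized by coordinate updates in a loose message-passing spirit. The poset, the quadratic costs, the penalty weight `lam`, and all names are illustrative assumptions.

```python
# Toy hierarchy: node 'r' sits below leaves 'a' and 'b' (an assumed poset,
# chosen only for illustration).
covers = {"r": ["a", "b"]}

# Assumed local quadratic costs: c_v(x) = (x - target_v)^2
targets = {"r": 0.0, "a": 1.0, "b": 3.0}

def global_cost(x, lam=1.0):
    """Sum of local costs plus a penalty coupling each node to the nodes above it."""
    cost = sum((x[v] - targets[v]) ** 2 for v in targets)
    for v, above in covers.items():
        cost += lam * sum((x[v] - x[u]) ** 2 for u in above)
    return cost

def minimize(lam=1.0, iters=100):
    """Coordinate descent: each node re-optimizes its variable given its neighbours."""
    x = {v: 0.0 for v in targets}
    for _ in range(iters):
        for v in targets:
            # neighbours of v in the coupling graph (above and below v)
            nbrs = covers.get(v, []) + [w for w, ab in covers.items() if v in ab]
            # the cost is quadratic in x[v]; its minimizer is a weighted average
            x[v] = (targets[v] + lam * sum(x[u] for u in nbrs)) / (1 + lam * len(nbrs))
    return x

x = minimize()
```

On this convex toy problem the coordinate updates converge to the joint minimizer (x_r = 1, x_a = 1, x_b = 2); the seminar's framework addresses the far more general case where the local data and costs live on an arbitrary presheaf.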

We explain how these results relate to (Generalized) Belief Propagation [6] in the specific case of approximate inference over energy-based models, and we exhibit several applications by specializing our framework to particular structures and cost functions.
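For reference, the classical sum-product Belief Propagation that [6] generalizes can be sketched on the smallest non-trivial case: two binary variables joined by one pairwise factor, where the BP belief coincides with the exact marginal because the graph is a tree. The potentials below are arbitrary illustrative numbers.

```python
import itertools

phi1 = [1.0, 2.0]                 # unary potential on x1 in {0, 1}
phi2 = [3.0, 1.0]                 # unary potential on x2 in {0, 1}
psi = [[1.0, 0.5], [0.5, 2.0]]    # pairwise potential psi[x1][x2]

# Sum-product message from x2's side to x1: m(x1) = sum_{x2} psi(x1, x2) * phi2(x2)
m21 = [sum(psi[a][b] * phi2[b] for b in (0, 1)) for a in (0, 1)]

# Belief at x1 is proportional to phi1(x1) * m(x1); normalize it
b1 = [phi1[a] * m21[a] for a in (0, 1)]
Z = sum(b1)
belief1 = [v / Z for v in b1]

# Brute-force marginal from the full joint distribution, for comparison
joint = {(a, b): phi1[a] * phi2[b] * psi[a][b]
         for a, b in itertools.product((0, 1), repeat=2)}
Zj = sum(joint.values())
marg1 = [sum(joint[a, b] for b in (0, 1)) / Zj for a in (0, 1)]
```

On trees the two computations agree exactly; on loopy graphs BP is only approximate, which is the regime where the free-energy viewpoint of [6], and its generalization in the seminar, becomes relevant.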

[1] Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges, M. M. Bronstein, J. Bruna, T. Cohen, P. Veličković, arXiv:2104.13478, 2021
[2] The Graph Neural Network Model, F. Scarselli, M. Gori, A. C. Tsoi, M. Hagenbuchner and G. Monfardini, IEEE Transactions on Neural Networks, 2008
[3] Sheaf Neural Networks with Connection Laplacians, F. Barbero, C. Bodnar, H. Saez de Ocariz Borde, M. Bronstein, P. Veličković, P. Liò, ICML 2022 Workshop on Topology, Algebra, and Geometry in Machine Learning, 2022
[4] Probabilistic Reasoning in Intelligent Systems, Judea Pearl, Morgan Kaufmann, 1988
[5] Information, Physics, and Computation, M. Mézard, A. Montanari, Oxford University Press, 2009
[6] Constructing free-energy approximations and generalized belief propagation algorithms, J. S. Yedidia, W. T. Freeman and Y. Weiss, IEEE Transactions on Information Theory, 2005
[7] Category Theory in Machine Learning, D. Shiebler, B. Gavranović, P. Wilson, arXiv:2106.07032, 2021
[8] Intersection property, interaction decomposition, regionalized optimization and applications, G. Sergeant-Perthuis, PhD Thesis, 2021 (https://www.researchgate.net/publication/349732943_Intersection_property_interaction_decomposition_regionalized_optimization_and_applications)
[9] Categories for Quantum Theory, C. Heunen, J. Vicary, Oxford Graduate Texts in Mathematics, 2019
[10] Regionalized Optimization, G. Sergeant-Perthuis, arXiv:2201.11876 (https://arxiv.org/abs/2201.11876), 2022
