We host monthly seminars on Geometric Statistics with invited researchers.

**Upcoming**

**14/12/2022 – h. 11 a.m.**

### Ian Dryden

Florida International University

**Title:** Statistical shape analysis of molecular dynamics data

**Abstract:** Molecular dynamics (MD) simulations produce huge datasets of temporal sequences of molecules. It is of interest to summarize the shape evolution of the molecules in a succinct, low-dimensional representation. However, Euclidean techniques such as principal components analysis (PCA) can be problematic as the data may lie far from a flat manifold. Principal nested spheres gives a fundamentally different decomposition of data from the usual Euclidean subspace-based PCA. Subspaces of successively lower dimension are fitted to the data in a backwards manner with the aim of retaining signal and dispensing with noise at each stage. We adapt the methodology to 3D subshape spaces and provide some practical fitting algorithms. The methodology is applied to cluster analysis of peptides, where different states of the molecules can be identified. Also, the temporal transitions between cluster states are explored. Further molecular modelling tasks include resolution matching, where coarse resolution models are backmapped into high resolution (atomistic) structures.
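A toy numerical sketch of why flat-space averaging can fail on curved data (an illustration of the motivation only, not of the principal nested spheres method itself): for points constrained to the unit circle, the Euclidean mean falls off the manifold, while an intrinsic average does not.

```python
import numpy as np

# Toy illustration: data constrained to the unit circle, a curved space in R^2.
angles = np.array([2.6, 2.9, 3.2, 3.5])   # radians, clustered near pi
points = np.column_stack([np.cos(angles), np.sin(angles)])

euclidean_mean = points.mean(axis=0)       # ordinary flat-space average
print(np.linalg.norm(euclidean_mean))      # < 1: the mean falls off the circle

# An intrinsic summary stays on the manifold: average the angles instead.
intrinsic_mean = np.array([np.cos(angles.mean()), np.sin(angles.mean())])
print(np.linalg.norm(intrinsic_mean))      # exactly 1
```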

**12/01/2023 – h. 2 p.m.**

### Ludovic Rifford

Laboratoire J.A. Dieudonné

Université Côte d’Azur

**Past Speakers**

**24/11/2022 – h. 2 p.m.**

### Frank Nielsen

Sony CSL

**Title:** Revisiting Chernoff Information with Likelihood Ratio Exponential Families

**Abstract:** The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has found many other applications, ranging from information fusion to quantum information, due to its empirical robustness property. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback–Leibler divergence. In this talk, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to

(i) solve exactly the Chernoff information between any two univariate Gaussian distributions or get a closed-form formula using symbolic computing,

(ii) report a closed-form formula of the Chernoff information of centered Gaussians with scaled covariance matrices, and

(iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
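Point (iii) can be sketched numerically. The example below (function names are mine, not from the talk) maximizes the skewed Bhattacharyya distance over the skewing parameter by bounded one-dimensional optimization; for equal-variance univariate Gaussians, the known closed form (μ1 − μ2)² / (8σ²) provides a check.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def skewed_bhattacharyya(p, q, alpha):
    """D_alpha(p:q) = -log int p(x)^alpha q(x)^(1-alpha) dx, by quadrature."""
    integrand = lambda x: p.pdf(x)**alpha * q.pdf(x)**(1 - alpha)
    val, _ = quad(integrand, -np.inf, np.inf)
    return -np.log(val)

def chernoff_information(p, q):
    """C(p,q) = max over alpha in (0,1) of the skewed Bhattacharyya distance."""
    res = minimize_scalar(lambda a: -skewed_bhattacharyya(p, q, a),
                          bounds=(1e-6, 1 - 1e-6), method="bounded")
    return -res.fun

# Equal-variance Gaussians: known value (mu1-mu2)^2 / (8 sigma^2), at alpha = 1/2.
c = chernoff_information(norm(0, 1), norm(1, 1))
print(round(c, 4))   # ≈ 0.125
```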

**25/05/2022 – h. 10 a.m.**

### Huiling Le

University of Nottingham, United Kingdom

**Title:** Recent progress on Stein’s method on manifolds

**Abstract:** Motivated by recent interest in applications of Stein’s method to non-Euclidean data analysis in statistics, this talk discusses our recent progress in the investigation into how the diffusion method can be used to generalise Stein’s equation to probability measures on Riemannian manifolds. We obtain the Stein factors which contain curvature-dependent terms and reduce to those currently available for the Euclidean space. In particular, we show that the Stein factors for the Euclidean space remain valid as long as the manifold is flat.

**25/05/2022 – h. 11 a.m.**

### Marc Arnaudon

Université de Bordeaux, France

**Title:** Coupling of Brownian motions with set-valued dual processes on Riemannian manifolds; application to perfect simulation

**Abstract:** In this talk we will motivate and explain the evolution, by renormalized stochastic mean curvature flow, of boundaries of relatively compact connected domains in a Riemannian manifold. We will construct coupled Brownian motions inside the moving domains, satisfying a Markov intertwining relation. We will prove that the Brownian motions perform perfect simulation of the uniform law when the domain reaches the whole manifold. We will investigate the examples of evolving discs in spheres and of symmetric domains in R^2. Skeletons of moving domains will play a major role.

**25/05/2022 – h. 2 p.m.**

### Rajendra Bhatia

Ashoka University, India

**Title:** The Sylvester Equation and Its Applications to Perturbation of Eigenspaces

**Abstract:** The matrix equation AX-XB=C arises in several contexts. We will present some general facts, and then specific solutions suited to applications in deriving perturbation bounds for eigenspaces of Hermitian and normal matrices. Perturbation of some other matrix functions, such as the square root and the polar factors, will also be discussed.

**09/05/2022 – h. 2 p.m.**

### Erland Grong

University of Bergen

**Bio**: Erland is an assistant professor at the Department of Mathematics at the University of Bergen. He is a researcher in the project GeoProCo: Geometry and Probability with Constraints, supported by the Trond Mohn Foundation; more details are available on the project homepage. He did his PhD at the University of Bergen and his postdoc at Université Paris-Saclay. His research interests are centered around sub-Riemannian geometry and branch out in any direction linked to this subject.

**Title:** Most probable paths for anisotropic Brownian motion

**Abstract:** We want to consider the mean and covariance of a set of points on a Riemannian manifold in an intrinsic way. In order to do this we will use the diffusion mean. When estimating this mean with covariance, one of the main tools is most probable paths, which differ from geodesics when spaces are curved. We will describe these paths in detail and show their application in algorithms for finding the diffusion mean and variance. Computing most probable paths is related to the sub-Riemannian geometry of the frame bundle.

**28/04/2022 – h. 2 p.m.**

### Erik J. Bekkers

University of Amsterdam

**Bio**: Erik Bekkers is an assistant professor in Geometric Deep Learning in the Machine Learning Lab of the University of Amsterdam (AMLab, UvA). Before this he did a postdoc in applied differential geometry at the Department of Applied Mathematics of Eindhoven University of Technology (TU/e). In his PhD thesis (cum laude, Biomedical Engineering, TU/e), he developed medical image analysis algorithms based on sub-Riemannian geometry in the Lie group SE(2), using the same mathematical principles that underlie models of human visual perception. Such mathematics finds application in machine learning, where symmetries and geometric structure yield robust and efficient representation learning methods. His current work is on generalizations of group convolutional NNs and improvements in computational and representation efficiency through sparse (graph-based) and adaptive learning mechanisms.

**Title:** Group equivariant deep learning and non-linear convolutions

**Abstract:** In this talk I will cover the essential theory behind group equivariant deep learning and show applications in medical imaging, computational chemistry, and physics. Group equivariant deep learning methods provide guarantees of equivariance (if the input transforms, the output transforms in a predictable way). The equivariance property in turn enables weight sharing over symmetries, supports hierarchical pattern recognition, respects symmetry constraints imposed by the problem at hand, and allows working with geometric quantities such as (force) vectors in a principled way. I will start with an introduction to group convolutional neural networks (G-CNNs) with applications to medical images and show that group convolutions are the most general class of equivariant linear layers. Then I will show an application of group equivariant deep learning on point clouds (molecular property prediction, n-body simulations) using steerable group convolutions. Finally, I will introduce two important directions that we are currently pursuing: (i) powerful deep learning architectures via so-called non-linear convolutions and (ii) a flexible framework to build equivariant DL architectures by parametrizing continuous convolution kernels with band-limited neural networks.
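The equivariance guarantee can be illustrated on the smallest possible example (a toy sketch, not one of the talk's architectures): convolution over a cyclic group commutes with group shifts, i.e. shifting the input shifts the output in the same way.

```python
import numpy as np

# Convolution on the cyclic group C_n is circular correlation, and it
# commutes with group shifts -- the defining property of equivariance.
rng = np.random.default_rng(0)
n = 8
signal = rng.standard_normal(n)     # "feature map" on the group C_8
kernel = rng.standard_normal(n)     # convolution kernel

def group_conv(f, psi):
    # (f * psi)[g] = sum_h f[h] psi[(h - g) mod n]
    return np.array([np.dot(f, np.roll(psi, g)) for g in range(n)])

shift = 3
out_then_shift = np.roll(group_conv(signal, kernel), shift)
shift_then_out = group_conv(np.roll(signal, shift), kernel)
print(np.allclose(out_then_shift, shift_then_out))   # True
```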

Reference: Brandstetter, J., Hesselink, R., van der Pol, E., Bekkers, E., & Welling, M. (2021). Geometric and Physical Quantities Improve E(3) Equivariant Message Passing. In ICLR 2022.

**24/02/2022 – h. 2 p.m.**

### Emmanuel Chevallier

Aix-Marseille University (AMU)

**Bio**: Emmanuel is an assistant professor at Aix-Marseille University (AMU), in the “physics and image processing” team of the Fresnel Institute. Before joining AMU, he was a postdoc at Duke University with David Dunson and Jana Schaich Borg, a postdoc at the Weizmann Institute with Ronen Basri, and a PhD student at École des Mines de Paris with Jesus Angulo. His research is focused on data science problems involving particular geometric structures.

**Title:** Hyperbolic geometry and light polarization

**Abstract:** I will start this talk with an introduction to hyperbolic spaces: how they are constructed and why they form an important class of spaces. I will continue with an introduction to the polarization of light waves. After describing the fully polarized planar waves, we will see how the Stokes vector describing the state of partially polarized light is naturally interpreted as a point of the hyperbolic space of dimension 3. We will then discuss the consequences of this observation.

**10/02/2022 – h. 2 p.m.**

### Mathieu Carrière

INRIA Sophia Antipolis-Méditerranée and University of Nice

**Bio**: Mathieu did his PhD at Inria Saclay in the DataShape team, under the supervision of Steve Oudot, followed by a two-year postdoc in the Rabadán Lab, at the Department of Systems Biology of Columbia University, under the supervision of Raúl Rabadán. His research focuses on topological data analysis (TDA) and statistical machine learning (ML), with applications to bioinformatics and genomics.

**Title:** A Framework to Differentiate Persistent Homology with Applications in Machine Learning and Statistics

**Abstract:** Solving optimization tasks based on functions and losses with a topological flavor is a very active and growing field of research in data science and Topological Data Analysis, with applications in non-convex optimization, statistics and machine learning. However, the approaches proposed in the literature are usually anchored to a specific application and/or topological construction, and do not come with theoretical guarantees. To address this issue, we study the differentiability of a general map associated with the most common topological construction, that is, the persistence map. Building on real analytic geometry arguments, we propose a general framework that allows us to define and compute gradients for persistence-based functions in a very simple way. We also provide a simple, explicit and sufficient condition for convergence of stochastic subgradient methods for such functions. This result encompasses all the constructions and applications of topological optimization in the literature. Finally, we provide associated code, that is easy to handle and to mix with other non-topological methods and constraints, as well as some experiments showcasing the versatility of our approach.

**04/11/2021 – h. 1 p.m.**

### Clément Maria

INRIA Sophia Antipolis-Méditerranée and University of Nice

**Bio**: Clément is a permanent researcher within the DataShape research group at INRIA Sophia Antipolis-Méditerranée, France. His research interests are in Computational Geometry and Topology, with a focus on persistent homology and low-dimensional topology. He is one of the main creators of the C++ software library GUDHI for Topological Data Analysis. He received his PhD in 2014 from the University of Nice and INRIA Sophia Antipolis-Méditerranée, under the supervision of Jean-Daniel Boissonnat. From 2014 to 2017 he worked as a postdoctoral fellow within the School of Mathematics and Physics of the University of Queensland, in collaboration with Ben Burton.

**Title:** Scanning a Riemannian manifold: the intrinsic topological transform

**Abstract:** We study the question of whether a space can be recognised uniquely by capturing only elementary topological properties of a collection of linear scans. More specifically, it has recently been proved that piecewise linear subsets of R^d can be uniquely characterised by their persistent homology when filtered linearly along all directions, the so-called persistent homology transform. In this talk, we study the recognition problem for Riemannian manifolds up to isometry, using persistent homology. We introduce Morse functions on the manifold playing the role of “scans” and inducing a coordinate system to embed the manifold in a “straightened fashion”. We also study the stability and approximability of this embedding, and discuss applications of persistent homology in this framework. This is joint work with Steve Oudot and Elchanan Solomon.

**17/11/2021 – h. 4 p.m.**

### Sarang Joshi

The University of Utah

**Bio:** Before joining SCI, Dr. Joshi was an Assistant Professor of Radiation Oncology and an Adjunct Assistant Professor of Computer Science at the University of North Carolina in Chapel Hill. Prior to joining Chapel Hill Dr. Joshi was Director of Technology Development at IntellX. Sarang’s research interests are in the emerging field of Computational Anatomy and have recently focused on its application to Radiation Oncology.

**Title:** Integrated Construction of Multimodal Atlases with Structural Connectomes in the Space of Riemannian Metrics

**Abstract:** The structural network of the brain, or structural connectome, can be represented by fiber bundles generated by a variety of tractography methods. While such methods give qualitative insights into brain structure, there is controversy over whether they can provide quantitative information, especially at the population level. In order to enable population-level statistical analysis of the structural connectome, we propose representing a connectome as a Riemannian metric, which is a point on an infinite-dimensional manifold. We equip this manifold with the Ebin metric, a natural metric structure for this space, to get a Riemannian manifold along with its associated geometric properties. We then use this Riemannian framework to apply object-oriented statistical analysis to define an atlas as the Fréchet mean of a population of Riemannian metrics. This formulation ties into the existing framework for diffeomorphic construction of image atlases, allowing us to construct a multimodal atlas by simultaneously integrating complementary white matter structure details from DWMRI and cortical details from T1-weighted MRI. We illustrate our framework with 2D data examples of connectome registration and atlas formation. Finally, we build an example 3D multimodal atlas using T1 images and connectomes derived from diffusion tensors estimated from a subset of subjects from the Human Connectome Project.
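The Fréchet mean mentioned above can be illustrated in a drastically simplified setting (a toy sketch on the circle S^1, far from the talk's infinite-dimensional space of Riemannian metrics): minimize the sum of squared geodesic distances by a fixed-point iteration.

```python
import numpy as np

def angular_diff(a, b):
    """Signed geodesic displacement from b to a on the circle, in (-pi, pi]."""
    return (a - b + np.pi) % (2 * np.pi) - np.pi

# 6.2 rad is just below 2*pi, i.e. close to 0 on the circle.
angles = np.array([0.1, 0.4, 6.2])
m = angles[0]                         # initialise at a data point
for _ in range(100):                  # Karcher-style fixed-point iteration
    m = (m + angular_diff(angles, m).mean()) % (2 * np.pi)

print(round(m, 3))   # ≈ 0.139, not the naive arithmetic mean 2.233
```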

**17/11/2021 – h. 4 p.m.**

### Stefan Horst Sommer

University of Copenhagen

**Bio:** Stefan is a professor at the University of Copenhagen. His main research interests are image analysis, computational modelling and geometry.

**Title:** Stochastic shape analysis and probabilistic geometric statistics

**Abstract:** Analysis and statistics of shape variation can be formulated in geometric settings with geodesics modelling transitions between shapes. The talk will concern extensions of these smooth geodesic models to account for noise and uncertainty: stochastic shape processes and stochastic shape matching algorithms. In the stochastic setting, matching algorithms take the form of bridge simulation schemes which also provide approximations of the transition density of the stochastic shape processes. The talk will cover examples of stochastic shape processes and connected bridge simulation algorithms. I will connect these ideas to geometric statistics, particularly to the diffusion mean.

**19/11/2021 – h. 10.30 a.m.**

### Nicolas Duchateau

CREATIS LAB – University of Lyon 1

**Bio:** Nicolas is currently an Associate Professor (Maître de Conférences) at the Université Lyon 1 and the CREATIS lab in Lyon, France. His research focuses on the characterization of diseases from medical imaging populations. In particular, he develops new statistical and computational approaches to represent the cardiac function and better understand disease onset and evolution.

**Title:** Cardiac function analysis with representation learning

**Abstract:** The cardiac function can be studied from many points of view, ranging from the original images, up to physiologically-relevant descriptors of its shape or deformation along time. However, clinical practice severely truncates these data by relying on arbitrary thresholds and scalar measurements. Several techniques from representation learning such as manifold learning or variational auto-encoders allow estimating a latent space that encodes more complex descriptors of the cardiac function, and is statistically relevant to compare individuals or subgroups. Challenges consist in handling high-dimensional descriptors that originate from a non-linear (unknown) space, of heterogeneous types and with potential interactions between descriptors. In this talk, I will provide an overview of some representation learning techniques relevant for analyzing the cardiac function, with specific focus on studies involving multiple acquisitions or descriptors, and the need for representations that can be interpreted and trusted by medical doctors.

**21/10/2021 – h. 2 p.m.**

### Alice Le Brigant

SAMM – University Paris 1

**Bio:** Alice has been an assistant professor in the Applied Mathematics team SAMM at University Paris 1 Panthéon-Sorbonne since September 2019. Previously, she was a postdoc at the French Civil Aviation School in Toulouse. She did her PhD at the University of Bordeaux under the supervision of Marc Arnaudon and Frédéric Barbaresco (Thales).

**Title:** Fisher information geometry of beta and Dirichlet distributions

**Abstract:** The Fisher information metric provides a Riemannian framework to compare probability distributions inside a parametric family. The most well-known example is the univariate Gaussian case, where the Fisher information geometry amounts to hyperbolic geometry. In this talk we will investigate the Fisher information geometry of Dirichlet distributions and show that it is negatively curved and geodesically complete. This guarantees the uniqueness of the notion of mean and makes it a suitable geometry in which to apply the K-means algorithm. We illustrate the use of the Fisher information geometry of beta distributions, a particular case of Dirichlet distributions, to compare and classify histograms of medical data.
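A hedged numerical illustration of the Fisher information metric, using the well-known univariate Gaussian case rather than the talk's Dirichlet computation: the Fisher information matrix of N(μ, σ²) is diag(1/σ², 2/σ²), a constant diagonal scaled by σ^(-2), which is the hyperbolic upper half-plane metric up to scaling. The sketch below estimates it directly from the definition E[∇log p ∇log pᵀ].

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 0.3, 1.7
eps = 1e-5

def loglik(x, m, s):
    return norm(m, s).logpdf(x)

def fisher_entry(i, j):
    # E[ d/dtheta_i log p * d/dtheta_j log p ], theta = (mu, sigma),
    # with central finite differences for the score.
    def score(x, k):
        d = [(eps, 0), (0, eps)][k]
        return (loglik(x, mu + d[0], sigma + d[1])
                - loglik(x, mu - d[0], sigma - d[1])) / (2 * eps)
    val, _ = quad(lambda x: score(x, i) * score(x, j) * norm(mu, sigma).pdf(x),
                  -np.inf, np.inf)
    return val

G = np.array([[fisher_entry(i, j) for j in range(2)] for i in range(2)])
print(np.round(G * sigma**2, 3))   # ≈ diag(1, 2), i.e. G = diag(1, 2) / sigma^2
```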

**30/09/2021**

### Jacopo di Iorio

Penn State University

**Bio:** Jacopo is a postdoctoral research fellow at Pennsylvania State University and EMbeDS, Sant’Anna School of Advanced Studies. He did his PhD at the MOX laboratory of the Department of Mathematics, Politecnico di Milano, under the supervision of Prof. Simone Vantini.

**Title:** The shapes of an epidemic: Functional Data Analysis characterizes the first COVID-19 epidemic wave in Italy.

**Abstract:** We investigate patterns of COVID-19 mortality across 20 Italian regions and their association with mobility, positivity, and socio-demographic, infrastructural and environmental covariates. Notwithstanding limitations in accuracy and resolution of the data available from public sources, we pinpoint significant trends exploiting information in curves and shapes with Functional Data Analysis techniques. These depict two starkly different epidemics; an “exponential” one unfolding in Lombardia and the worst hit areas of the north, and a milder, “flat(tened)” one in the rest of the country—including Veneto, where cases appeared concurrently with Lombardia but aggressive testing was implemented early on. We find that mobility and positivity can predict COVID-19 mortality, also when controlling for relevant covariates. Among the latter, primary care appears to mitigate mortality, and contacts in hospitals, schools and workplaces to aggravate it. The techniques we describe could capture additional and potentially sharper signals if applied to richer data.

**20/01/2021**

### Anna Calissano

**Bio:** Anna is a postdoc in the Gstats Epione team at INRIA. She did her PhD between the MOX team of the Politecnico di Milano and DTU in Copenhagen. Her research interests regard the development of statistical tools for the analysis of populations of graphs.

**Title:** Populations of Unlabelled Networks: Graph Space Geometry and Geodesic Principal Components

**Abstract:** Populations of graphs are a complex and strongly non-Euclidean data type describing different relational phenomena in different fields. In this talk, we describe the statistical tools for the analysis of populations of unlabelled graphs, embedding them in the Graph Space, a quotient space of permuted adjacency matrices. After the overview of the geometrical framework and the different statistical techniques available, we will detail the Graph-valued regression model, a model with Euclidean inputs and Network-valued outputs. To estimate the defined intrinsic statistics in the Graph Space, we introduce an algorithm, namely Align All and Compute. These statistical tools are applied to quantify and analyse urban movements, in order to understand how people move within a square, a city, a region.
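The Graph Space distance can be sketched by brute force for tiny graphs (an illustration of the quotient construction only, not of the Align All and Compute algorithm, which avoids this factorial search): the distance between unlabelled graphs is the minimum Frobenius distance between adjacency matrices over all node relabellings.

```python
import numpy as np
from itertools import permutations

def graph_space_distance(A, B):
    """Min over permutations P of ||A - P B P^T||_F (tiny graphs only)."""
    n = len(A)
    best = np.inf
    for perm in permutations(range(n)):
        P = np.eye(n)[list(perm)]
        best = min(best, np.linalg.norm(A - P @ B @ P.T))
    return best

# Two labellings of the same path graph on 3 nodes: distance 0.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # centre node 1
B = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # centre node 0
print(graph_space_distance(A, B))   # 0.0
```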

**19/12/2019**

### Sarang Joshi

**Bio:** Sarang is Professor of Bioengineering at the University of Utah.

**Title:** Riemannian Brownian Bridges and Metric Estimation on Landmark Manifolds.

**Abstract:** We present an inference algorithm and associated Monte Carlo estimation procedures for metric estimation from landmark configurations distributed according to the transition distribution of a Riemannian Brownian motion arising from the Large Deformation Diffeomorphic Metric Mapping (LDDMM) metric. The distribution possesses properties similar to the regular Euclidean normal distribution, but its transition density is governed by a high-dimensional non-linear PDE with no closed-form solution. We show how the density can be numerically approximated by Monte Carlo sampling of conditioned Brownian bridges, and we use this to estimate parameters of the LDDMM kernel, and thus the metric structure, by maximum likelihood.
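As a minimal illustration of bridge sampling (a Euclidean sketch; the talk's bridges live on the LDDMM landmark manifold and have no closed-form density): a Brownian bridge from a to b on [0, T] solves the guided SDE dX_t = (b − X_t)/(T − t) dt + dW_t, which we discretize with Euler-Maruyama and check against the known midpoint law (mean (a+b)/2, variance T/4).

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, T, n_steps, n_paths = 0.0, 2.0, 1.0, 200, 4000
dt = T / n_steps

x = np.full(n_paths, a)                      # all paths start at a
for k in range(n_steps):
    t = k * dt
    drift = (b - x) / (T - t)                # pulls every path toward b
    x += drift * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
    if k == n_steps // 2 - 1:                # record the paths at time T/2
        midpoints = x.copy()

print(round(midpoints.mean(), 2), round(midpoints.var(), 2))  # ≈ 1.0, 0.25
```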

**30/10/2019**

### Dimbihery Rabenoro

**Bio:** Dimby did his PhD at Université Paris 6 on large deviation theory. He visited Inria and gave a talk to present his results.

**Abstract:** In this talk, we present an overview of the results obtained during the PhD, with the theory of Large Deviation Principles (LDP) as a common thread. First, we give the basics of LDP. Then, we recall Cramér’s theorem (an LDP for partial sums) and state our contribution on some functional limit theorems based on a Cramér-type LDP. Next, we present Sanov’s theorem (an LDP for empirical measures), in order to introduce our results on some Gibbs-type conditional limit theorems. We conclude by giving some outlooks on extensions of the last results to manifold-valued data.

**7-8/02/2019**

### Dmitri Alekseevsky

In the context of a collaboration on abstract statistics in symmetric spaces, Dmitri Alekseevsky (Institute for Information Transmission Problems, Moscow) visited Inria from February 6 to 13, 2019. Dmitri is a well-known pure mathematician in differential geometry, a specialist of homogeneous and symmetric spaces, flags, pseudo-Riemannian geometry, etc.

During his visit, he kindly accepted to give a course on the differential geometry of homogeneous spaces with a focus on:

* invariant connections on homogeneous reductive manifolds, their curvature, torsion and geodesics

* affine & Riemannian symmetric spaces

* invariant structures on homogeneous manifolds and Lie groups

The course was held in room Coriolis (Galois Building) on Thursday 7 and Friday 8 February, from 10:30 to 12:00 and from 13:30 to 15:00.