Pierre Baudot

Median Technologies

Pierre Baudot graduated in 1998 from the École Normale Supérieure (Ulm) magistère of biology, and obtained his PhD in the electrophysiology of visual perception, studying learning and information coding in natural conditions. He began developing information topology with Daniel Bennequin at the Complex Systems Institute and the Mathematical Institute of Jussieu from 2006 to 2013, and then at the Max Planck Institute for Mathematics in the Sciences in Leipzig. He then joined Inserm in Marseille to develop data applications, notably to transcriptomics. Since 2018 he has worked at Median Technologies, a medical imaging AI company, on detecting and predicting cancers from CT scans. He received the K2 trophy (mathematics and applications, 2017) and the 2019 Entropy best paper prize for his contributions to topological information data analysis.

Statistical structures learning using Information topology: homological brains

Information theory, probability and statistical dependencies, and algebraic topology provide different views of a unified theory still under development, in which uncertainty goes as deep as Galois's theory of ambiguity, topos theory and motives. I will review some of the foundations, laid notably by Bennequin and Vigneaux, that characterize entropy uniquely as the first cohomology group on complexes of random variables and probability laws. This framework allows one to recover most of the usual information functions, such as the KL divergence, cross entropy, Tsallis entropies and differential entropy, in various settings of generality. Multivariate interaction/mutual informations (I_k and J_k) appear as coboundaries, and their negative minima, also called synergy, correspond to homotopical link configurations which, like Borromean links, illustrate what purely collective interactions or emergence can be. These functions refine and characterize statistical independence in the multivariate case, in the sense that (X_1, ..., X_n) are independent iff all the I_k = 0 (with 1 < k ≤ n).
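As a concrete illustration of the last statement, here is a minimal sketch (in Python; the helper names such as interaction_information are illustrative, not code from the talk) that computes the multivariate mutual information I_k via its standard alternating sum of subset entropies and evaluates it on the classic XOR triplet: two independent uniform bits and their XOR. All pairwise informations vanish while I_3 = -1 bit, the negative, Borromean-like synergy mentioned above.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a (flattened) probability array; zero cells are skipped."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def marginal(joint, keep):
    """Marginal of the joint distribution over the variable indices in `keep`."""
    drop = tuple(i for i in range(joint.ndim) if i not in keep)
    return joint.sum(axis=drop)

def interaction_information(joint, variables):
    """I_k over `variables`, as the alternating (inclusion-exclusion) sum
    I_k = sum over non-empty subsets T of (-1)^(|T|+1) H(X_T)."""
    total = 0.0
    for r in range(1, len(variables) + 1):
        for subset in itertools.combinations(variables, r):
            total += (-1) ** (r + 1) * entropy(marginal(joint, subset))
    return total

# XOR triplet: X, Y independent uniform bits, Z = X xor Y.
# Pairwise informations vanish, yet I_3 = -1 bit: a purely collective (synergistic) dependence.
joint = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        joint[x, y, x ^ y] = 0.25

print(interaction_information(joint, (0, 1)))     # I_2(X;Y)   = 0.0
print(interaction_information(joint, (1, 2)))     # I_2(Y;Z)   = 0.0
print(interaction_information(joint, (0, 1, 2)))  # I_3(X;Y;Z) = -1.0
```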