Matthieu Wyart

EPFL (École Polytechnique Fédérale de Lausanne)

Matthieu Wyart is a French physicist. He is a professor of physics at EPFL (École Polytechnique Fédérale de Lausanne) and the head of the Physics of Complex Systems Laboratory. Wyart's research encompasses fields such as the architecture of allosteric materials, the theory of deep learning, elasticity and mechanical stability in disordered solids, granular and suspension flows, the glass and rigidity transitions, marginal stability at random close packing and in other glasses, and the yielding transition. More recently his work has focused on machine learning, in particular data structure and generative models. M. Wyart is a recipient of the Simons Investigator Award, the Sloan Fellowship, the G. Carrier Fellowship, and the Dresden Physics Prize, and is a fellow of the American Physical Society.

Which data are learnable?

Learning generic tasks in high dimension is impossible, as it would require an unattainable amount of training data. Yet algorithms and humans can play the game of Go, decide what is in an image, or learn languages. The only resolution of this paradox is that learnable data are highly structured. I will review ideas in the field about what this structure may be. I will then discuss two recent results from our group. If the data are hierarchical, supervised learning can occur with a training set whose size is polynomial, rather than exponential, in the data dimension. I will discuss how novel data, such as the pictures of fake celebrities below, can be generated by composing known features into a new whole. This theory of composition predicts a phase transition in generative models. I will discuss empirical evidence for its validity.
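The hierarchical structure mentioned above can be illustrated with a toy sketch (my own illustration, not the speaker's model): a label is expanded through several levels of production rules into a string of low-level features, so the data dimension grows exponentially with depth while the number of rules stays small.

```python
import random

def make_rules(vocab, s, n_rules, seed=0):
    """For each symbol, draw n_rules ways to expand it into s sub-symbols."""
    rng = random.Random(seed)
    return {v: [tuple(rng.choice(vocab) for _ in range(s))
                for _ in range(n_rules)]
            for v in vocab}

def generate(symbol, rules, depth, rng):
    """Recursively expand a symbol into a sequence of leaf features."""
    if depth == 0:
        return [symbol]
    out = []
    for sub in rng.choice(rules[symbol]):
        out.extend(generate(sub, rules, depth - 1, rng))
    return out

vocab = list(range(4))                      # alphabet of symbols
rules = make_rules(vocab, s=2, n_rules=3)   # hypothetical parameters
rng = random.Random(1)
x = generate(0, rules, depth=3, rng=rng)    # leaf sequence of length s**depth = 8
print(len(x))  # 8
```

In such a model the input dimension is s**depth, yet a learner only needs to identify the underlying rules, whose number is polynomial in the alphabet size and depth; this is the intuition behind the polynomial sample complexity claimed for hierarchical data.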