Scientific Calendar Event



Description
An ICTP Virtual Meeting
 
The aim of the conference is to give young researchers the opportunity to present their results on high-dimensional statistical problems arising in machine learning, inference, and statistical physics.
 
Typical problems in high dimensions include the detection and estimation of noisy signals (compressed sensing, PCA, tensor decomposition, etc.), the analysis of (deep) neural networks and their learning and generalization dynamics, the detection of communities in large networks, and the study of disordered spin systems. While these problems may seem different in nature, they share many similarities in their phenomenology and in the tools used to analyze them.

The core of the field is a very active, diverse, and rapidly expanding community of physicists, computer scientists, mathematicians, information theorists, and engineers who share the desire to tackle increasingly challenging problems at the forefront of data science.

This conference aims to reinforce the links within this interdisciplinary community, in particular among its youngest theory-oriented members, and to bring forward the latest developments in the high-dimensional world.

Topics:
  • Machine learning
  • High-dimensional statistics
  • Applied maths
  • Computer science

Participants interested in presenting a poster are kindly requested to submit a title and a short abstract in the application form.

Speakers:
L. BENIGNI, University of Chicago, USA
M. CELENTANO, Stanford University, USA
L. CHIZAT, University of Paris-Saclay, France
SY. CHUNG, Massachusetts Institute of Technology, USA
E. DOBRIBAN, University of Pennsylvania, USA
Z. FAN, Yale University, USA
C. GERBELOT, ENS Paris, France
A. GLIELMO, SISSA, Italy
A. INGROSSO, ICTP, Italy
A. JACOT, EPFL, Switzerland
F. MASTROGIUSEPPE, University College London, UK
F. MIGNACCO, IPHT Saclay, France
T. MISIAKIEWICZ, Stanford University, USA
P. NAKKIRAN, Harvard University, USA
M. REFINETTI, ENS Paris, France
C. RUSH, Columbia University, USA
T. SCHRAMM, Stanford University, USA
S. VILLAR, Johns Hopkins University, USA
I. ZADIK, New York University, USA
J. ZARKA, ENS Paris, France
 

Registration: There is no registration fee.
 

  • Tuesday, 15 June 2021
    • 16:25 - 16:30 Introductory words
      Convener: Sebastian GOLDT (SISSA, Italy)
    • 16:30 - 18:00 #1 Statistical Physics and Theory of Deep Learning
      Chair: Marylou GABRIÉ (NYU/Flatiron, USA)
      • 16:30 Classifying High-dimensional Gaussian Mixtures: Where Kernel Methods Fail and Neural Networks Succeed 20'
        Speaker: Maria REFINETTI (ENS Paris, France)
        Material: Video
      • 16:50 The Deep Bootstrap Framework: A New Lens to Understand Generalization 20'
        Speaker: Preetum NAKKIRAN (Harvard University, USA)
        Material: Abstract Video
      • 17:10 Understanding the Dynamics of Multi-pass SGD in High Dimensional Non-convex Problems 20'
        Speaker: Francesca MIGNACCO (IPHT Saclay, France)
        Material: Abstract Video
      • 17:30 Questions & Answers 15'
        Speaker: Maria REFINETTI (ENS Paris, France), Preetum NAKKIRAN (Harvard University, USA), Francesca MIGNACCO (IPHT Saclay, France)
      • 17:45 Break 15'
    • 18:00 - 19:15 #2 Feature Learning and Lazy Regimes in Neural Networks
      Chair: Marco MONDELLI (IST, Austria)
      • 18:00 Minimum Complexity Interpolation in Random Features Models 20'
        Speaker: Theodor MISIAKIEWICZ (Stanford University, USA)
        Material: Abstract Video
      • 18:20 Implicit Regularization of Gradient Descent for Wide Two-layer ReLU Neural Networks 20'
        Speaker: Lénaïc CHIZAT (University of Paris-Saclay, France)
        Material: Video
      • 18:40 Saddle-to-Saddle Regime in Deep Linear Networks 20'
        Speaker: Arthur JACOT (EPFL, Switzerland)
        Material: Video
      • 19:00 Questions & Answers 15'
        Speaker: Theodor MISIAKIEWICZ (Stanford University, USA), Lénaïc CHIZAT (University of Paris-Saclay, France), Arthur JACOT (EPFL, Switzerland)
  • Wednesday, 16 June 2021
    • 16:30 - 19:15 #3 Neural Networks and Data Analysis
      Chair: Marylou GABRIÉ (NYU/Flatiron, USA)
      • 16:30 Spectrum of Random Feed-Forward Neural Networks 20'
        Speaker: Lucas BENIGNI (University of Chicago, USA)
        Material: Video
      • 16:50 Separation and Concentration in Deep Networks 20'
        Speaker: John ZARKA (ENS Paris, France)
        Material: Video
      • 17:10 Ranking the Information Content of Distance Measures 20'
        Speaker: Aldo GLIELMO (SISSA, Italy)
        Material: Video
      • 17:30 Questions & Answers 15'
        Speaker: Lucas BENIGNI (University of Chicago, USA), John ZARKA (ENS Paris, France), Aldo GLIELMO (SISSA, Italy)
      • 17:45 Break 15'
      • 18:00 Discussions & Posters 1h15'
        Gather Town
  • Thursday, 17 June 2021
    • 16:30 - 18:00 #4 Theoretical Neuroscience
      Chair: Sebastian GOLDT (SISSA, Italy)
      • 16:30 Evolution of Neural Activity in Circuits Bridging Sensory and Abstract Knowledge 20'
        Speaker: Francesca MASTROGIUSEPPE (University College London, UK)
      • 16:50 Optimal Learning with Excitatory and Inhibitory Synapses 20'
        Speaker: Alessandro INGROSSO (ICTP, Italy)
        Material: Video
      • 17:10 Perceptual Manifolds in Biological and Artificial Neural Networks 20'
        Speaker: SueYeon CHUNG (Columbia University, USA)
      • 17:30 Questions & Answers 15'
        Speaker: Francesca MASTROGIUSEPPE (University College London, UK), Alessandro INGROSSO (ICTP, Italy), SueYeon CHUNG (Columbia University, USA)
      • 17:45 Break 15'
    • 18:00 - 19:35 #5 Statistics & Theoretical Computer Science
      Chair: Marco MONDELLI (IST, Austria)
      • 18:00 The All-or-Nothing Phenomenon: the Case of Sparse Tensor PCA 20'
        Speaker: Ilias ZADIK (New York University, USA)
        Material: Abstract Video
      • 18:20 Scalars are Universal: Gauge-equivariant Machine Learning, Structured Like Classical Physics 20'
        Speaker: Soledad VILLAR (Johns Hopkins University, USA)
        Material: Video
      • 18:40 What Causes the Test Error? Going Beyond Bias-Variance via ANOVA 20'
        Speaker: Edgar DOBRIBAN (University of Pennsylvania, USA)
        Material: Abstract Video
      • 19:00 Statistical Query Algorithms and Low-Degree Tests Are Almost Equivalent 20'
        Speaker: Tselil SCHRAMM (Stanford University, USA)
      • 19:20 Questions & Answers 15'
        Speaker: Ilias ZADIK (New York University, USA), Soledad VILLAR (Johns Hopkins University, USA), Edgar DOBRIBAN (University of Pennsylvania, USA), Tselil SCHRAMM (Stanford University, USA)
  • Friday, 18 June 2021
    • 16:30 - 19:30 #6 Approximate Message Passing & Computational Issues
      Chair: Jean BARBIER (ICTP, Italy)
      • 16:30 Beyond i.i.d. Gaussian Models: Exact Asymptotics with Realistic Data 20'
        Speaker: Cedric GERBELOT (ENS Paris, France)
        Material: Abstract Slides Video
      • 16:50 Using AMP to Characterize the Optimal Type 1 Error and Power Trade-off for Sorted L1 Penalized Estimation (SLOPE) 20'
        Speaker: Cynthia RUSH (Columbia University, USA)
        Material: Abstract Video
      • 17:10 Approximate Message Passing Algorithms and Orthogonal Spin Glasses 20'
        Speaker: Zhou FAN (Yale University, USA)
        Material: Slides Video
      • 17:30 Debiasing the Lasso with Inaccurate Precision Matrix 20'
        Speaker: Michael CELENTANO (Stanford University, USA)
        Material: Abstract Video
      • 17:50 Questions & Answers 15'
        Speaker: Cedric GERBELOT (ENS Paris, France), Cynthia RUSH (Columbia University, USA), Zhou FAN (Yale University, USA), Michael CELENTANO (Stanford University, USA)
      • 18:05 Closing remarks 5'
        Speaker: Jean BARBIER (ICTP, Italy)
      • 18:10 Discussions & Posters 1h20'
        Gather Town