Summer Graduate School
| Parent Program: | -- |
|---|---|
| Location: | INdAM |
Lecturers
- Francesca Mignacco (Princeton University)
- Andrea Montanari (Stanford University)
Teaching Assistants
- Luca Pesce (École Polytechnique Fédérale de Lausanne (EPFL))
Machine learning and artificial intelligence are now ubiquitous in every dimension of contemporary life, from high-tech applications to precision medicine, from scientific research to entertainment. Despite this undeniable fact, the theoretical foundations of many popular algorithms and procedures are not yet fully understood, and this poses several questions to the mathematical community. It is to be expected that research motivated by machine learning and AI will play a key role in many areas of mathematics in the coming years; for this reason, it seems especially important that as many PhD students and young researchers as possible are made aware of the most recent developments and open issues in the mathematics of machine learning.
In view of these considerations, the aim of this summer school is to provide an introduction to the theoretical ideas that have been developed with the objective of understanding machine learning methods and their domain of applicability. The focus will be on proof techniques and general mathematical tools. The lecturers are two world-leading experts in the area, and the material is regularly taught in the mathematics and statistics departments of top universities worldwide.
School Structure
There will be two lectures and two problem sessions each day.
Prerequisites
The school is addressed to a wide audience; nevertheless, some previous knowledge of basic mathematical tools is required: linear algebra, matrix manipulation, multivariable calculus, and basic optimization techniques. Moreover, previous knowledge of basic notions of mathematical statistics and probability is essential to benefit fully from the content of the course. Specific knowledge of basic machine learning theory is not required but could be useful.
Such background can be acquired, for instance, from classical books on statistical learning theory, e.g., the first three chapters of James, Witten, Hastie, and Tibshirani, An Introduction to Statistical Learning (2023). Further material on machine learning, for instance the basic notions of excess risk, empirical risk minimization, and Bayes estimators, together with regression problems, can be found in the first three chapters of Francis Bach's book "Learning Theory from First Principles", to be published soon by MIT Press and currently available online at https://www.di.ens.fr/~fbach/ltfp_book.pdf
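As a quick self-check on this background, those notions can be summarized as follows (standard definitions, stated here in generic notation rather than that of any particular reference). For a loss $\ell$, a data distribution $P$, and a predictor $f$:

$$
R(f) = \mathbb{E}_{(x,y)\sim P}\bigl[\ell(y, f(x))\bigr], \qquad
f^\star \in \operatorname*{arg\,min}_{f} R(f), \qquad
\text{excess risk of } f:\; R(f) - R(f^\star),
$$

$$
\hat f \in \operatorname*{arg\,min}_{f \in \mathcal{F}}\; \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(y_i, f(x_i)\bigr) \qquad \text{(empirical risk minimization over a model class } \mathcal{F}\text{)}.
$$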
It is also recommended to read Chapter 24, "Neural Networks: From the Perceptron to Deep Nets" (pp. 477-497), of the book Spin Glass Theory and Far Beyond (2023), also available on arXiv at https://arxiv.org/pdf/2304.06636
Application Procedure
SLMath is able to support only a limited number of students to attend this school. Therefore, it is likely that only one student per institution will be funded by SLMath.
For eligibility and how to apply, see the Summer Graduate Schools homepage.
Venue
The summer school will take place at the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy.
Topics
- Statistical learning
- Empirical risk minimization and empirical process theory
- Interpolation
- Kernel methods
- Random features and neural tangent models (a minimal sketch follows this list)
- Random matrix theory
- Feature learning
- Sampling and generative methods
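The following is not part of the school's material, just a minimal NumPy sketch of the kind of model the "random features" topic refers to: random Fourier features (approximating a Gaussian kernel) with a ridge-trained linear readout. All names and constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-d regression data: y = sin(2*pi*x) + noise
n, d, p = 200, 1, 500          # samples, input dimension, number of random features
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features: phi(x) = sqrt(2/p) * cos(gamma * x @ W + b),
# with W and b sampled once and then frozen (only the linear readout is trained).
gamma = 3.0                    # bandwidth-like scale, chosen by hand here
W = rng.standard_normal((d, p))
b = rng.uniform(0.0, 2.0 * np.pi, size=p)

def features(X):
    return np.sqrt(2.0 / p) * np.cos(gamma * (X @ W) + b)

# Ridge regression on the frozen features: solve (Phi^T Phi + lam I) theta = Phi^T y
lam = 1e-6
Phi = features(X)
theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(p), Phi.T @ y)

print("train MSE:", np.mean((Phi @ theta - y) ** 2))
```

In the proportional regime where the sample size and the number of features grow together, estimators of exactly this form are one of the places where random matrix theory enters the analysis.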
Mathematics Subject Classification
- 62B10 - Statistical aspects of information-theoretic topics [See also 94A17]
- 62M45 - Neural nets and related approaches to inference from stochastic processes