Overparametrization and the Bias-Variance Dilemma
Speaker: Johannes Schmidt-Hieber (University of Twente)
Time: 10:00-11:00 July 16, 2025
Venue: Room 225, Siyuan Hall, Zhihua Building (智华楼四元厅225) & Tencent Meeting ID: 693-223-127
Abstract:
For several machine learning methods such as neural networks, good generalisation performance has been reported in the overparametrized regime. In view of the classical bias-variance trade-off, this behaviour is highly counterintuitive. We will present a general framework to establish universal lower bounds for the bias-variance trade-off. This is joint work with Alexis Derumigny.
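The classical trade-off the abstract refers to can be illustrated with a small Monte Carlo experiment (a minimal sketch, not taken from the talk; the regression function, model degrees, and sample sizes are illustrative choices): fitting polynomials of different degrees to noisy data shows how squared bias falls and variance rises as the model class grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    """Illustrative regression function (an assumption for this sketch)."""
    return np.sin(2 * np.pi * x)

def bias_variance(degree, n_train=30, n_reps=200, noise=0.3):
    """Monte Carlo estimate of squared bias and variance of a
    degree-`degree` polynomial least-squares fit at fixed test points."""
    x_test = np.linspace(0.05, 0.95, 50)
    preds = np.empty((n_reps, x_test.size))
    for r in range(n_reps):
        # Fresh training sample each replication.
        x = rng.uniform(0.0, 1.0, n_train)
        y = true_f(x) + noise * rng.standard_normal(n_train)
        coef = np.polyfit(x, y, degree)
        preds[r] = np.polyval(coef, x_test)
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias_sq, variance

# A rigid model has high bias and low variance; a flexible one the reverse.
b_low, v_low = bias_variance(degree=1)
b_high, v_high = bias_variance(degree=9)
print(f"degree 1: bias^2={b_low:.3f}, variance={v_low:.3f}")
print(f"degree 9: bias^2={b_high:.3f}, variance={v_high:.3f}")
```

The counterintuitive point of the talk is that heavily overparametrized methods such as neural networks can generalise well despite this trade-off, which the presented lower bounds make precise.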
Bio:
Johannes Schmidt-Hieber is professor of statistics at the University of Twente in the Netherlands. His research focuses on mathematical statistics, with contributions to areas such as deep learning theory, Bayesian inference, nonparametric estimation, and high-frequency data analysis. One of his key contributions is his work on the statistical theory underlying deep neural networks. He is also known for his work on Bayesian inference, especially in high-dimensional settings. For his work, he received an ERC Consolidator Grant from the European Research Council and was named an IMS Fellow.