Overparametrization and the Bias-Variance Dilemma

Speaker(s): Johannes Schmidt-Hieber

Time: 2025-07-16 (Wednesday) 10:00-11:00

Venue: Siyuan Hall (Room 225), Zhihua Building, Peking University; Tencent Meeting: 693-223-127

Abstract:

For several machine learning methods such as neural networks, good generalisation performance has been reported in the overparametrized regime. In view of the classical bias-variance trade-off, this behaviour is highly counterintuitive. We will present a general framework to establish universal lower bounds for the bias-variance trade-off. This is joint work with Alexis Derumigny.
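As a rough illustration of the classical trade-off the abstract refers to, the following minimal NumPy sketch estimates squared bias and variance of polynomial least-squares fits of increasing degree by Monte Carlo. All concrete choices here (the true function, noise level, sample size, and polynomial model family) are ours for illustration and are not taken from the talk; the point is only that, in the classical picture, variance grows sharply as the model approaches the interpolation threshold, which is the intuition the overparametrized regime appears to contradict.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true regression function and noise level (illustrative choices).
def f(x):
    return np.sin(2 * np.pi * x)

n_train, n_test, sigma, n_reps = 20, 200, 0.3, 300
x_train = np.linspace(0.0, 1.0, n_train)
x_test = np.linspace(0.0, 1.0, n_test)

def features(x, degree):
    # Polynomial feature map x -> (1, x, x^2, ..., x^degree).
    return np.vander(x, degree + 1, increasing=True)

for degree in [1, 3, 5, 10, 15, 19]:
    Phi_train = features(x_train, degree)
    Phi_test = features(x_test, degree)
    preds = np.empty((n_reps, n_test))
    for r in range(n_reps):
        # Fresh noisy training sample for each repetition.
        y = f(x_train) + sigma * rng.standard_normal(n_train)
        # Least-squares fit (minimum-norm solution via SVD); at degree = n_train - 1
        # the fit interpolates the noisy data.
        coef, *_ = np.linalg.lstsq(Phi_train, y, rcond=None)
        preds[r] = Phi_test @ coef
    # Pointwise bias^2 and variance, averaged over the test grid.
    bias2 = np.mean((preds.mean(axis=0) - f(x_test)) ** 2)
    var = np.mean(preds.var(axis=0))
    print(f"degree={degree:3d}  bias^2={bias2:10.4f}  variance={var:10.4f}")
```

Running this prints squared bias decreasing and variance increasing with the degree, the classical trade-off; the talk's subject is how to formalize universal lower bounds for this trade-off and how that squares with the good generalisation reported for heavily overparametrized models.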


Speaker

Johannes Schmidt-Hieber, University of Twente

Johannes Schmidt-Hieber is professor of statistics at the University of Twente in the Netherlands. His research focuses on mathematical statistics, with contributions to areas such as deep learning theory, Bayesian inference, nonparametric estimation, and high-frequency data analysis. One of his key contributions is his work on the statistical theory underlying deep neural networks. He is also known for his work in Bayesian inference, especially in high-dimensional settings. For his work, he received an ERC Consolidator Grant from the European Research Council and was named an IMS Fellow.