Generalization and robustness of over-parameterized neural networks - a function space perspective

Date: 2023-09-11 | Published By: CMLR

Speaker(s): Fanghui Liu (EPFL, Switzerland)

Time: 10:30-11:30 September 11, 2023

Venue: Room 1513, Sciences Building No. 1 (理科一号楼1513室)

Abstract:


The conventional wisdom of preferring simple models in signal processing and machine learning misses the bigger picture, especially for over-parameterized neural networks (NNs), where the number of parameters is much larger than the number of training data points. Our goal is to explore the mystery behind over-parameterized NNs from a theoretical perspective.
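To make the over-parameterized regime concrete, here is a minimal sketch (not from the talk; all dimensions and widths are illustrative assumptions): counting the parameters of a one-hidden-layer ReLU network and comparing against the training set size.

```python
# Minimal sketch (illustrative): a one-hidden-layer network is
# over-parameterized when its parameter count p far exceeds the
# number of training samples n.

def mlp_param_count(d_in: int, width: int, d_out: int = 1) -> int:
    """Parameters of a one-hidden-layer network: weights plus biases."""
    return (d_in * width + width) + (width * d_out + d_out)

n_train = 1_000   # number of training samples (assumed for illustration)
d_in = 100        # input dimension (assumed)
width = 10_000    # hidden width, grown large

p = mlp_param_count(d_in, width)
print(f"parameters: {p:,}  vs  training samples: {n_train:,}")
print(f"over-parameterized: {p > n_train}")  # True: p >> n
```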


In this talk, I will discuss the role of over-parameterization in neural networks and theoretically explain why they can perform well, in terms of benign overfitting and double descent. Furthermore, I will talk about the robustness of neural networks, as affected by architecture and initialization, from a function space theory view, aiming to answer a fundamental question: does over-parameterization in NNs help or hurt robustness?
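The double descent phenomenon mentioned above can be reproduced in a few lines. Below is a minimal sketch (not the speaker's method; the random-ReLU-feature model, sample sizes, and noise level are assumptions for illustration): minimum-norm least squares on random features, where test error typically spikes as the feature count p crosses the sample size n and then falls again in the over-parameterized regime.

```python
# Minimal sketch (illustrative): double descent with random ReLU
# features and minimum-norm least squares.
import numpy as np

rng = np.random.default_rng(0)
n, d, n_test = 100, 10, 1000

# Ground-truth linear target with small label noise (assumed setup).
w_star = rng.standard_normal(d)
X, X_te = rng.standard_normal((n, d)), rng.standard_normal((n_test, d))
y = X @ w_star + 0.1 * rng.standard_normal(n)
y_te = X_te @ w_star

for p in [20, 50, 90, 100, 110, 200, 1000, 5000]:
    W = rng.standard_normal((d, p)) / np.sqrt(d)        # frozen random first layer
    F, F_te = np.maximum(X @ W, 0), np.maximum(X_te @ W, 0)  # ReLU features
    # lstsq returns the minimum-norm solution when p >= n (interpolation).
    beta, *_ = np.linalg.lstsq(F, y, rcond=None)
    err = np.mean((F_te @ beta - y_te) ** 2)
    print(f"p = {p:5d}  test MSE = {err:.3f}")
```

In runs of this kind, the test error tends to peak near the interpolation threshold p ≈ n and decrease again for much larger p, which is the double descent curve in its simplest form.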


Brief bio:


Fanghui Liu is currently a postdoctoral researcher at EPFL, Switzerland, and will be an assistant professor at the University of Warwick, UK, next month. His research interests focus on statistical learning theory, with the aim of building the mathematical foundations of machine learning. For his work on learning theory, he was selected as a Rising Star in AI (KAUST 2023), and he presented two tutorials, at ICASSP 2023 and CVPR 2023. Prior to his current position, Fanghui received his PhD from Shanghai Jiao Tong University and worked as a postdoctoral researcher at KU Leuven, Belgium.