Time: Tuesday, March 24, 2026, 14:00-15:00
Location: E14-212
Speaker: Elvezio Ronchetti, Research Center for Statistics and GSEM, University of Geneva
Speaker bio: Elvezio Ronchetti received his PhD in mathematics in 1982 from the Swiss Federal Institute of Technology (ETH) Zürich, Switzerland. Since then he has held academic positions at Princeton University (1983-1986), the University of Geneva (1986-present), and the University of Lugano (20% appointment, 1996-2012), as well as visiting positions at over twenty universities and institutes worldwide, including Dalhousie University (Canada), Stanford, MIT, the University of Sydney, the Australian National University (Canberra), the Fields Institute, and the Institute for Mathematics and its Applications. Since 2021 he has been Emeritus Professor of Statistics at the Research Center for Statistics and a member of the Geneva School of Economics and Management, University of Geneva (Switzerland).
His research interests include robust statistics, higher-order approximations, resampling methods, and latent variable models, among others. He is well known worldwide in the field of robust statistics. He has co-authored 6 books, co-edited 3 books, and published about a hundred refereed papers in top journals in statistics and related fields, including econometrics. In particular, he is a coauthor of Robust Statistics, one of the most popular and classical books in the field.
Title: Robust Bayesian Learning
Abstract: In the past several decades there has been an important development of the theory and applications of robust statistics. This has taken place mainly within the frequentist framework, while fewer results have concerned the Bayesian approach. Since robust statistics deals with deviations from ideal models and develops statistical procedures that remain reliable and reasonably efficient in a neighborhood of the model, the stability of inference under small deviations from the assumptions should clearly concern both approaches. This is even more important nowadays, when the analysis and modelling of complex data are required in many fields, in particular for the development of AI technology. Fortunately, in the past decade, with the development of powerful algorithms, the robustness issue has gained importance within the Bayesian framework.
In this talk we discuss some of these recent developments by focusing on two main aspects. First, we outline the transfer of some fundamental ideas and tools from the classical theory of robust statistics (including M-estimation and testing and Huber's minimax theory) to the Bayesian setup. One implication of this transfer is the recommendation to replace exact likelihoods with Huber's least favorable distributions when sampling from posterior distributions. Second, we discuss the difficulty of obtaining exact finite-sample results in Bayesian robustness, while outlining a possible direction that aims to combine asymptotic guarantees with exact finite-sample bounds. Finally, we briefly illustrate how the Bayesian filter can be robustified.
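The idea of replacing the exact likelihood with Huber's least favorable distribution can be illustrated with a minimal sketch (not from the talk): for a location parameter, the least favorable density is proportional to exp(-ρ(x - θ)) with ρ the Huber loss, so the log-likelihood contribution of any single observation grows only linearly in the tails. The sampler, prior, and tuning constant k = 1.345 below are illustrative choices, not the speaker's.

```python
import math
import random

def huber_rho(r, k=1.345):
    """Huber loss: quadratic near zero, linear in the tails,
    so a gross outlier has bounded influence on the posterior."""
    a = abs(r)
    return 0.5 * r * r if a <= k else k * a - 0.5 * k * k

def log_post(theta, data, k=1.345, prior_sd=10.0):
    # Log-posterior: Huber pseudo-likelihood (least favorable
    # density, up to a constant) plus a weak N(0, prior_sd^2) prior.
    ll = -sum(huber_rho(x - theta, k) for x in data)
    lp = -0.5 * (theta / prior_sd) ** 2
    return ll + lp

def metropolis(data, n_iter=20000, step=0.5, seed=0):
    # Plain random-walk Metropolis sampler for the scalar location.
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop
        samples.append(theta)
    return samples

data = [0.1, -0.3, 0.2, 0.0, -0.1, 8.0]  # bulk near 0 plus one gross outlier
samples = metropolis(data)
post_mean = sum(samples[5000:]) / len(samples[5000:])
```

With a Gaussian likelihood the outlier drags the posterior mean toward the sample mean (about 1.3 here); with the Huber pseudo-likelihood its pull is capped by k, so the posterior mean stays close to the bulk of the data.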