DTC Seminar Series
Model Selection Principles for Data Analysis
School of Statistics
University of Minnesota
Monday, September 30, 2019
3:30 p.m. reception
4:00 p.m. seminar
401/402 Walter Library
This talk will provide an integrated overview of model selection principles used by data analysts across various fields. These include regularization techniques, both explicit and implicit, and various types of cross-validation. The talk will also introduce a new information criterion for model selection that attains the benefits of two well-known model selection techniques, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), which are the most representative techniques from a large-sample perspective. While the optimality of AIC and BIC is sensitive to model specification, the proposed criterion achieves universal optimality in the sense that it is automatically consistent in well-specified settings and asymptotically efficient in mis-specified settings. Some misleading folklore concerning model selection in machine learning practice will also be addressed.
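As background for the abstract, the following is a minimal sketch of how AIC and BIC are typically computed for Gaussian regression models (up to additive constants, and counting only mean parameters in k). The data and candidate models here are hypothetical illustrations, not from the talk:

```python
import math

def aic_bic(rss, n, k):
    # Gaussian log-likelihood term, up to constants: n * log(RSS / n).
    # AIC penalizes with 2k; BIC with k * log(n). Lower is better.
    ll_term = n * math.log(rss / n)
    return ll_term + 2 * k, ll_term + k * math.log(n)

# Hypothetical data: y roughly linear in x with small noise.
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0.1, 1.2, 1.9, 3.2, 3.8, 5.1, 6.0, 6.9]
n = len(x)

# Candidate model 1: intercept only (k = 1).
ybar = sum(y) / n
rss1 = sum((yi - ybar) ** 2 for yi in y)

# Candidate model 2: simple linear regression (k = 2), closed-form OLS.
xbar = sum(x) / n
beta = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        / sum((xi - xbar) ** 2 for xi in x))
alpha = ybar - beta * xbar
rss2 = sum((yi - (alpha + beta * xi)) ** 2 for xi, yi in zip(x, y))

aic1, bic1 = aic_bic(rss1, n, 1)
aic2, bic2 = aic_bic(rss2, n, 2)
print(f"intercept-only: AIC={aic1:.2f} BIC={bic1:.2f}")
print(f"linear:         AIC={aic2:.2f} BIC={bic2:.2f}")
```

On data like this, both criteria favor the linear model; the well-known tension the talk addresses is that BIC's heavier log(n) penalty yields consistency when the true model is among the candidates, while AIC's lighter penalty yields asymptotic efficiency when it is not.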
Jie Ding received his Bachelor's degree from Tsinghua University in June 2012 and his Ph.D. from Harvard University in March 2017. His research interests are in statistical methodology, signal processing, and machine learning. His recent research focuses on reproducible model selection, robust prediction from online streaming data, and large-scale collaborative learning.
DTC Director Georgios B. Giannakis
Georgios B. Giannakis, Endowed Professor of ECE and Director of the DTC, won the 2019 Norbert Wiener Society Award of the IEEE Signal Processing Society.
Digital Technology Center
499 Walter Library, 117 Pleasant Street SE, Minneapolis, MN 55455
P: (612) 624-9510