Optimal Sparse Regression Learning and Model Compression
Description
Minimax-rate optimality plays a foundational role in the theory of statistical and machine learning. In the context of regression, some key questions are: i) What determines the minimax rate of convergence for regression estimation? ii) Is it possible to construct estimators that are simultaneously minimax optimal over a countable list of function classes? iii) In high-dimensional linear regression, how do different kinds of sparsity affect the rate of convergence? iv) How do we know if a pre-trained deep neural network model is compressible? If so, by how much?
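As a point of reference for question iii), the two sparsity regimes discussed in the talk are commonly formalized as follows (standard notation, not taken from the talk itself): for a coefficient vector $\beta \in \mathbb{R}^p$,

```latex
% Hard sparsity: at most k of the p coefficients are nonzero
\|\beta\|_0 \;=\; \#\{\, j : \beta_j \neq 0 \,\} \;\le\; k,
\qquad
% Soft sparsity: the coefficients lie in an \ell_q ball, 0 < q \le 1
\sum_{j=1}^{p} |\beta_j|^q \;\le\; R.
```

Under hard sparsity, the minimax rate for prediction under squared error is known to scale, up to constants and design conditions, as $k \log(p/k)/n$; under soft sparsity the rate instead depends on $q$ and the radius $R$.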
In this talk, we will address the above questions. After reviewing the determination of the minimax rate of convergence, we will present minimax-optimal adaptive estimation for high-dimensional regression learning under both hard and soft sparsity setups, taking advantage of sharp sparse linear approximation bounds. An application to model compression in neural network learning will be given.
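To make question iv) concrete, below is a minimal magnitude-pruning sketch, a generic illustration of compressibility rather than the speaker's method; all names and data here are hypothetical. It zeroes out the smallest-magnitude weights of a layer and reports how much of the total weight mass survives.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Zero out all but the largest-magnitude entries of a weight array."""
    flat = np.abs(weights).ravel()
    k = max(1, int(keep_fraction * flat.size))
    # k-th largest absolute value: weights below it are pruned to zero
    threshold = np.partition(flat, -k)[-k]
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
# A hypothetical, approximately sparse layer: many tiny weights plus a few large ones
w = rng.laplace(scale=0.05, size=(256, 256))
w.ravel()[rng.choice(w.size, 500, replace=False)] += rng.normal(scale=1.0, size=500)

keep = 0.05
pruned = magnitude_prune(w, keep)
retained_mass = np.abs(pruned).sum() / np.abs(w).sum()
print(f"kept {keep:.0%} of weights, retained {retained_mass:.1%} of |weight| mass")
```

Intuitively, when a small fraction of weights carries most of the mass (an approximate soft-sparsity condition), the layer is highly compressible; the sharp sparse approximation bounds in the talk make this kind of intuition precise.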
About Yuhong Yang
Yuhong Yang is a Full Professor in the Department of Statistics at the University of Minnesota and a Fellow of the Institute of Mathematical Statistics. His research spans many areas of statistics, including nonparametric function estimation, high-dimensional data analysis, model selection and combination in both theory and application, multi-armed bandit problems with covariates, and forecasting. He obtained his Ph.D. in Statistics from Yale University.