Abstract
Ridge regression was introduced to deal with the instability of the ordinary least squares estimate under multicollinearity. It penalizes the least squares loss by applying a ridge penalty to the regression coefficients. The ridge penalty shrinks the coefficient estimates toward zero, but not exactly to zero. For this reason, ridge regression has long been criticized for not being able to perform variable selection. In this article, we propose a new variable selection method based on an individually penalized ridge regression, a slightly generalized version of ridge regression. An adaptive version is also provided. Our new methods are shown to perform competitively in simulations and on a real data example.
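To make the idea concrete, here is a minimal sketch of a ridge estimate with an individual penalty per coefficient, which reduces to ordinary ridge regression when all penalties are equal. This is only an illustration of the general form (the data, penalty weights, and function name are hypothetical), not the authors' actual selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: only the first two of five predictors matter.
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

def individually_penalized_ridge(X, y, lams):
    """Closed-form estimate minimizing ||y - X b||^2 + sum_j lams[j] * b[j]^2.

    With a common lams[j] = lam for all j this is ordinary ridge regression.
    """
    return np.linalg.solve(X.T @ X + np.diag(lams), X.T @ y)

# Ordinary ridge: one common penalty shrinks every coefficient a little.
b_ridge = individually_penalized_ridge(X, y, np.full(p, 10.0))

# Individual penalties (hypothetical weights for illustration): lightly
# penalize the first two coefficients, heavily penalize the rest, which
# drives the heavily penalized estimates essentially to zero.
b_indiv = individually_penalized_ridge(X, y, np.array([0.1, 0.1, 1e6, 1e6, 1e6]))

print(np.round(b_ridge, 3))
print(np.round(b_indiv, 3))
```

Letting the penalties differ across coefficients is what opens the door to variable selection: a coefficient with a very large individual penalty is shrunk essentially to zero, while lightly penalized coefficients stay close to their least squares values.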
About the Speaker
Yichao Wu is TransUnion Professor at the University of Illinois at Chicago. His research interests include statistical machine learning, dimension reduction, variable selection and screening, and functional data analysis. He is an elected fellow of the ASA and the IMS.
Poster