Penalized Regressions with Different Tuning Parameter Choosing Criteria and the Application in Economics
Sheng Gao, Mingwei Sun
Pages - 7 - 17     |    Revised - 31-07-2020     |    Published - 31-08-2020
Volume - 8   Issue - 1    |    Publication Date - August 2020
KEYWORDS
Penalized Regression, Lasso, Ridge, Elastic Net, AIC, BIC, AICc, Economic Modeling.
ABSTRACT
Recently, a great deal of attention has been paid to modern regression methods such as penalized regressions, which perform variable selection and coefficient estimation simultaneously and thereby provide new approaches to analyzing complex, high-dimensional data. The choice of the tuning parameter is vital in penalized regression. In this paper, we studied the effect of different tuning-parameter selection criteria on the performance of several well-known penalization methods, including ridge, lasso, and elastic net regression. Specifically, we investigated information criteria widely used in regression modeling, namely the Bayesian information criterion (BIC), Akaike's information criterion (AIC), and the corrected AIC (AICc), across various simulation scenarios and a real data example in economic modeling. We found that the predictive performance of models selected by different information criteria depends heavily on the properties of the data set; no single tuning-parameter selection criterion or penalty function is best in all cases. These results provide practitioners with guidance on choosing tuning-parameter selection criteria in penalized regression and extend the still-nascent range of applications of penalized regression in economics.
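To make the tuning-parameter criteria concrete, the minimal sketch below (not taken from the paper) scores a grid of lasso tuning parameters with AIC, BIC, and AICc under a Gaussian-error approximation. The simulated data, the lambda grid, and the use of the nonzero-coefficient count (plus intercept) as the effective degrees of freedom are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: choosing the lasso tuning parameter by AIC, BIC, and AICc,
# assuming Gaussian errors and using the number of nonzero coefficients
# (plus the intercept) as the effective degrees of freedom.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                               # sparse true signal
y = X @ beta + rng.standard_normal(n)

X = StandardScaler().fit_transform(X)                     # standardize predictors
lambdas = np.logspace(-3, 1, 50)                          # candidate tuning parameters

def criteria(rss, df, n):
    """Gaussian-likelihood AIC, BIC, and small-sample-corrected AICc."""
    aic = n * np.log(rss / n) + 2 * df
    bic = n * np.log(rss / n) + np.log(n) * df
    aicc = aic + 2 * df * (df + 1) / max(n - df - 1, 1)
    return aic, bic, aicc

scores = {"AIC": [], "BIC": [], "AICc": []}
for lam in lambdas:
    fit = Lasso(alpha=lam, max_iter=50000).fit(X, y)
    rss = np.sum((y - fit.predict(X)) ** 2)
    df = np.count_nonzero(fit.coef_) + 1                  # +1 for the intercept
    aic, bic, aicc = criteria(rss, df, n)
    scores["AIC"].append(aic)
    scores["BIC"].append(bic)
    scores["AICc"].append(aicc)

# Each criterion picks the lambda that minimizes its score.
for name, vals in scores.items():
    print(f"{name}: best lambda = {lambdas[int(np.argmin(vals))]:.4f}")
```

In this setup, BIC tends to favor larger lambda values (sparser models) than AIC because of its heavier log(n) penalty on model size, which mirrors the kind of criterion-dependent behavior the abstract describes.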
Mr. Sheng Gao
Mathematics and Computer Science Department, Samford University, Birmingham, AL 35229, United States of America
Dr. Mingwei Sun
Mathematics and Computer Science Department, Samford University, Birmingham, AL 35229, United States of America
msun1@samford.edu

