Adámek, R., S. Smeekes, and I. Wilms (2020). LASSO inference for high-dimensional time series. Technical Report 2007.10952, arxiv.
Ardia, D., K. Bluteau, and K. Boudt (2019). Questioning the news about economic growth: Sparse forecasting using thousands of news-based sentiment values. International Journal of Forecasting 35, 1370–1386.
Audrino, F. and S. D. Knaus (2016). Lassoing the HAR model: A model selection perspective on realized volatility dynamics. Econometric Reviews 35, 1485–1521.
Babii, A., E. Ghysels, and J. Striaukas (2020a). Inference for high-dimensional regressions with heteroskedasticity and autocorrelation. Technical Report 1912.06307, arxiv.
Babii, A., E. Ghysels, and J. Striaukas (2020b). Machine learning panel data regressions with an application to nowcasting price earnings ratios. Technical Report 2008.03600, arxiv.
Babii, A., E. Ghysels, and J. Striaukas (2020c). Machine learning time series regressions with an application to nowcasting. Technical Report 2005.14057, arxiv.
Balkin, S. D. and J. K. Ord (2000). Automatic neural network modeling for univariate time series. International Journal of Forecasting 16, 509–515.
Barron, A. (1993). Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory 39, 930–945.
Bartlett, P. and M. Traskin (2007). AdaBoost is consistent. Journal of Machine Learning Research 8, 2347–2368.
Basu, S. and G. Michailidis (2015). Regularized estimation in sparse high-dimensional time series models. Annals of Statistics 43, 1535–1567.
Belloni, A., V. Chernozhukov, and C. Hansen (2014). Inference on treatment effects after selection amongst high-dimensional controls. Review of Economic Studies 81, 608–650.
Breiman, L. (1996). Bagging predictors. Machine Learning 24, 123–140.
Breiman, L. (2001). Random forests. Machine Learning 45, 5–32.
Bühlmann, P. (2006). Boosting for high-dimensional linear models. Annals of Statistics 34, 559–583.
Bühlmann, P. (2002). Consistency for L2Boosting and matching pursuit with trees and tree-type basis functions. Research Report 109, Seminar für Statistik, Eidgenössische Technische Hochschule (ETH).
Callot, L., A. Kock, and M. Medeiros (2017). Modeling and forecasting large realized covariance matrices and portfolio choice. Journal of Applied Econometrics 32, 140–158.
Callot, L. and A. B. Kock (2013). Oracle efficient estimation and forecasting with the adaptive LASSO and the adaptive group LASSO in vector autoregressions. In N. Haldrup, M. Meitz, and P. Saikkonen (Eds.), Essays in Nonlinear Time Series Econometrics. Oxford University Press.
Chen, S., D. Donoho, and M. Saunders (2001). Atomic decomposition by basis pursuit. SIAM Review 43, 129–159.
Chen, X. (2007). Large sample sieve estimation of semi-nonparametric models. In J. Heckman and E. Leamer (Eds.), Handbook of Econometrics. Elsevier.
Chen, X. and X. Shen (1998). Sieve extremum estimates for weakly dependent data. Econometrica 66, 289–314.
Chen, X., J. Racine, and N. Swanson (2007). Semiparametric ARX neural-network models with an application to forecasting inflation. IEEE Transactions on Neural Networks 12, 674–683.
Chernozhukov, V., D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey (2017). Double/debiased/Neyman machine learning of treatment effects. American Economic Review 107, 261–265.
Chernozhukov, V., D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, W. Newey, and J. Robins (2018). Double/debiased machine learning for treatment and structural parameters. Econometrics Journal 21, C1–C68.
Chinco, A., A. Clark-Joseph, and M. Ye (2019). Sparse signals in the cross-section of returns. Journal of Finance 74, 449–492.
Corsi, F. (2009). A simple long memory model of realized volatility. Journal of Financial Econometrics 7, 174–196.
Coulombe, P., M. Leroux, D. Stevanovic, and S. Surprenant (2020). How is machine learning useful for macroeconomic forecasting? Technical report, University of Pennsylvania.
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems 2, 303–314.
Diebold, F. (2015). Comparing predictive accuracy, twenty years later: A personal perspective on the use and abuse of Diebold-Mariano tests. Journal of Business and Economic Statistics 33, 1–9.
Diebold, F. X. and R. S. Mariano (1995). Comparing predictive accuracy. Journal of Business and Economic Statistics 13, 253–263.
Duffy, N. and D. Helmbold (2002). Boosting methods for regression. Machine Learning 47, 153–200.
Elliott, G. and A. Timmermann (2008). Economic forecasting. Journal of Economic Literature 46, 3–56.
Elliott, G. and A. Timmermann (2016). Forecasting in economics and finance. Annual Review of Economics 8, 81–110.
Elliott, G., A. Gargano, and A. Timmermann (2013). Complete subset regressions. Journal of Econometrics 177(2), 357–373.
Elliott, G., A. Gargano, and A. Timmermann (2015). Complete subset regressions with large-dimensional sets of predictors. Journal of Economic Dynamics and Control 54, 86–110.
Fan, J. and J. Lv (2008). Sure independence screening for ultrahigh dimensional feature space. Journal of the Royal Statistical Society, Series B 70, 849–911.
Fan, J. and R. Li (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association 96, 1348–1360.
Fan, J., L. Xue, and H. Zou (2014). Strong oracle optimality of folded concave penalized estimation. Annals of Statistics 42, 819–849.
Foresee, F. D. and M. T. Hagan (1997). Gauss-Newton approximation to Bayesian regularization. In IEEE International Conference on Neural Networks (Vol. 3), New York, pp. 1930–1935. IEEE.
Friedman, J. (2001). Greedy function approximation: A gradient boosting machine. Annals of Statistics 29, 1189–1232.
Funahashi, K. (1989). On the approximate realization of continuous mappings by neural networks. Neural Networks 2, 183–192.
Garcia, M., M. Medeiros, and G. Vasconcelos (2017). Real-time inflation forecasting with high-dimensional models: The case of Brazil. International Journal of Forecasting 33(3), 679–693.
Genre, V., G. Kenny, A. Meyler, and A. Timmermann (2013). Combining expert forecasts: Can anything beat the simple average? International Journal of Forecasting 29, 108–121.
Giacomini, R. and H. White (2006). Tests of conditional predictive ability. Econometrica 74, 1545–1578.
Granger, C. and M. Machina (2006). Forecasting and decision theory. Handbook of Economic Forecasting 1, 81–98.
Grenander, U. (1981). Abstract Inference. New York, USA: Wiley.
Gu, S., B. Kelly, and D. Xiu (2020). Empirical asset pricing via machine learning. Review of Financial Studies 33, 2223–2273.
Zou, H. (2006). The adaptive LASSO and its oracle properties. Journal of the American Statistical Association 101, 1418–1429.
Hamilton, J. (1994). Time Series Analysis. Princeton University Press.
Han, Y. and R. Tsay (2020). High-dimensional linear regression for dependent data with applications to nowcasting. Statistica Sinica 30, 1797–1827.
Hans, C. (2009). Bayesian LASSO regression. Biometrika 96, 835–845.
Hansen, P. (2005). A test for superior predictive ability. Journal of Business and Economic Statistics 23, 365–380.
Hansen, P., A. Lunde, and J. Nason (2011). The model confidence set. Econometrica 79, 453–497.
Harvey, D., S. Leybourne, and P. Newbold (1997). Testing the equality of prediction mean squared errors. International Journal of Forecasting 13, 281–291.
Hastie, T., R. Tibshirani, and J. Friedman (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer.
Hastie, T., R. Tibshirani, and M. Wainwright (2015). Statistical Learning with Sparsity: The LASSO and Generalizations. CRC Press.
Hecq, A., L. Margaritella, and S. Smeekes (2019). Granger causality testing in high-dimensional VARs: a post-double-selection procedure. Technical Report 1902.10991, arxiv.
Heravi, S., D. Osborn, and C. Birchenhall (2004). Linear versus neural network forecasts for European industrial production series. International Journal of Forecasting 20, 435–446.
Hillebrand, E. and M. C. Medeiros (2016). Asymmetries, breaks, and long-range dependence. Journal of Business and Economic Statistics 34, 23–41.
Hillebrand, E. and M. Medeiros (2010). The benefits of bagging for forecast models of realized volatility. Econometric Reviews 29, 571–593.
Hochreiter, S. and J. Schmidhuber (1997). Long short-term memory. Neural Computation 9, 1735–1780.
Hoerl, A. and R. Kennard (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12, 55–67.
Hornik, K., M. Stinchcombe, and H. White (1989). Multilayer feedforward networks are universal approximators. Neural Networks 2, 359–366.
Hsu, N.-J., H.-L. Hung, and Y.-M. Chang (2008). Subset selection for vector autoregressive processes using LASSO. Computational Statistics & Data Analysis 52, 3645–3657.
Inoue, A. and L. Kilian (2008). How useful is bagging in forecasting economic time series? A case study of U.S. consumer price inflation. Journal of the American Statistical Association 103, 511–522.
James, W. and C. Stein (1961). Estimation with quadratic loss. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 361–379.
Jiang, W. (2004). Process consistency for AdaBoost. Annals of Statistics 32, 13–29.
Chan, K.-S. and K. Chen (2011). Subset ARMA selection via the adaptive LASSO. Statistics and its Interface 4, 197–205.
Kim, Y., H. Choi, and H.-S. Oh (2008). Smoothly clipped absolute deviation on high dimensions.
Knight, K. and W. Fu (2000). Asymptotics for LASSO-type estimators. Annals of Statistics 28, 1356–1378.
Kock, A. (2016). Consistent and conservative model selection with the adaptive LASSO in stationary and nonstationary autoregressions. Econometric Theory 32, 243–259.
Kock, A. and L. Callot (2015). Oracle inequalities for high dimensional vector autoregressions. Journal of Econometrics 186, 325–344.
Kock, A. and T. Teräsvirta (2014). Forecasting performance of three automated modelling techniques during the economic crisis 2007–2009. International Journal of Forecasting 30, 616–631.
Kock, A. and T. Teräsvirta (2015). Forecasting macroeconomic variables using neural network models and three automated model selection techniques. Econometric Reviews 35, 1753–1779.
Konzen, E. and F. Ziegelmann (2016). LASSO-type penalties for covariate selection and forecasting in time series. Journal of Forecasting 35, 592–612.
Koo, B., H. Anderson, M. Seo, and W. Yao (2020). High-dimensional predictive regression in the presence of cointegration. Journal of Econometrics 219, 456–477.
Lederer, J., L. Yu, and I. Gaynanova (2019). Oracle inequalities for high-dimensional prediction.
Lee, J., Z. Shi, and Z. Gao (2020). On LASSO for predictive regression. Technical Report 1810.03140, arxiv.
Lee, J., D. Sun, Y. Sun, and J. Taylor (2016). Exact post-selection inference with application to the LASSO. Annals of Statistics 44, 907–927.
Leeb, H. and B. Pötscher (2005). Model selection and inference: Facts and fiction. Econometric Theory 21, 21–59.
Leeb, H. and B. Pötscher (2008). Sparse estimators and the oracle property, or the return of Hodges' estimator. Journal of Econometrics 142, 201–211.
Li, J., Z. Liao, and R. Quaedvlieg (2020). Conditional superior predictive ability. Technical report, Erasmus School of Economics.
Lockhart, R., J. Taylor, R. Tibshirani, and R. Tibshirani (2014). A significance test for the LASSO. Annals of Statistics 42, 413–468.
Lugosi, G. and N. Vayatis (2004). On the Bayes-risk consistency of regularized boosting methods. Annals of Statistics 32, 30–55.
MacKay, D. J. C. (1992). A practical Bayesian framework for backpropagation networks. Neural Computation 4, 448–472.
MacKay, D. J. C. (1992). Bayesian interpolation. Neural Computation 4, 415–447.
Masini, R., M. Medeiros, and E. Mendes (2019). Regularized estimation of high-dimensional vector autoregressions with weakly dependent innovations. Technical Report 1912.09002, arxiv.
McAleer, M. and M. C. Medeiros (2008). A multiple regime smooth transition heterogeneous autoregressive model for long memory and asymmetries. Journal of Econometrics 147, 104–119.
McAleer, M. and M. Medeiros (2011). Forecasting realized volatility with linear and nonlinear models. Journal of Economic Surveys 25, 6–18.
McCracken, M. (2020). Diverging tests of equal predictive ability. Econometrica 88, 1753–1754.
Medeiros, M. and E. Mendes (2013). Penalized estimation of semi-parametric additive time-series models. In N. Haldrup, M. Meitz, and P. Saikkonen (Eds.), Essays in Nonlinear Time Series Econometrics. Oxford University Press.
Medeiros, M. and E. Mendes (2016a). ℓ1-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors. Journal of Econometrics 191, 255–271.
Medeiros, M. and E. Mendes (2017). Adaptive LASSO estimation for ARDL models with GARCH innovations. Econometric Reviews 36, 622–637.
Medeiros, M. and G. Vasconcelos (2016). Forecasting macroeconomic variables in data-rich environments. Economics Letters 138, 50–52.
Medeiros, M. C. and A. Veiga (2005). A flexible coefficient smooth transition time series model. IEEE Transactions on Neural Networks 16, 97–113.
Medeiros, M. C. and E. F. Mendes (2016b). ℓ1-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors. Journal of Econometrics 191(1), 255–271.
Medeiros, M. C., A. Veiga, and C. Pedreira (2001). Modelling exchange rates: Smooth transitions, neural networks, and linear models. IEEE Transactions on Neural Networks 12, 755–764.
Medeiros, M. C., G. Vasconcelos, A. Veiga, and E. Zilberman (2021). Forecasting inflation in a data-rich environment: The benefits of machine learning methods. Journal of Business and Economic Statistics 39, 98–119.
Medeiros, M. C., T. Teräsvirta, and G. Rech (2006). Building neural network models for time series: A statistical approach. Journal of Forecasting 25, 49–75.
Melnyk, I. and A. Banerjee (2016). Estimating structured vector autoregressive models. In International Conference on Machine Learning, pp. 830–839.
Mhaskar, H., Q. Liao, and T. Poggio (2017). When and why are deep networks better than shallow ones? In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17), pp. 2343–2349.
Nardi, Y. and A. Rinaldo (2011). Autoregressive process modeling via the LASSO procedure. Journal of Multivariate Analysis 102, 528–549.
Park, H. and F. Sakaori (2013). Lag weighted LASSO for time series model. Computational Statistics 28, 493–504.
Park, J. and I. Sandberg (1991). Universal approximation using radial-basis-function networks. Neural Computation 3, 246–257.
Park, T. and G. Casella (2008). The Bayesian LASSO. Journal of the American Statistical Association 103, 681–686.
Ren, Y. and X. Zhang (2010). Subset selection for vector autoregressive processes via adaptive LASSO. Statistics & Probability Letters 80, 1705–1712.
Samuel, A. (1959). Some studies in machine learning using the game of checkers. IBM Journal of Research and Development 3(3), 210–229.
Sang, H. and Y. Sun (2015). Simultaneous sparse model selection and coefficient estimation for heavy-tailed autoregressive processes. Statistics 49, 187–208.
Scharth, M. and M. Medeiros (2009). Asymmetric effects and long memory in the volatility of Dow Jones stocks. International Journal of Forecasting 25, 304–325.
Scornet, E., G. Biau, and J.-P. Vert (2015). Consistency of random forests. Annals of Statistics 43, 1716–1741.
Simon, N., J. Friedman, T. Hastie, and R. Tibshirani (2013). A sparse-group LASSO. Journal of Computational and Graphical Statistics 22, 231–245.
Smeekes, S. and E. Wijler (2018). Macroeconomic forecasting using penalized regression methods. International Journal of Forecasting 34, 408–430.
Smeekes, S. and E. Wijler (2020). An automated approach towards sparse single-equation cointegration modelling. Journal of Econometrics, forthcoming.
Srivastava, N., G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov (2014). Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research 15, 1929–1958.
Stein, C. (1956). Inadmissibility of the usual estimator for the mean of a multivariate distribution. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability 1, 197–206.
Stinchcombe, M. and H. White (1989). Universal approximation using feedforward neural networks with non-sigmoid hidden layer activation functions. In Proceedings of the International Joint Conference on Neural Networks, Washington, pp. 613–617. IEEE Press, New York, NY.
Suárez-Fariñas, M., C. Pedreira, and M. C. Medeiros (2004). Local-global neural networks: A new approach for nonlinear time series modelling. Journal of the American Statistical Association 99, 1092–1107.
Swanson, N. R. and H. White (1995). A model selection approach to assessing the information in the term structure using linear models and artificial neural networks. Journal of Business and Economic Statistics 13, 265–275.
Swanson, N. R. and H. White (1997a). Forecasting economic time series using flexible versus fixed specification and linear versus nonlinear econometric models. International Journal of Forecasting 13, 439–461.
Swanson, N. R. and H. White (1997b). A model selection approach to real-time macroeconomic forecasting using linear models and artificial neural networks. Review of Economics and Statistics 79, 540–550.
Taylor, J., R. Lockhart, R. Tibshirani, and R. Tibshirani (2014). Post-selection adaptive inference for least angle regression and the LASSO. Technical Report 1401.3889, arxiv.
Teräsvirta, T. (1994). Specification, estimation, and evaluation of smooth transition autoregressive models. Journal of the American Statistical Association 89, 208–218.
Teräsvirta, T., D. Tjøstheim, and C. Granger (2010). Modelling Nonlinear Economic Time Series. Oxford, UK: Oxford University Press.
Teräsvirta, T., D. van Dijk, and M. Medeiros (2005). Linear models, smooth transition autoregressions and neural networks for forecasting macroeconomic time series: A re-examination (with discussion). International Journal of Forecasting 21, 755–774.
Tibshirani, R. (1996). Regression shrinkage and selection via the LASSO. Journal of the Royal Statistical Society, Series B 58, 267–288.
Tikhonov, A. (1943). On the stability of inverse problems. Doklady Akademii Nauk SSSR 39, 195–198 (in Russian).
Tikhonov, A. (1963). On the solution of ill-posed problems and the method of regularization. Doklady Akademii Nauk 151, 501–504.
Tikhonov, A. and V. Arsenin (1977). Solutions of Ill-Posed Problems. V. H. Winston and Sons.
Tkacz, G. (2001). Neural network forecasting of Canadian GDP growth. International Journal of Forecasting 17, 57–69.
Trapletti, A., F. Leisch, and K. Hornik (2000). Stationary and integrated autoregressive neural network processes. Neural Computation 12, 2427–2450.
Uematsu, Y. and S. Tanaka (2019). High-dimensional macroeconomic forecasting and variable selection via penalized regression. The Econometrics Journal 22, 34–56.
van de Geer, S., P. Bühlmann, Y. Ritov, and R. Dezeure (2014). On asymptotically optimal confidence regions and tests for high-dimensional models. Annals of Statistics 42, 1166–1202.
Wager, S. and S. Athey (2018). Estimation and inference of heterogeneous treatment effects using random forests. Journal of the American Statistical Association 113, 1228–1242.
Wang, H. and C. Leng (2008). A note on adaptive group LASSO. Computational Statistics & Data Analysis 52, 5277–5286.
Wang, H., G. Li, and C.-L. Tsai (2007). Regression coefficient and autoregressive order shrinkage and selection via the LASSO. Journal of the Royal Statistical Society, Series B 69, 63–78.
White, H. (2000). A reality check for data snooping. Econometrica 68, 1097–1126.
Wong, K., Z. Li, and A. Tewari (2020). LASSO guarantees for β-mixing heavy tailed time series. Annals of Statistics 48, 1124–1142.
Wu, W. (2005). Nonlinear system theory: Another look at dependence. Proceedings of the National Academy of Sciences 102, 14150–14154.
Wu, W. and Y. Wu (2016). Performance bounds for parameter estimates of high-dimensional linear models with correlated errors. Electronic Journal of Statistics 10, 352–379.
Xie, F., L. Xu, and Y. Yang (2017). LASSO for sparse linear regression with exponentially β-mixing errors. Statistics & Probability Letters 125, 64–70.
Xue, Y. and M. Taniguchi (2020). Modified LASSO estimators for time series regression models with dependent disturbances. Statistical Methods & Applications 29, 845–869.
Yang, Y. and H. Zou (2015). A fast unified algorithm for solving group-LASSO penalized learning problems. Statistics and Computing 25, 1129–1141.
Yarotsky, D. (2017). Error bounds for approximations with deep ReLU networks. Neural Networks 94, 103–114.
Yoon, Y., C. Park, and T. Lee (2013). Penalized regression models with autoregressive error terms. Journal of Statistical Computation and Simulation 83, 1756–1772.
Yousuf, K. (2018). Variable screening for high dimensional time series. Electronic Journal of Statistics 12, 667–702.
Yuan, M. and Y. Lin (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B 68, 49–67.
Zhang, T. and B. Yu (2005). Boosting with early stopping: Convergence and consistency.
Zhao, P. and B. Yu (2006). On model selection consistency of LASSO. Journal of Machine Learning Research 7, 2541–2563.
Zhu, X. (2020). Nonconcave penalized estimation in sparse vector autoregression model. Electronic Journal of Statistics 14, 1413–1448.
Zou, H. and H. Zhang (2009). On the adaptive elastic-net with a diverging number of parameters.
Zou, H. and T. Hastie (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B 67, 301–320.