References

  1. Ahmed, S.E. and Saleh, A.K.M.E. (1988). Estimation strategy using a preliminary test in some univariate normal models. Soochow Journal of Mathematics 14: 135–165.
  2. Ahsanullah, M. and Saleh, A.K.M.E. (1972). Estimation of intercept in a linear regression model with one dependent variable after a preliminary test on the regression coefficient. International Statistical Review 40: 139–145.
  3. Akdeniz, F. and Ozturk, F. (1981). The effect of multicollinearity: a geometric view. Communications de la Faculte des Sciences de l'Universite d'Ankara 30: 17–26.
  4. Akdeniz, F. and Tabakan, G. (2009). Restricted ridge estimators of the parameters in semiparametric regression model. Communications in Statistics ‐ Theory and Methods 38 (11): 1852–1869.
  5. Alkhamisi, M. and Shukur, G. (2008). Developing ridge parameters for SUR model. Communications in Statistics ‐ Theory and Methods 37 (4): 544–564.
  6. Anderson, T.W. (1984). An Introduction to Multivariate Statistical Analysis. New York: Wiley.
  7. Arashi, M. (2012). Preliminary test and Stein estimators in simultaneous linear equations. Linear Algebra and its Applications 436 (5): 1195–1211.
  8. Arashi, M. and Norouzirad, M. (2016). Steinian shrinkage estimation in high dimensional regression. In: 13th Iranian Statistics Conference. Kerman, Iran: Shahid Bahonar University of Kerman.
  9. Arashi, M. and Roozbeh, M. (2016). Some improved estimation strategies in high‐dimensional semiparametric regression models with application to the Riboflavin production data. Statistical Papers. doi: 10.1007/s00362‐016‐0843‐y.
  10. Arashi, M. and Tabatabaey, S.M.M. (2009). Improved variance estimation under sub‐space restriction. Journal of Multivariate Analysis 100: 1752–1760.
  11. Arashi, M. and Valizadeh, T. (2015). Performance of Kibria's methods in partial linear ridge regression model. Statistical Papers 56 (1): 231–246.
  12. Arashi, M., Saleh, A.K.M.E, and Tabatabaey, S.M.M. (2010). Estimation of parameters of parallelism model with elliptically distributed errors. Metrika 71: 79–100.
  13. Arashi, M., Roozbeh, M., and Niroomand, H.A. (2012). A note on Stein type shrinkage estimator in partial linear models. Statistics 64 (5): 673–685.
  14. Arashi, M., Janfada, M., and Norouzirad, M. (2015). Singular ridge regression with stochastic constraints. Communications in Statistics ‐ Theory and Methods 44: 1281–1292.
  15. Arashi, M., Kibria, B.M.G., and Valizadeh, T. (2017). On ridge parameter estimators under stochastic subspace hypothesis. The Journal of Statistical Computation and Simulation 87 (5): 966–983.
  16. Asar, M., Arashi, M., and Wu, J. (2017). Restricted ridge estimator in the logistic regression model. Communications in Statistics ‐ Simulation and Computation 46 (8): 6538–6544.
  17. Aslam, M. (2014). Performance of Kibria's method for the heteroscedastic ridge regression model: some Monte Carlo evidence. Communications in Statistics ‐ Simulation and Computation 43 (4): 673–686.
  18. Avalos, M., Grandvalet, Y., and Ambroise, C. (2007). Parsimonious additive models. Computational Statistics and Data Analysis 51: 2851–2870.
  19. Aydin, D., Yuzbasi, B., and Ahmed, S.E. (2016). Modified ridge type estimator in partially linear regression models and numerical comparisons. Journal of Computational and Theoretical Nanoscience 13 (10): 7040–7053.
  20. Baltagi, B.H. (1980). On seemingly unrelated regressions with error components. Econometrica 48: 1547–1551.
  21. Bancroft, T. (1964). Analysis and inference for incompletely specified models involving the use of preliminary test(s) of significance. Biometrics 57: 579–594.
  22. Beaver, W.H. (1966). Financial ratios as predictors of failure. Journal of Accounting Research 4 (3): 71–111.
  23. Belloni, A. and Chernozhukov, V. (2013). Least squares after model selection in high‐dimensional sparse models. Bernoulli 19: 521–547.
  24. Breiman, L. (1996). Heuristics of instability and stabilization in model selection. The Annals of Statistics 24: 2350–2383.
  25. Buhlmann, P., Kalisch, M., and Meier, L. (2014). High‐dimensional statistics with a view toward applications in biology. Annual Review of Statistics and its Applications 1: 255–278.
  26. Chandrasekhar, C.K., Bagyalakshmi, H., Srinivasan, M.R., and Gallo, M. (2016). Partial ridge regression under multicollinearity. Journal of Applied Statistics 43 (13): 2462–2473.
  27. Dempster, A.P., Schatzoff, M., and Wermuth, N. (1977). A simulation study of alternatives to ordinary least squares. Journal of the American Statistical Association 72: 77–91.
  28. Dicker, L.H. (2016). Ridge regression and asymptotic minimax estimation over spheres of growing dimension. Bernoulli 22: 1–37.
  29. Donoho, D.L. and Johnstone, I.M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika 81 (3): 425–455.
  30. Draper, N.R. and Nostrand, R.C.V. (1979). Ridge regression and James‐Stein estimation: review and comments. Technometrics 21: 451–466.
  31. Fallah, R., Arashi, M., and Tabatabaey, S.M.M. (2017). On the ridge regression estimator with sub‐space restriction. Communications in Statistics ‐ Theory and Methods 46 (23): 11854–11865.
  32. Fan, J. (2017). Discussion of “Post selection shrinkage estimation for high dimensional data analysis”. Applied Stochastic Models in Business and Industry 33: 121–122.
  33. Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association 96: 1348–1360.
  34. Farrar, D.E. and Glauber, R.R. (1967). Multicollinearity in regression analysis: the problem revisited. Review of Economics and Statistics 49 (1): 92–107.
  35. Foschi, P., Belsley, D.A., and Kontoghiorghes, E.J. (2003). A comparative study of algorithms for solving seemingly unrelated regressions models. Computational Statistics and Data Analysis 44: 3–35.
  36. Frank, I.E. and Friedman, J.H. (1993). A statistical view of some chemometrics regression tools. Technometrics 35: 109–135.
  37. Gao, J.T. (1995). Asymptotic theory for partially linear models. Communications in Statistics ‐ Theory and Methods 22: 3327–3354.
  38. Gao, J. (1997). Adaptive parametric test in a semiparametric regression model. Communications in Statistics ‐ Theory and Methods 26: 787–800.
  39. Gao, X., Ahmed, S.E., and Feng, Y. (2017). Post selection shrinkage estimation for high dimensional data analysis. Applied Stochastic Models in Business and Industry 33: 97–120.
  40. Gibbons, D.G. (1981). A simulation study of some ridge estimators. Journal of the American Statistical Association 76: 131–139.
  41. Golub, G.H., Heath, M., and Wahba, G. (1979). Generalized cross‐validation as a method for choosing a good ridge parameter. Technometrics 21 (2): 215–223.
  42. Grandvalet, Y. (1998). Least absolute shrinkage is equivalent to quadratic penalization. In: ICANN'98, Perspectives in Neural Computing, vol. 1 (ed. L. Niklasson, M. Boden, and T. Ziemske), 201–206. Springer.
  43. Grob, J. (2003). Restricted ridge estimator. Statistics and Probability Letters 65: 57–64.
  44. Gruber, M.H.J. (1998). Improving Efficiency by Shrinkage: The James‐Stein and Ridge Regression Estimators. New York: Marcel Dekker.
  45. Gruber, M.H.J. (2010). Regression Estimators, 2e. Baltimore, MD: Johns Hopkins University Press.
  46. Gunst, R.F. (1983). Regression analysis with multicollinear predictor variables: definition, detection, and effects. Communications in Statistics ‐ Theory and Methods 12 (19): 2217–2260.
  47. Hansen, B.E. (2016). The risk of James‐Stein and lasso shrinkage. Econometric Reviews 35: 1456–1470.
  48. Hardle, W., Liang, H., and Gao, J. (2000). Partially Linear Models. Heidelberg: Springer Physica‐Verlag.
  49. Haykin, S. (1999). Neural Networks and Learning Machines, 3e. Pearson Prentice‐Hall.
  50. Hefnawy, E.A. and Farag, A. (2013). A combined nonlinear programming model and Kibria method for choosing ridge parameter regression. Communications in Statistics ‐ Simulation and Computation 43 (6): 1442–1470.
  51. Hoerl, A.E. (1962). Application of ridge analysis to regression problems. Chemical Engineering Progress 58 (3): 54–59.
  52. Hoerl, A.E. and Kennard, R.W. (1970). Ridge regression: biased estimation for non‐orthogonal problems. Technometrics 12: 55–67.
  53. Hoerl, A.E., Kennard, R.W., and Baldwin, K.F. (1975). Ridge regression: some simulations. Communications in Statistics ‐ Theory and Methods 4: 105–123.
  54. Hosmer, D.W. and Lemeshow, J.S. (1989). Applied Logistic Regression, 2e. Wiley.
  55. Huang, C.C.L., Jou, Y.J., and Cho, H.J. (2016). A new multicollinearity diagnostic for generalized linear models. Journal of Applied Statistics 43 (11): 2029–2043.
  56. Ismail, B. and Suvarna, M. (2016). Estimation of linear regression model with correlated regressors in the presence of auto correlation. International Journal of Statistics and Applications 6 (2): 35–39.
  57. Jamal, N. and Rind, M.Q. (2007). Ridge regression: a tool to forecast wheat area and production. Pakistan Journal of Statistics and Operation Research 3 (2): 125–134.
  58. James, W. and Stein, C. (1961). Estimation with quadratic loss. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, 361–379. University of California Press.
  59. James, G., Witten, D., Hastie, T., and Tibshirani, R. (2013). An Introduction to Statistical Learning: with Applications in R. Springer.
  60. Judge, G.G. and Bock, M.E. (1978). The Statistical Implications of Pre‐test and Stein‐Rule Estimators in Econometrics. Amsterdam: North‐Holland Publishing Company.
  61. Jureckova, J. (1971). Nonparametric estimate of regression coefficients. The Annals of Mathematical Statistics 42: 1328–1338.
  62. Jureckova, J. and Sen, P.K. (1996). Robust Statistical Procedures: Asymptotics and Interrelations. New York: Wiley.
  63. Kaciranlar, S., Sakallioglu, S., Akdeniz, F. et al. (1999). A new biased estimator in linear regression and a detailed analysis of the widely‐analyzed data set on Portland cement. Sankhyā: The Indian Journal of Statistics 61: 443–456.
  64. Khalaf, G. and Shukur, G. (2005). Choosing ridge parameters for regression problems. Communications in Statistics ‐ Theory and Methods 34 (5): 1177–1182.
  65. Kibria, B.M.G. (2003). Performance of some new ridge regression estimators. Communications in Statistics ‐ Simulation and Computation 32 (2): 419–435.
  66. Kibria, B.M.G. (2012). Some Liu and Ridge type estimators and their properties under the Ill‐conditioned Gaussian linear regression model. Journal of Statistical Computation and Simulation 82 (1): 1–17.
  67. Kibria, B.M.G. and Banik, S. (2016). Some ridge regression estimators and their performances. Journal of Modern Applied Statistical Methods 15 (1): 206–238.
  68. Kibria, B.M.G. and Saleh, A.K.M.E. (2012). Improving the estimators of the parameters of a probit regression model: a ridge regression approach. Journal of Statistical Planning and Inference 14 (2): 1421–1435.
  69. Knight, K. and Fu, W.J. (2000). Asymptotics for lasso‐type estimators. Annals of Statistics 28: 1356–1378.
  70. Kontoghiorghes, E.J. (2000). Inconsistencies and redundancies in SURE models: computational aspects. Computational Economics 16 (1‐2): 63–70.
  71. Kontoghiorghes, E.J. (2004). Computational methods for modifying seemingly unrelated regressions models. Journal of Computational and Applied Mathematics 162 (1): 247–261.
  72. Kontoghiorghes, E.J. and Clarke, M.R.B. (1995). An alternative approach for the numerical solution of seemingly unrelated regression equations models. Computational Statistics and Data Analysis 19 (4): 369–377.
  73. Lawless, J.F. and Wang, P. (1976). A simulation study of ridge and other regression estimators. Communications in Statistics ‐ Theory and Methods 5: 307–323.
  74. Liang, H. and Hardle, W. (1999). Large sample theory of the estimation of the error distribution for semiparametric models. Communications in Statistics ‐ Theory and Methods 28 (9): 2025–2036.
  75. Luo, J. (2010). The discovery of mean square error consistency of ridge estimator. Statistics and Probability Letters 80: 343–347.
  76. Luo, J. (2012). Asymptotic efficiency of ridge estimator in linear and semiparametric linear models. Statistics and Probability Letters 82: 58–62.
  77. Luo, J. and Zuo, Y.J. (2011). A new test for large dimensional regression coefficients. Open Journal of Statistics 1: 212–216.
  78. McDonald, G.C. and Galarneau, D.I. (1975). A Monte Carlo evaluation of ridge‐type estimators. Journal of the American Statistical Association 70 (350): 407–416.
  79. Mansson, K., Shukur, G., and Kibria, B.M.G. (2010). On some ridge regression estimators: a Monte Carlo simulation study under different error variances. Journal of Statistics 17 (1): 1–22.
  80. Maronna, R.A. (2011). Robust ridge regression for high‐dimensional data. Technometrics 53 (1): 44–53.
  81. Marquardt, D.W. and Snee, R.D. (1975). Ridge regression in practice. The American Statistician 29 (1): 3–20.
  82. Martin, D. (1977). Early warning of bank failure: a logit regression approach. Journal of Banking and Finance 1 (3): 249–276.
  83. Montgomery, D.C., Peck, E.A., and Vining, G.G. (2012). Introduction to Linear Regression Analysis. 5e. Hoboken, NJ: Wiley.
  84. Müller, M. and Rönz, B. (1999). Credit Scoring Using Semiparametric Methods, Discussion Papers, Interdisciplinary Research Project 373: Quantification and Simulation of Economic Processes, No. 1999,93, Humboldt‐Universität Berlin http://nbnresolving.de/urn:nbn:de:kobv:11-10046812.
  85. Muniz, G. and Kibria, B.M.G. (2009). On some ridge regression estimators: an empirical comparison. Communications in Statistics ‐ Simulation and Computation 38 (3): 621–630.
  86. Muniz, G., Kibria, B.M.G., Mansson, K., and Shukur, G. (2012). On developing ridge regression parameters: a graphical investigation. Statistics and Operations Research Transactions 36 (2): 115–138.
  87. Najarian, S., Arashi, M., and Kibria, B.M.G. (2013). A simulation study on some restricted ridge regression estimators. Communications in Statistics ‐ Simulation and Computation 42 (4): 871–890.
  88. Norouzirad, M. and Arashi, M. (2017). Preliminary test and Stein‐type shrinkage ridge estimators in robust regression. Statistical Papers. doi: 10.1007/s00362‐017‐0899‐3.
  89. Norouzirad, M., Arashi, M., and Ahmed, S. E. (2017). Improved robust ridge M‐estimation. Journal of Statistical Computation and Simulation 87 (18): 3469–3490.
  90. Ozturk, F. and Akdeniz, F. (2000). Ill‐conditioning and multicollinearity. Linear Algebra and its Applications 321: 295–305.
  91. Puri, M.L. and Sen, P.K. (1986). Nonparametric Methods in General Linear Models. New York: Wiley.
  92. Raheem, E., Ahmed, S.E., and Doksum, K. (2012). Absolute penalty and shrinkage estimation in partially linear models. Computational Statistics and Data Analysis 56 (4): 874–891.
  93. Roozbeh, M. (2015). Shrinkage ridge estimators in semiparametric regression models. Journal of Multivariate Analysis 136: 56–74.
  94. Roozbeh, M. and Arashi, M. (2013). Feasible ridge estimator in partially linear models. Journal of Multivariate Analysis 116: 35–44.
  95. Roozbeh, M. and Arashi, M. (2016a). New ridge regression estimator in semi‐parametric regression models. Communications in Statistics ‐ Simulation and Computation 45 (10): 3683–3715.
  96. Roozbeh, M. and Arashi, M. (2016b). Shrinkage ridge regression in partial linear models. Communications in Statistics ‐ Theory and Methods 45 (20): 6022–6044.
  97. Roozbeh, M., Arashi, M., and Gasparini, M. (2012). Seemingly unrelated ridge regression in semiparametric models. Communications in Statistics ‐ Theory and Methods 41: 1364–1386.
  98. Sakallioglu, S. and Akdeniz, F. (1998). Generalized inverse estimator and comparison with least squares estimator. Communications in Statistics ‐ Theory and Methods 22: 77–84.
  99. Saleh, A.K.M.E. (2006). Theory of Preliminary Test and Stein‐type Estimation with Applications. New York: Wiley.
  100. Saleh, A.K.M.E. and Kibria, B.M.G. (1993). Performances of some new preliminary test ridge regression estimators and their properties. Communications in Statistics ‐ Theory and Methods 22: 2747–2764.
  101. Saleh, A.K.M.E. and Sen, P.K. (1978). Non‐parametric estimation of location parameter after a preliminary test on regression. Annals of Statistics 6: 154–168.
  102. Saleh, A.K.M.E. and Sen, P.K. (1985). On shrinkage R‐estimation in a multiple regression model. Communications in Statistics ‐ Theory and Methods 15: 2229–2244.
  103. Saleh, A.K.M.E., Arashi, M., and Tabatabaey, S.M.M. (2014). Statistical Inference for Models with Multivariate t‐Distributed Errors. Hoboken, NJ: Wiley.
  104. Saleh, A.K.M.E., Arashi, M., Norouzirad, M., and Kibria, B.M.G. (2017). On shrinkage and selection: ANOVA model. Journal of Statistical Research 51 (2): 165–191.
  105. Sarkar, N. (1992). A new estimator combining the ridge regression and the restricted least squares methods of estimation. Communications in Statistics ‐ Theory and Methods 21: 1987–2000.
  106. Sen, P.K. and Saleh, A.K.M.E. (1979). Nonparametric estimation of location parameter after a preliminary test on regression in multivariate case. Journal of Multivariate Analysis 9: 322–331.
  107. Sen, P.K. and Saleh, A.K.M.E. (1985). On some shrinkage estimators of multivariate location. Annals of Statistics 13: 272–281.
  108. Sen, P.K. and Saleh, A.K.M.E. (1987). On preliminary test and shrinkage M‐estimation in linear models. Annals of Statistics 15: 1580–1592.
  109. Shao, J. and Deng, X. (2012). Estimation in high‐dimensional linear models with deterministic design matrices. Annals of Statistics 40 (2): 812–831.
  110. Shi, J. and Lau, T.S. (2000). Empirical likelihood for partially linear models. Journal of Multivariate Analysis 72 (1): 132–148.
  111. Speckman, P. (1988). Kernel smoothing in partial linear models. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 50: 413–436.
  112. Stamey, T., Kabalin, J., McNeal, J. et al. (1989). Prostate specific antigen in the diagnosis and treatment of adenocarcinoma of the prostate II: radical prostatectomy treated patients. The Journal of Urology 16: 1076–1083.
  113. Stein, C. (1956). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In: Proceedings of the 3rd Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, 197–206. University of California Press.
  114. Tam, K. and Kiang, M. (1992). Managerial applications of neural networks: the case of bank failure predictions. Management Science 38 (7): 926–947.
  115. Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 58 (1): 267–288.
  116. Tikhonov, A.N. (1963). Solution of incorrectly formulated problems and the regularization method. Soviet Mathematics Doklady 4: 1035–1038.
  117. Wan, A.T.K. (2002). On generalized ridge regression estimators under collinearity and balanced loss. Applied Mathematics and Computation 129: 455–467.
  118. Wang, X., Dunson, D., and Leng, C. (eds.) (2016). No penalty no tears: least squares in high dimensional linear models. Proceedings of the 33rd International Conference on Machine Learning (ICML 2016).
  119. Woods, H., Steinnour, H.H., and Starke, H.R. (1932). Effect of composition of Portland cement on heat evolved during hardening. Industrial and Engineering Chemistry 24: 1207–1241.
  120. Yuzbasi, B. and Ahmed, S.E. (2015). Shrinkage ridge regression estimators in high‐dimensional linear models. In: Proceedings of the 9th International Conference on Management Science and Engineering Management, Advances in Intelligent Systems and Computing, vol. 362 (ed. J. Xu, S. Nickel, V. Machado, and A. Hajiyev). Berlin, Heidelberg: Springer‐Verlag.
  121. Zhang, M., Tsiatis, A.A., and Davidian, M. (2008). Improving efficiency of inferences in randomized clinical trials using auxiliary covariates. Biometrics 64 (3): 707–715.
  122. Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association 101: 1418–1429.
  123. Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67: 301–320.