
Abstract

Support Vector Regression (SVR) formulates the learning of a regression function, mapping input predictor variables to observed response values, as an optimization problem. SVR is useful because it balances model complexity against prediction error, and it performs well on high-dimensional data. In this paper, we use the SVR model to improve the principal component analysis and factor analysis methods. Simulation experiments are performed to assess the new method, and applications to real data sets are presented to compare the competing SVR models. It is noted that, as the sample size increases, the ν-SVR type under principal component analysis is the best model; however, for small sample sizes the SVR type under factor analysis provides adequate results.
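As an illustration of the kind of pipeline the abstract describes, the sketch below pairs PCA with ν-SVR and factor analysis with ε-SVR using scikit-learn. It is a minimal sketch, not the authors' procedure: the synthetic data, the number of retained components, the RBF kernel, and all hyperparameter values are assumptions made only for demonstration.

```python
# Minimal sketch (not the authors' exact procedure): compare nu-SVR on
# principal components with epsilon-SVR on factor scores. Data, component
# counts, kernel, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR, NuSVR

# Synthetic high-dimensional predictors driven by a smaller set of latent directions.
X, y = make_regression(n_samples=300, n_features=40, n_informative=8,
                       noise=10.0, random_state=0)

pipelines = {
    "PCA + nu-SVR": make_pipeline(StandardScaler(), PCA(n_components=8),
                                  NuSVR(kernel="rbf", C=10.0, nu=0.5)),
    "FA + eps-SVR": make_pipeline(StandardScaler(), FactorAnalysis(n_components=8),
                                  SVR(kernel="rbf", C=10.0, epsilon=0.1)),
}

for name, pipe in pipelines.items():
    # 5-fold cross-validated R^2 as a rough accuracy comparison of the two pipelines.
    scores = cross_val_score(pipe, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f} (std {scores.std():.3f})")
```

In this kind of setup, the dimension-reduction step (PCA or factor analysis) is fitted inside the pipeline so that it is re-estimated on each training fold, which keeps the cross-validated comparison between the two SVR variants fair.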

Keywords

Support Vector Regression, Factor Analysis, Kernel Functions, Principal Component Analysis, ν-Support Vector Regression

Article Details

How to Cite
Salem, M., & Khalil, M. G. (2022). The Support Vector Regression Model: A new Improvement for some Data Reduction Methods with Application. Pakistan Journal of Statistics and Operation Research, 18(2), 427-435. https://doi.org/10.18187/pjsor.v18i2.4049
