Abstract
The twin support vector machine (TSVM) is an effective machine learning tool for classification problems. However, the TSVM classifier follows the empirical risk minimization principle alone, and during training every sample contributes equally, even if it is noise or an outlier. It does not incorporate the uncertainty associated with the data into the model, and hence its generalization ability declines. To address these issues, an intuitionistic fuzzy regularized least squares twin support vector machine is proposed in this paper. The two non-parallel classifiers are obtained by solving two systems of linear equations rather than the two quadratic programming problems of TSVM, which speeds up the training process. Moreover, the method follows both the structural and the empirical risk minimization principles. To reduce the effect of contaminated patterns, the contribution of each training pattern to learning the decision function is weighted according to its importance in the classification. The significance of a training pattern is measured by an intuitionistic fuzzy number based on its geometrical location and surroundings. The method is further extended to find non-parallel decision surfaces in the feature space using a nonlinear kernel function, which again reduces to the solution of two systems of linear equations. To show the efficacy of the proposed method, computer simulations on fourteen standard and six big UCI datasets are performed using linear and Gaussian kernels, and the results are compared with well-established methods from the literature. The experimental results are reported in terms of accuracy, computational time, F-measure, sensitivity and specificity. The comparisons, together with statistical inference using two non-parametric tests (Friedman and Nemenyi), show that the proposed method is fast, outperforms the existing methods, yields better generalization, and is also feasible for big datasets.
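The core computational idea can be illustrated with a minimal sketch of a regularized least squares twin SVM: each of the two non-parallel planes is obtained by solving one linear system instead of a quadratic programming problem. This is the standard unweighted formulation, not the authors' intuitionistic fuzzy version; the function names `fit_lstsvm` and `predict` and the parameters `c1` (trade-off) and `c2` (regularization) are illustrative assumptions.

```python
import numpy as np

def fit_lstsvm(A, B, c1=1.0, c2=1e-4):
    """Fit two non-parallel planes by solving two linear systems.

    A: samples of class +1 (rows), B: samples of class -1 (rows).
    Returns z1 = [w1; b1] and z2 = [w2; b2]."""
    E = np.hstack([A, np.ones((A.shape[0], 1))])   # augmented matrix [A  e]
    F = np.hstack([B, np.ones((B.shape[0], 1))])   # augmented matrix [B  e]
    I = np.eye(E.shape[1])
    # Plane 1: close to class A, at distance from class B.
    # Minimizing (1/2)||E z||^2 + (c1/2)||F z + e||^2 + (c2/2)||z||^2
    # gives the normal equations (E^T E + c1 F^T F + c2 I) z = -c1 F^T e.
    z1 = np.linalg.solve(E.T @ E + c1 * F.T @ F + c2 * I,
                         -c1 * F.T @ np.ones(F.shape[0]))
    # Plane 2: close to class B, at distance from class A (symmetric system).
    z2 = np.linalg.solve(F.T @ F + c1 * E.T @ E + c2 * I,
                         c1 * E.T @ np.ones(E.shape[0]))
    return z1, z2

def predict(X, z1, z2):
    """Assign each sample to the class whose plane is nearer."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xe @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, -1)
```

In the proposed method the matrix multiplying the class-B term would additionally carry the intuitionistic fuzzy score weights, but the training cost remains that of two linear solves.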
Acknowledgements
The authors sincerely thank the reviewers for their recommendations, valuable comments and interesting suggestions, which have considerably improved the presentation of the paper. The first author is also grateful to the Ministry of Human Resource Development, India, for financial support to carry out this work.
Appendix A
For any \(C_1,C_2>0,\) the matrix \(C_1\{H^TH+C_2I+C_1(S_2G)^T(S_2G)\}\) is always invertible.
Proof
Let x be a non-zero column vector of order \((n+1)\times 1.\)
Then \(x^T(H^TH)x=(Hx)^T(Hx)=\Vert Hx\Vert ^2\ge 0.\)
Hence, \(H^TH\) is a positive semi-definite matrix.
Similarly, \(C_1(S_2G)^T(S_2G)\) is a positive semi-definite matrix for any \(C_1>0.\)
Since the sum of two positive semi-definite matrices is again positive semi-definite, \(H^TH+C_1(S_2G)^T(S_2G)\) is a positive semi-definite matrix.
Now, as the identity matrix I is positive definite, for all \(C_1,C_2>0\) the matrix \(H^TH+C_2I+C_1(S_2G)^T(S_2G)\) is positive definite, and multiplying it by the positive scalar \(C_1\) preserves positive definiteness. Hence \(C_1\{H^TH+C_2I+C_1(S_2G)^T(S_2G)\}\) is a positive definite matrix, which in turn yields the result, since every positive definite matrix is invertible. \(\square \)
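The claim can be checked numerically: for any positive \(C_1,C_2\) the smallest eigenvalue of the matrix is bounded below by \(C_1C_2>0\), so the linear systems in the method are always solvable. The sketch below uses illustrative dimensions and a random diagonal score matrix \(S_2\); all sizes and values are assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 5                                   # illustrative: m samples, n features
H = np.hstack([rng.standard_normal((m, n)), np.ones((m, 1))])   # augmented [A  e]
G = np.hstack([rng.standard_normal((m, n)), np.ones((m, 1))])   # augmented [B  e]
S2 = np.diag(rng.uniform(0.1, 1.0, m))         # diagonal intuitionistic score weights
C1, C2 = 0.5, 0.1

SG = S2 @ G
M = C1 * (H.T @ H + C2 * np.eye(n + 1) + C1 * SG.T @ SG)

# Positive definiteness: every eigenvalue exceeds C1*C2 > 0.
eigvals = np.linalg.eigvalsh(M)
print(eigvals.min() > 0)                       # True

# Invertibility: the solve succeeds and the residual vanishes.
x = np.linalg.solve(M, np.ones(n + 1))
print(np.allclose(M @ x, np.ones(n + 1)))      # True
```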
Cite this article
Laxmi, S., Gupta, S.K. & Kumar, S. Intuitionistic fuzzy least square twin support vector machines for pattern classification. Ann Oper Res (2022). https://doi.org/10.1007/s10479-022-04626-2