
Intuitionistic fuzzy least square twin support vector machines for pattern classification

Original Research, Annals of Operations Research

Abstract

Twin support vector machine (TSVM) is an effective machine learning tool for classification problems. However, the TSVM classifier follows the empirical risk minimization principle only, and during training every sample contributes equally, even if it is noise or an outlier. It does not incorporate the uncertainty associated with the data into the model, and hence its generalization ability declines. To address these issues, this paper proposes an intuitionistic fuzzy regularized least square twin support vector machine based on intuitionistic fuzzy numbers. The non-parallel classifiers are obtained by solving two systems of linear equations rather than the two quadratic programming problems of TSVM, which speeds up the training process. Moreover, the method follows both the structural and the empirical risk minimization principles. To de-escalate the effect of pollutant patterns, each pattern contributes to learning the decision function according to its importance in the classification. The significance of a training pattern is measured by an intuitionistic fuzzy number based on its geometric location and surroundings. The method is further extended to find non-parallel decision planes in the feature space using a nonlinear kernel function, which again reduces to the solution of two systems of linear equations. To show the efficacy of the proposed method, computer simulations on fourteen standard and six big UCI datasets using linear and Gaussian kernels are performed, and the results are compared with well-established methods from the literature in terms of accuracy, computational time, F-measure, sensitivity and specificity. The outcomes demonstrate that the proposed method outperforms the existing methods and is also feasible for big datasets. Statistical comparison using two non-parametric tests, the Friedman and Nemenyi tests, confirms that the proposed approach is fast and yields better generalization.
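The geometric scoring idea can be illustrated with a small sketch. The function below is a generic example of assigning an intuitionistic fuzzy (membership, non-membership) pair to each training sample from its location and neighbourhood; the function name, the centroid-distance membership, and the neighbour-ratio non-membership are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def ifn_scores(X_same, X_other, radius=1.0, delta=1e-4):
    """For each sample in X_same, return a (membership, non-membership) pair:
    membership decays with distance to the class centroid, non-membership
    grows with the share of opposite-class points in a fixed-radius ball.
    Generic illustration only -- not the paper's exact scheme."""
    centroid = X_same.mean(axis=0)
    d = np.linalg.norm(X_same - centroid, axis=1)
    mu = 1.0 - d / (d.max() + delta)              # membership in (0, 1]
    nu = np.zeros(len(X_same))
    for i, x in enumerate(X_same):
        same = np.sum(np.linalg.norm(X_same - x, axis=1) <= radius) - 1
        other = np.sum(np.linalg.norm(X_other - x, axis=1) <= radius)
        total = same + other
        if total > 0:
            nu[i] = (1.0 - mu[i]) * other / total  # keeps mu + nu <= 1
    return mu, nu

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=0.0, size=(30, 2))          # toy positive class
X_neg = rng.normal(loc=2.0, size=(30, 2))          # toy negative class
mu, nu = ifn_scores(X_pos, X_neg)
```

Samples deep inside their own class receive a high membership and low non-membership, while samples surrounded by the opposite class (likely noise or outliers) receive a high non-membership and therefore a reduced weight in training.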
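The Friedman test used for the comparisons ranks the compared classifiers on each dataset and tests whether their average ranks differ. A from-scratch sketch of the statistic, on made-up accuracy values:

```python
# Friedman statistic computed from scratch: rank k classifiers on each of
# N datasets (rank 1 = best accuracy, ties averaged), then compare average
# ranks. The accuracy values below are hypothetical illustrations.

def friedman_statistic(scores):
    """scores[i][j]: accuracy of classifier j on dataset i (higher is better)."""
    N, k = len(scores), len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        order = sorted(range(k), key=lambda j: -row[j])  # descending accuracy
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1                                   # extend tie group
            avg = (i + j) / 2 + 1                        # average rank for ties
            for t in range(i, j + 1):
                ranks[order[t]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    avg_ranks = [s / N for s in rank_sums]
    chi2 = 12 * N / (k * (k + 1)) * (
        sum(r * r for r in avg_ranks) - k * (k + 1) ** 2 / 4)
    return chi2, avg_ranks

# hypothetical accuracies of three classifiers on six datasets
scores = [[0.91, 0.92, 0.94], [0.85, 0.86, 0.88], [0.88, 0.90, 0.91],
          [0.90, 0.91, 0.93], [0.79, 0.80, 0.83], [0.93, 0.94, 0.95]]
chi2, avg_ranks = friedman_statistic(scores)
```

A large statistic rejects the hypothesis that all classifiers perform equally, after which a post-hoc Nemenyi test compares the average ranks pairwise.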



Acknowledgements

The authors sincerely thank the reviewers for their valuable comments and interesting suggestions, which have considerably improved the presentation of the paper. The first author is also grateful to the Ministry of Human Resource Development, India, for financial support to carry out this work.

Corresponding author

Correspondence to S. K. Gupta.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A

For any \(C_1,C_2>0,\) the matrix \(C_1\{H^TH+C_2I+C_1(S_2G)^T(S_2G)\}\) is always invertible.

Proof

Let x be a non-zero column vector of order \( (n+1)\times 1\).

Now,

$$\begin{aligned} x^T(H^TH)x&=(Hx)^T(Hx) \\ &=\Vert Hx\Vert ^2 \\ &\ge 0. \end{aligned}$$

Hence, \(H^TH\) is a positive semi-definite matrix.

Similarly, \((S_2G)^T(S_2G)\) is positive semi-definite, and hence so is \(C_1(S_2G)^T(S_2G)\) for any \(C_1>0.\)

Since the sum of two positive semi-definite matrices is again positive semi-definite, \(H^TH+C_1(S_2G)^T(S_2G)\) is a positive semi-definite matrix.

Now, as the identity matrix I is positive definite, \(C_2I\) is positive definite for any \(C_2>0.\) Hence, for all \(C_1,C_2>0,\) the matrix \(C_1\{H^TH+C_2I+C_1(S_2G)^T(S_2G)\}\) is positive definite, which in turn yields the result. \(\square \)
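The argument above can be checked numerically. The sketch below builds the matrix with random stand-ins for \(H\), \(G\) and \(S_2\) (illustrative placeholders, not the paper's data), verifies positive definiteness through its eigenvalues, and then solves one linear system of the kind the non-parallel classifiers require; the right-hand side is an arbitrary illustration, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random stand-ins: each data matrix carries an appended bias column,
# so the unknown vector has the (n + 1) x 1 order used in the proof.
m1, m2, n = 30, 25, 4
H = np.hstack([rng.normal(size=(m1, n)), np.ones((m1, 1))])
G = np.hstack([rng.normal(size=(m2, n)), np.ones((m2, 1))])
S2 = np.diag(rng.uniform(0.2, 1.0, size=m2))   # diagonal score matrix
C1, C2 = 0.7, 0.3

SG = S2 @ G
M = C1 * (H.T @ H + C2 * np.eye(n + 1) + C1 * (SG.T @ SG))

# M is symmetric, so eigvalsh applies; all eigenvalues positive
# means M is positive definite and therefore invertible.
eigs = np.linalg.eigvalsh(M)

# One system of linear equations (arbitrary right-hand side for illustration),
# solved directly instead of via a quadratic program.
b = -SG.T @ np.ones(m2)
z = np.linalg.solve(M, b)
```

Because the \(C_2I\) term bounds the spectrum of M away from zero, the direct solve is well conditioned for any positive regularization parameters.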


Cite this article

Laxmi, S., Gupta, S.K. & Kumar, S. Intuitionistic fuzzy least square twin support vector machines for pattern classification. Ann Oper Res (2022). https://doi.org/10.1007/s10479-022-04626-2
