Novel self-adjusted particle swarm optimization algorithm for feature selection

  • Regular Paper
  • Published in: Computing

Abstract

Due to the ever-increasing number of features in many practical application fields, e.g., expert and intelligent systems, feature selection (FS) has become a promising pre-processing step for tasks such as classification and regression over the last few decades. FS aims to select the optimal feature subset from the original feature set by removing redundant and irrelevant features, which improves the performance of the learning models. In this paper, a novel self-adjusted particle swarm optimization algorithm (SAPSO) is proposed to select the optimal feature subset for classification datasets. SAPSO makes three improvements. First, a new learning model of particles, which can extract much more useful knowledge from multiple information providers, is used to enhance the diversity of particles. Second, a one-flip neighborhood search strategy is adopted to strengthen the local search ability of the swarm when it enters a period of stagnation. Finally, a population replacement process, based on the new particles generated by the neighborhood search strategy, is conducted to enhance the diversity of the swarm. Moreover, the k-nearest neighbor (kNN) method is used as a classifier to evaluate the classification accuracy of a particle. The proposed method is benchmarked on 10 well-known UCI datasets and compared with 9 state-of-the-art wrapper-based FS methods. The results show that the proposed approach significantly outperforms the others on most of the 10 datasets.
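As a rough illustration of the wrapper evaluation described above (a minimal sketch, not the authors' SAPSO implementation), the Python snippet below scores a binary feature mask with a kNN classifier and runs a greedy one-flip neighborhood search that toggles one feature at a time. The function names, the stand-in wine dataset, and the parameters k=5 and 5-fold cross-validation are assumptions for illustration; the SAPSO learning model and population replacement step are omitted.

```python
# Illustrative sketch only: a kNN-based wrapper fitness for a binary feature
# mask, plus a greedy one-flip neighborhood search. Function names and the
# parameters k=5 / cv=5 are assumptions, not taken from the paper.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def knn_fitness(mask, X, y, k=5, cv=5):
    """Mean cross-validated accuracy of kNN on the selected feature subset."""
    if not mask.any():                        # an empty subset is unusable
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X[:, mask], y, cv=cv).mean()

def one_flip_search(mask, X, y, rng):
    """Visit features in random order, toggling each bit once and keeping
    any flip that improves the kNN fitness (first-improvement local search)."""
    best = mask.copy()
    best_fit = knn_fitness(best, X, y)
    for j in rng.permutation(mask.size):
        cand = best.copy()
        cand[j] = not cand[j]                 # flip feature j in or out
        fit = knn_fitness(cand, X, y)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

X, y = load_wine(return_X_y=True)             # stand-in dataset for the demo
rng = np.random.default_rng(42)
init = rng.random(X.shape[1]) < 0.5           # random initial binary mask
subset, acc = one_flip_search(init, X, y, rng)
print(f"selected {subset.sum()} of {X.shape[1]} features, accuracy {acc:.3f}")
```

In a wrapper method of this kind, each flip costs a full classifier evaluation, which is why the abstract reserves the neighborhood search for periods of swarm stagnation rather than applying it at every iteration.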


Acknowledgements

This study was funded by the National Natural Science Foundation of China (Grant Nos. 61806204, 61663009, 61672466), the Joint Fund of Zhejiang Provincial Natural Science Foundation (LSZ19F010001), the Opening Foundation of the Key Lab of Intelligent Optimization and Information Processing, Minnan Normal University (No. ZNYH202002), and the Science and Technology Plan Projects of Zhangzhou (Grant No. ZZ2020J06).

Author information

Corresponding author

Correspondence to Xuewen Xia.

Ethics declarations

Conflict of interest

We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Wei, B., Wang, X., Xia, X. et al. Novel self-adjusted particle swarm optimization algorithm for feature selection. Computing 103, 1569–1597 (2021). https://doi.org/10.1007/s00607-020-00891-w

