Boosting and Comparing Performance of Machine Learning Classifiers with Meta-heuristic Techniques to Detect Code Smell

2024
Shivani Jain and Anju Saha, "Boosting and Comparing Performance of Machine Learning Classifiers with Meta-heuristic Techniques to Detect Code Smell," e-Informatica Software Engineering Journal, vol. 18, no. 1, p. 240107, 2024. DOI: 10.37190/e-Inf240107.

Authors

Shivani Jain, Anju Saha

Abstract

Background: Continuous modifications, suboptimal software design practices, and stringent project deadlines contribute to the proliferation of code smells. Detecting and refactoring these code smells are pivotal to maintaining complex and essential software systems; neglecting them may lead to future software defects, rendering systems difficult to maintain and eventually obsolete. Supervised machine learning techniques have emerged as valuable tools for classifying code smells without requiring expert knowledge or fixed threshold values. Classifier performance can be further enhanced through effective feature selection techniques and the optimization of hyperparameter values.

Aim: The performance of multiple machine learning classifiers is improved by fine-tuning their hyperparameters with various types of meta-heuristic algorithms, including swarm intelligence-based, physics-based, math-based, and bio-inspired ones. The resulting performance measures are compared to identify the best meta-heuristic algorithm in the context of code smell detection, and the impact of optimization is evaluated with statistical tests.

Method: This study employs sixteen contemporary and robust meta-heuristic algorithms to optimize the hyperparameters of two machine learning algorithms: Support Vector Machine (SVM) and k-Nearest Neighbors (k-NN). The No Free Lunch theorem underscores that the success of an optimization algorithm in one application may not necessarily extend to others. Consequently, a rigorous comparative analysis of these algorithms is undertaken to identify the best-fit solutions for code smell detection. A diverse set of optimization algorithms, encompassing Arithmetic, Jellyfish Search, Flow Direction, Student Psychology Based, Pathfinder, Sine Cosine, Jaya, Crow Search, Dragonfly, Krill Herd, Multi-Verse, Symbiotic Organisms Search, Flower Pollination, Teaching Learning Based, Gravitational Search, and Biogeography-Based Optimization, has been implemented.
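To make the tuning procedure concrete, the following minimal sketch pairs one of the sixteen optimizers, the Sine Cosine Algorithm, with an RBF-kernel SVM. It is an illustration only: the synthetic dataset, the search bounds for C and gamma, and the population and iteration settings are assumptions made for this example, not the study's actual configuration.

```python
# Minimal sketch: tuning SVM hyperparameters with the Sine Cosine Algorithm (SCA).
# All settings below (dataset, bounds, budget) are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Assumed search space: log10(C) in [-3, 3], log10(gamma) in [-4, 1].
lo = np.array([-3.0, -4.0])
hi = np.array([3.0, 1.0])

# Stand-in dataset; the study's code smell datasets are not reproduced here.
X, y = make_classification(n_samples=300, n_features=20, random_state=42)

def fitness(pos):
    """Mean 5-fold cross-validated accuracy of an RBF SVM at (log C, log gamma)."""
    C, gamma = 10.0 ** pos
    clf = SVC(C=C, gamma=gamma, kernel="rbf")
    return cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()

n_agents, n_iters, a = 10, 30, 2.0          # small budget, for illustration only
pop = rng.uniform(lo, hi, size=(n_agents, 2))
scores = np.array([fitness(p) for p in pop])
best = pop[scores.argmax()].copy()          # the "destination point" P in SCA
best_score = scores.max()

for t in range(n_iters):
    r1 = a - t * a / n_iters                # amplitude decays linearly toward 0
    for i in range(n_agents):
        r2 = rng.uniform(0.0, 2.0 * np.pi, size=2)
        r3 = rng.uniform(0.0, 2.0, size=2)
        step = np.abs(r3 * best - pop[i])
        if rng.random() < 0.5:              # r4 switches between sine and cosine
            pop[i] = pop[i] + r1 * np.sin(r2) * step
        else:
            pop[i] = pop[i] + r1 * np.cos(r2) * step
        pop[i] = np.clip(pop[i], lo, hi)    # keep the agent inside the bounds
        s = fitness(pop[i])
        if s > best_score:                  # update the destination point
            best, best_score = pop[i].copy(), s

print(f"best log10(C), log10(gamma) = {best}, CV accuracy = {best_score:.4f}")
```

Each agent encodes one candidate (log10 C, log10 gamma) pair; searching in log space is a common choice because SVM hyperparameters span several orders of magnitude. Any of the other fifteen optimizers can be substituted by replacing only the position-update rule, leaving the fitness function untouched.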

Results: For the optimized SVM, the highest attained accuracy, AUC, and F-measure values are 98.75%, 100%, and 98.57%, respectively, with notable gains in accuracy and AUC of up to 32.22% and 45.11%. For the optimized k-NN, the best accuracy, AUC, and F-measure values all reach 100%, with increases in accuracy and ROC-AUC of up to 43.89% and 40.83%, respectively.

Conclusion: The optimized SVM exhibits exceptional performance with the Sine Cosine Optimization algorithm, while k-NN attains its peak performance with the Flower Pollination Optimization algorithm. Statistical analysis underscores the substantial impact of employing meta-heuristic algorithms to optimize machine learning classifiers, significantly enhancing their performance. The optimized SVM excels in detecting the God Class smell, while the optimized k-NN is particularly effective at identifying the Data Class smell. This fusion automates the tuning process and elevates classifier performance, simultaneously addressing multiple longstanding challenges.
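The statistical analysis referred to above relies on the Wilcoxon signed-rank test; the sketch below shows how such a paired comparison can be run in practice. The two score vectors are invented placeholders for illustration, not values from the study.

```python
# Illustrative sketch: testing whether optimized-classifier scores differ
# significantly from baseline scores with the Wilcoxon signed-rank test.
# The paired accuracy values below are made-up placeholders.
from scipy.stats import wilcoxon

baseline  = [0.71, 0.68, 0.75, 0.70, 0.66, 0.73, 0.69, 0.72]  # default hyperparameters
optimized = [0.93, 0.95, 0.97, 0.92, 0.90, 0.96, 0.94, 0.98]  # meta-heuristic tuning

# Non-parametric paired test: H0 says the median of the paired
# differences is zero, i.e., tuning brings no improvement.
stat, p_value = wilcoxon(baseline, optimized)
print(f"W = {stat}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the performance difference is statistically significant.")
```

Because the test is non-parametric, it makes no normality assumption about the paired differences, which is why it is commonly preferred over a paired t-test for small samples of classifier scores.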

Keywords

Code Smell, Machine Learning, Meta-heuristics, Support Vector Machine, k-Nearest Neighbors, Optimization
