Abstract

Global optimization problems arise in many scientific and engineering domains and are often addressed with population-based metaheuristic algorithms. Among these, differential evolution (DE) and particle swarm optimization (PSO) are widely applied because of their robustness on nonconvex, black-box, and constrained objectives. The performance of DE and PSO depends strongly on their control parameters, yet these are often set by manual trial and error rather than by a principled, automated procedure. This study applies sequential model-based optimization (SMBO) with Gaussian process surrogates to systematically tune the parameters of DE and PSO under a fixed evaluation budget. The framework uses Matérn and radial basis function (RBF) kernels within a Bayesian optimization (BO) loop, supported by early-stopping criteria to ensure efficient and reliable convergence. The approach is evaluated on three classical constrained engineering design problems: welded beam design (WBD), pressure vessel design (PVD), and compression spring design (CSD). Constraints are handled through a penalty formulation, and tuned configurations are compared with standard defaults across multiple independent trials under equalized evaluation budgets. Results show that SMBO-tuned DE and PSO achieve better solution quality and lower variability than their default counterparts. These findings demonstrate the effectiveness of surrogate-based parameter tuning in improving the reliability, stability, and reproducibility of metaheuristic algorithms for real-world constrained global optimization problems.
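
To make the tuning pipeline concrete, the sketch below pairs SciPy's differential_evolution with scikit-optimize's gp_minimize, a GP-based sequential optimizer whose default surrogate uses a Matérn kernel. The penalized toy objective, the ranges for F, CR, and the population-size multiplier, and the evaluation budgets are illustrative placeholders, not the WBD/PVD/CSD formulations or settings used in this study.

```python
# Minimal, illustrative sketch of the SMBO tuning loop: Gaussian-process
# Bayesian optimization (gp_minimize) searches over DE control parameters,
# while SciPy's differential_evolution solves a penalized toy problem.
import numpy as np
from scipy.optimize import differential_evolution
from skopt import gp_minimize
from skopt.space import Integer, Real

def penalized_objective(x):
    # Toy problem: minimize f(x) subject to g(x) <= 0, with a static
    # quadratic penalty standing in for the paper's penalty formulation.
    f = (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
    g = x[0] + x[1] - 2.5  # constraint g(x) <= 0
    return f + 1e4 * max(0.0, g) ** 2

def tuning_target(params):
    # Score one DE configuration (F, CR, popsize) by its mean best value
    # over a few seeded replications under a fixed evaluation budget.
    F, CR, popsize = params
    best = []
    for seed in range(3):
        res = differential_evolution(
            penalized_objective, bounds=[(-5, 5), (-5, 5)],
            mutation=float(F), recombination=float(CR), popsize=int(popsize),
            maxiter=50, tol=0.0, seed=seed, polish=False,
        )
        best.append(res.fun)
    return float(np.mean(best))

space = [Real(0.1, 1.9, name="F"),        # DE mutation factor
         Real(0.0, 1.0, name="CR"),       # crossover rate
         Integer(5, 40, name="popsize")]  # population-size multiplier

result = gp_minimize(tuning_target, space, n_calls=25, random_state=0)
print("tuned (F, CR, popsize):", result.x, "| mean best f:", result.fun)
```

Keeping the inner DE budget fixed across every candidate configuration, as above, mirrors the equalized-budget protocol under which tuned and default parameters are compared.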

Keywords

  • Metaheuristics
  • Differential evolution (DE)
  • Particle swarm optimization (PSO)
  • Sequential model-based optimization (SMBO)
  • Parameter tuning
  • Constraint handling
  • Engineering design
  • Fixed evaluation budget
  • Statistical validation

References

  1. Arora, J. S. (2011). Introduction to Optimum Design. Academic Press.
  2. Rao, S. S. (2019). Engineering Optimization: Theory and Practice. John Wiley & Sons.
  3. Storn, R., & Price, K. (1997). Differential evolution – A simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4), 341–359.
  4. Kennedy, J., & Eberhart, R. (1995, November). Particle swarm optimization. In Proceedings of ICNN'95 – International Conference on Neural Networks (Vol. 4, pp. 1942–1948). IEEE.
  5. Parouha, R. P., & Verma, P. (2022). A systematic overview of developments in differential evolution and particle swarm optimization with their advanced suggestion. Applied Intelligence, 52(12), 10448–10492. https://doi.org/10.1007/s10489-021-02557-4
  6. Hutter, F., Hoos, H. H., & Leyton-Brown, K. (2010). Sequential model-based optimization for general algorithm configuration (extended version) (Technical Report TR-2010-10). University of British Columbia, Department of Computer Science.
  7. Rasmussen, C. E., & Williams, C. K. I. (2006). Gaussian Processes for Machine Learning. MIT Press.
  8. Mockus, J., Tiesis, V., & Žilinskas, A. (1978). The application of Bayesian methods for seeking the extremum. In L. C. W. Dixon & G. P. Szegö (Eds.), Towards Global Optimization 2 (pp. 117–129). North-Holland.
  9. Jones, D. R., Schonlau, M., & Welch, W. J. (1998). Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4), 455–492. https://doi.org/10.1023/A:1008306431147
  10. Bartz-Beielstein, T., Lasarczyk, C., & Preuss, M. (2005). Sequential parameter optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation (CEC 2005) (pp. 773–780). IEEE. https://doi.org/10.1109/CEC.2005.1554721
  11. Hutter, F., Hoos, H. H., Leyton-Brown, K., & Murphy, K. P. (2009). An experimental investigation of model-based parameter optimization: SPO and beyond. In Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation (GECCO ’09) (pp. 271–278). ACM. https://doi.org/10.1145/1569901.1569933
  12. Bartz-Beielstein, T. (2010). Sequential parameter optimization—an annotated bibliography.
  13. Boçi, B., & Simoni, A. (2025). Refining wage predictions with machine learning and Bayesian optimization. Mathematics and Statistics, 13(5), 279–296. https://doi.org/10.13189/ms.2025.130503
  14. Bergstra, J., Komer, B., Eliasmith, C., & Warde-Farley, D. (2014). Preliminary evaluation of Hyperopt algorithms on HPOLib. Proceedings of the ICML 2014 AutoML Workshop.
  15. Eiben, A. E., & Smit, S. K. (2011). Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm and Evolutionary Computation, 1(1), 19–31.
  16. Brest, J., Greiner, S., Bošković, B., Mernik, M., & Žumer, V. (2006). Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE Transactions on Evolutionary Computation, 10(6), 646–657. https://doi.org/10.1109/TEVC.2006.872133
  17. Qin, A. K., Huang, V. L., & Suganthan, P. N. (2009). Differential evolution algorithm with strategy adaptation for global numerical optimization. IEEE Transactions on Evolutionary Computation, 13(2), 398–417. https://doi.org/10.1109/TEVC.2008.927706
  18. Zamuda, A., & Brest, J. (2015). Self-adaptive control parameters’ randomization frequency and propagations in differential evolution. Swarm and Evolutionary Computation, 25, 72–99. https://doi.org/10.1016/j.swevo.2015.01.004
  19. Das, S., Mullick, S. S., & Suganthan, P. N. (2016). Recent advances in differential evolution: An updated survey. Swarm and Evolutionary Computation, 27, 1–30. https://doi.org/10.1016/j.swevo.2016.01.004
  20. Ahmad, M. F., Isa, N. A. M., Lim, W. H., & Ang, K. M. (2022). Differential evolution: A recent review based on state-of-the-art works. Alexandria Engineering Journal, 61(10), 3831–3872. https://doi.org/10.1016/j.aej.2021.09.013
  21. Gianni, A. M., Tsoulos, I. G., Charilogis, V., & Kyrou, G. (2025). Enhancing Differential Evolution: A Dual Mutation Strategy with Majority Dimension Voting and New Stopping Criteria. Symmetry, 17(6), 844. https://doi.org/10.3390/sym17060844
  22. Shi, Y., & Eberhart, R. C. (1998). A modified particle swarm optimizer. In 1998 IEEE International Conference on Evolutionary Computation Proceedings (pp. 69–73). IEEE. https://doi.org/10.1109/ICEC.1998.699146
  23. Clerc, M., & Kennedy, J. (2002). The particle swarm—Explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation, 6(1), 58–73. https://doi.org/10.1109/4235.985692
  24. Ganesan, T., Vasant, P., & Elamvazuthi, I. (2012). A hybrid PSO approach for solving non-convex optimization problems. Archives of Control Sciences, 22(1), 87–105.
  25. Yun, Y., Gen, M., & Erdene, T. N. (2023). Applying GA-PSO-TLBO approach to engineering optimization problems. Mathematical Biosciences and Engineering, 20(1), 552–571. https://doi.org/10.3934/mbe.2023025
  26. Bonyadi, M. R., & Michalewicz, Z. (2017). Particle swarm optimization for single objective continuous space problems: A review. Evolutionary Computation, 25(1), 1–54.
  27. Abualigah, L. (2025). Particle swarm optimization: Advances, applications, and experimental insights. Computers, Materials & Continua, 82(2), 1539–1556. https://doi.org/10.32604/cmc.2025.060765
  28. Piotrowski, A. P., Napiorkowski, J. J., & Piotrowska, A. E. (2023). Particle swarm optimization or differential evolution—A comparison. Engineering Applications of Artificial Intelligence, 121, 106008. https://doi.org/10.1016/j.engappai.2023.106008
  29. Zhang, Y., Li, H., Bao, E., Zhang, L., & Yu, A. (2019). A hybrid global optimization algorithm based on particle swarm optimization and Gaussian process. International Journal of Computational Intelligence Systems, 12(2), 1270–1281. https://doi.org/10.2991/ijcis.d.191101.004
  30. Klus, J., Grunt, P., & Dobrovolný, M. (2021). Hyper-optimization with Gaussian process and differential evolution algorithm. Neural Information Processing Systems (NeurIPS) BlackBox Optimization Challenge 2020. arXiv:2101.10625. https://arxiv.org/abs/2101.10625
  31. Vincent, A. M., & Jidesh, P. (2023). An improved hyperparameter optimization framework for AutoML systems using evolutionary algorithms. Scientific Reports, 13, 4737.
  32. Roman, I., Ceberio, J., Mendiburu, A., & Lozano, J. A. (2016, July). Bayesian optimization for parameter tuning in evolutionary algorithms. In 2016 IEEE Congress on Evolutionary Computation (CEC) (pp. 4839–4845). IEEE.
  33. Trindade, Á. R., & Campelo, F. (2019). Tuning metaheuristics by sequential optimisation of regression models. Applied Soft Computing, 85, 105829. https://doi.org/10.1016/j.asoc.2019.105829
  34. Atkinson, L., Müller-Bady, R., & Kappes, M. (2020, July). Hybrid Bayesian evolutionary optimization for hyperparameter tuning. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference Companion (pp. 225–226). ACM.
  35. Ruther, C., & Rieck, J. (2021). A Bayesian optimization approach for tuning a genetic algorithm solving practical-oriented pickup and delivery problems. IEEE Transactions on Automation Science and Engineering, 1–12.
  36. Rüther, C., & Rieck, J. (2024). A Bayesian optimization approach for tuning a grouping genetic algorithm for solving practically oriented pickup and delivery problems. Logistics, 8(1), 14.
  37. Soares, R. C., Silva, J. C., Lucena Junior, J. A. de, Lima Filho, A. C., Ramos, J. G. G. de S., & Brito, A. V. (2025). Integration of Bayesian optimization into hyperparameter tuning of the particle swarm optimization algorithm to enhance neural networks in bearing failure classification. Engineering Applications of Artificial Intelligence, 135, 107252. https://doi.org/10.1016/j.measurement.2024.115829
  38. Coello, C. A. C. (2000). Use of a self-adaptive penalty approach for engineering optimization problems. Computers in Industry, 41(2), 113–127.
  39. Coello, C. A. C. (2002). Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Computer Methods in Applied Mechanics and Engineering, 191(11–12), 1245–1287. https://doi.org/10.1016/S0045-7825(01)00323-1
  40. Homaifar, A., Qi, C. X., & Lai, S. H. (1994). Constrained optimization via genetic algorithms. Simulation, 62(4), 242–253.
  41. Mezura-Montes, E., & Coello Coello, C. A. (2011). Constraint-handling in nature-inspired numerical optimization: Past, present and future. Swarm and Evolutionary Computation, 1(4), 173–194. https://doi.org/10.1016/j.swevo.2011.10.001
  42. Li, X. (2010). Niching without niching parameters: Particle swarm optimization using a ring topology. IEEE Transactions on Evolutionary Computation, 14(1), 150–169. https://doi.org/10.1109/TEVC.2009.2017514
  43. Pluháček, M., Šenkeřík, R., Viktorin, A., & Kadavý, T. (2018). Study on velocity clamping in PSO using CEC13 benchmark. In Proceedings of the European Council for Modelling and Simulation (ECMS).
  44. Lim, S. P., Hoon, H., & Song, W. H. (2020, May). A comparative study on different parameter factors and velocity clamping for particle swarm optimisation. In IOP Conference Series: Materials Science and Engineering (Vol. 864, No. 1, p. 012068). IOP Publishing.
  45. Li, X., Mao, K., Lin, F., & Zhang, X. (2021). Particle swarm optimization with state-based adaptive velocity limit strategy. Neurocomputing, 447, 64–79.
  46. Zielinski, K., & Laur, R. (2008). Stopping criteria for differential evolution in constrained single-objective optimization. In Advances in Differential Evolution (pp. 111–138). Springer.
  47. Snoek, J., Larochelle, H., & Adams, R. P. (2012). Practical Bayesian optimization of machine learning algorithms. Advances in Neural Information Processing Systems, 25.
  48. Wilcoxon, F. (1945). Individual comparisons by ranking methods. Biometrics Bulletin, 1(6), 80–83.
  49. Hollander, M., Wolfe, D. A., & Chicken, E. (2013). Nonparametric Statistical Methods. John Wiley & Sons.
  50. Derrac, J., García, S., Molina, D., & Herrera, F. (2011). A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation, 1(1), 3–18.
  51. Li, J., & Sun, K. (2023). Pressure vessel design problem using improved Gray Wolf optimizer based on Cauchy distribution. Applied Sciences, 13(22), 12290.
  52. Houssein, E. H., Saad, M. R., Hashim, F. A., Shaban, H., & Hassaballah, M. (2020). Lévy flight distribution: A new metaheuristic algorithm for solving engineering optimization problems. Engineering Applications of Artificial Intelligence, 94, 103731.
  53. Rather, S. A., & Bala, P. S. (2021). Application of constriction coefficient-based particle swarm optimisation and gravitational search algorithm for solving practical engineering design problems. International Journal of Bio-Inspired Computation, 17(4), 246–259.
  54. El-Shorbagy, M. A., & El-Refaey, A. M. (2022). A hybrid genetic–firefly algorithm for engineering design problems. Journal of Computational Design and Engineering, 9(2), 706–730.
  55. Shami, T. M., Mirjalili, S., Al-Eryani, Y., Daoudi, K., Izadi, S., & Abualigah, L. (2023). Velocity pausing particle swarm optimization: A novel variant for global optimization. Neural Computing and Applications, 35(12), 9193–9223.
  56. Qiao, J., Wang, G., Yang, Z., Luo, X., Chen, J., Li, K., & Liu, P. (2024). A hybrid particle swarm optimization algorithm for solving engineering problem. Scientific Reports, 14(1), 8357.
  57. Xia, H., Ke, Y., Liao, R., & Zhang, H. (2025). Fractional order dung beetle optimizer with reduction factor for global optimization and industrial engineering optimization problems. Artificial Intelligence Review, 58(10), 308.