
Golden Lichtenberg algorithm: a Fibonacci sequence approach applied to feature selection

Original Article
Published in Neural Computing and Applications

Abstract

Computational and technological advances have increased data generation and storage capacity, and many annotated datasets are now used to train machine learning models for predictive tasks. Feature selection (FS) is a combinatorial binary optimization problem that arises from the need to reduce dataset dimensionality by finding the subset of features that maximizes predictive accuracy. Although different methodologies have been proposed, metaheuristics adapted to binary optimization have proven to be reliable and efficient techniques for FS. This paper takes the first and, so far, only population-trajectory metaheuristic, the Lichtenberg algorithm (LA), and enhances it with a Fibonacci sequence to improve its exploration capabilities in FS. By replacing the random scale that controls the size of the Lichtenberg figures (LFs) and the distribution of the population in the original version with a sequence based on the golden ratio, a new LF size decay with an optimal exploration–exploitation balance is presented. The resulting golden Lichtenberg algorithm (GLA), which requires few hyperparameters, is then equipped, together with the LA and eight other popular metaheuristics, with a v-shaped transfer function and coupled with the K-nearest neighbor classifier to search for optimized feature subsets in a double cross-validation experiment on 15 UCI machine learning repository datasets. The binary GLA selected reduced feature subsets, achieving the best predictive accuracy and fitness values at the lowest computational cost.
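To make the mechanism concrete, the sketch below is a minimal illustration, not the authors' implementation: it contrasts the original random LF scale with a deterministic golden-ratio decay (the exact decay form used by the GLA is an assumption here) and shows a typical v-shaped transfer function binarizing a continuous position into a 0/1 feature mask.

```python
import numpy as np

PHI = (1 + np.sqrt(5)) / 2  # golden ratio, ~1.618

def random_scales(n_iter, rng=np.random.default_rng(0)):
    """Original LA: the LF size is rescaled by a uniform random factor each iteration."""
    return rng.uniform(0.0, 1.0, size=n_iter)

def golden_scales(n_iter):
    """GLA-style sketch (assumed form): a deterministic LF size decay driven by
    the golden ratio, shrinking from full-size exploration toward exploitation."""
    return (1.0 / PHI) ** np.arange(n_iter)

def v_transfer(x):
    """One common v-shaped transfer function, |tanh(x)|; several variants exist."""
    return np.abs(np.tanh(x))

def binarize(position, rng=np.random.default_rng(1)):
    """Map a continuous position vector to a binary feature mask: each feature is
    selected with probability given by the transfer function."""
    return (rng.random(position.shape) < v_transfer(position)).astype(int)

if __name__ == "__main__":
    print(golden_scales(5))                     # 1.0, 0.618, 0.382, 0.236, 0.146
    print(binarize(np.array([-2.0, 0.1, 3.0])))  # e.g. [1 0 1]
```

Note that the deterministic decay shrinks the LF by a factor of 1/φ ≈ 0.618 per iteration, the same reduction ratio used in golden-section search, which is the link between the Fibonacci sequence and the exploration–exploitation schedule described in the abstract.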


Data availability

The datasets used are available in the UCI Machine Learning Repository at https://archive.ics.uci.edu/ml/index.php. The MATLAB code of the Lichtenberg algorithm is available at https://www.mathworks.com/matlabcentral/fileexchange/84732-lichtenberg-algorithm-la.
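For readers reproducing the experiments, the following is a minimal sketch of a wrapper-style FS fitness evaluation, assuming scikit-learn and a stand-in dataset; the error/subset-size weighting (alpha = 0.99) is a value commonly used in the metaheuristic FS literature, and the paper's exact double cross-validation protocol may differ.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer   # stand-in for a UCI dataset
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fs_fitness(mask, X, y, alpha=0.99, k=5, cv=5):
    """Wrapper fitness (assumed weighting): alpha * cross-validated KNN error on
    the selected features + (1 - alpha) * selected-feature ratio. Lower is better."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():                 # an empty subset is invalid
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                          X[:, mask], y, cv=cv).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.mean()

X, y = load_breast_cancer(return_X_y=True)
all_features = np.ones(X.shape[1], dtype=int)
print(f"fitness with all features: {fs_fitness(all_features, X, y):.4f}")
```

In the full experiment, a binary metaheuristic such as the GLA would minimize this fitness over candidate masks, with an outer cross-validation loop guarding against selection bias.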

Abbreviations

DM: Data mining
ML: Machine learning
FS: Feature selection
NFL: No-free-lunch theorem
LA: Lichtenberg algorithm
GLA: Golden Lichtenberg algorithm
DLA: Diffusion-limited aggregation
LF: Lichtenberg figure
S: Stickiness factor
Np: Number of particles
Rc: Creation radius
Ref: Refinement
M: Figure switching factor
Pop: Population
Niter: Number of iterations
KNN: K-nearest neighbor
BLA: Binary Lichtenberg algorithm
MH: Metaheuristic
PSO: Particle swarm optimization
DE: Differential evolution
GA: Genetic algorithm
MBO: Monarch butterfly optimization
SSA: Salp swarm algorithm
WOA: Whale optimization algorithm
HHO: Harris hawks optimization
MRFO: Manta ray foraging optimization


Acknowledgements

The authors acknowledge the financial support from FAPESP (São Paulo Research Foundation, grants #2023/10419-0, #2022/10683-7, and #2021/06870-3) and FAPEMIG (Fundação de Amparo à Pesquisa do Estado de Minas Gerais, grant APQ-00385-18).

Author information

Corresponding author

Correspondence to João Luiz Junho Pereira.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Pereira, J.L.J., Francisco, M.B., Ma, B.J. et al. Golden Lichtenberg algorithm: a Fibonacci sequence approach applied to feature selection. Neural Comput & Applic (2024). https://doi.org/10.1007/s00521-024-10155-9
