Abstract
The input to stock market prediction is typically a period of stock price data with time-series characteristics, which changes continuously over time and carries complex background relationships. Effectively mining and fusing multiple heterogeneous sources of stock market data is difficult for traditional recurrent neural networks (RNNs). To address this problem, we structure the regression model as an encoder and a decoder. We first apply an RNN-based technique to impute missing values, then use a fusion of a bidirectional gated recurrent unit (BiGRU) and a bidirectional long short-term memory network (BiLSTM) as the encoder to extract hidden features. Finally, a group method of data handling (GMDH) model serves as the decoder, producing stock market predictions from the extracted time-series features. This process yields a deep heuristic evolutionary regression model based on the fusion of BiGRU and BiLSTM (BBGMDH). Extensive experiments on four real stock datasets show that BBGMDH significantly outperforms existing methods, verifying the effectiveness of the encoding-decoding stepwise regression model for stock price prediction. The reason is that the encoding layer exploits the strong time-series processing ability of RNNs to extract the hidden features of stock data, while the decoding layer uses the GMDH heuristic evolutionary computation method, which simulates the biological "genetic-mutation-selection-evolution" process, for the regression task, making full use of their complementary properties. This provides a new solution to the regression prediction problem.
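To make the GMDH decoding step concrete, the following is a minimal sketch of one GMDH selection layer under stated assumptions: it is not the authors' BBGMDH implementation, and the function name, polynomial form, and selection size are illustrative choices. Each candidate neuron fits an Ivakhnenko quadratic polynomial on a pair of input features by least squares, and only the best candidates on held-out data survive to the next layer, mirroring the "selection" step of the heuristic evolutionary process described above.

```python
# Hypothetical sketch of a single GMDH layer (illustrative, not BBGMDH).
import itertools
import numpy as np

def _poly_features(a, b):
    # Ivakhnenko polynomial terms: 1, a, b, a*b, a^2, b^2
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    """Fit every pairwise polynomial neuron; keep the `keep` best by validation MSE."""
    candidates = []
    for i, j in itertools.combinations(range(X_train.shape[1]), 2):
        P = _poly_features(X_train[:, i], X_train[:, j])
        w, *_ = np.linalg.lstsq(P, y_train, rcond=None)  # least-squares fit
        pred_val = _poly_features(X_val[:, i], X_val[:, j]) @ w
        mse = float(np.mean((pred_val - y_val) ** 2))
        candidates.append((mse, (i, j), w))
    candidates.sort(key=lambda t: t[0])   # "selection": lowest validation error survives
    return candidates[:keep]              # each entry: (val_mse, input pair, coefficients)
```

In a full GMDH model, the surviving neurons' outputs become the inputs of the next layer, and layers are stacked until the external (validation) criterion stops improving; in BBGMDH the inputs to this stage would be the features produced by the BiGRU-BiLSTM encoder.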
Data Availability
The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.
Funding
This work was financially supported by the Key International Cooperation Projects of the National Natural Science Foundation of China (61860206004), the National Natural Science Foundation of China (62176085, 62172458, 61672114), the Natural Science Research Project of Anhui Province (1908085MF185), the Talent Fund of Hefei University (20RC25), and the Industry-University-Research Cooperation Projects under Grants No. GP/026/2020 and HF-010-2021, Zhuhai City, Guangdong Province, China.
Ethics declarations
Ethical Approval
This article does not contain any studies with human or animal subjects performed by any of the authors.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
Conflict of Interest
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Xu, L., Xu, W., Cui, Q. et al. Deep Heuristic Evolutionary Regression Model Based on the Fusion of BiGRU and BiLSTM. Cogn Comput 15, 1672–1686 (2023). https://doi.org/10.1007/s12559-023-10135-6