
Robust noise-aware algorithm for randomized neural network and its convergence properties

Published: 02 July 2024

Abstract

Randomized neural networks (RNNs), such as the random vector functional link (RVFL) network and the extreme learning machine (ELM), are widely used and efficient methods for constructing single-hidden-layer feedforward networks (SLFNs). Owing to their strong approximation capabilities, RNNs have been applied in many fields. However, their performance can degrade unpredictably under imperfect conditions such as weight noise and training-data outliers, so more reliable and robust RNN algorithms are needed. To address this issue, this paper proposes a new objective function for RVFL networks that accounts for the combined effect of weight noise and outliers in the training data. Based on the half-quadratic optimization method, we then derive a novel algorithm, named the noise-aware RNN (NARNN), to optimize the proposed objective, and we theoretically validate its convergence. We also discuss how the NARNN can be applied to ensemble deep RVFL (edRVFL) networks. Finally, we present an extension of the NARNN that concurrently handles weight noise, stuck-at faults, and outliers. Experimental results demonstrate that the proposed algorithm outperforms a number of state-of-the-art robust RNN algorithms.
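The paper's NARNN objective and its weight-noise terms are not reproduced in this abstract. As a rough illustration of the ingredients it combines, the following sketch fits an RVFL regressor (fixed random hidden weights, direct input links, ridge-regularized output weights) and then suppresses training outliers via the iteratively reweighted least-squares fixed point that half-quadratic optimization of a Welsch (correntropy-type) loss reduces to. All function names, parameter values, and the specific Welsch loss are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def rvfl_features(X, W, b):
    """RVFL feature map: random-weight tanh hidden layer plus direct input links."""
    return np.hstack([X, np.tanh(X @ W + b)])

def fit_robust_rvfl(X, y, n_hidden=50, lam=1e-2, sigma=1.0, n_iter=20):
    # Hidden-layer weights are drawn randomly and never trained (the RNN idea);
    # only the output weights beta are solved for in closed form.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    D = rvfl_features(X, W, b)
    I = np.eye(D.shape[1])
    beta = np.linalg.solve(D.T @ D + lam * I, D.T @ y)  # plain ridge start
    for _ in range(n_iter):
        e = y - D @ beta
        # Half-quadratic fixed point for a Welsch (correntropy-type) loss:
        # samples with large residuals receive exponentially small weights.
        w = np.exp(-e**2 / (2 * sigma**2))
        Dw = D * w[:, None]
        beta = np.linalg.solve(D.T @ Dw + lam * I, Dw.T @ y)  # weighted ridge
    return W, b, beta

# Toy regression with a few gross outliers injected into the targets.
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
y[:10] += 5.0  # corrupt 10 training targets
W, b, beta = fit_robust_rvfl(X, y)
pred = rvfl_features(X, W, b) @ beta
clean_mse = np.mean((pred[10:] - y[10:]) ** 2)
```

Each half-quadratic step solves a weighted ridge problem (DᵀWD + λI)β = DᵀWy, so corrupted samples with large residuals contribute almost nothing to the final output weights.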


Published in: Neural Networks, Volume 173, Issue C, May 2024, 703 pages. Publisher: Elsevier Science Ltd., United Kingdom.


Author Tags: Randomized neural network; Half-quadratic; Network resilience; Noise awareness; Outlier samples
