An incremental adaptive neural network model for online noisy data regression and its application to compartment fire studies

Published: 01 January 2011

Abstract

This paper presents a probabilistic-entropy-based neural network (PENN) model for tackling online data regression problems. The network learns online with an incrementally growing network structure and performs regression in noisy environments. Training samples presented to the model are clustered into hyperellipsoidal Gaussian kernels in the joint input-output space using the principles of Bayesian classification and entropy minimization. The joint probability distribution is then established by applying the Parzen density estimator to the kernels, and a prediction is made by evaluating the expected conditional mean of the output space given the input vector. The PENN model is shown to be able to remove symmetrically distributed noise embedded in the training samples. Its performance was evaluated on three benchmark problems with noisy data (Ozone, Friedman#1, and Santa Fe Series E); the results show that the PENN model statistically outperforms other artificial neural network models. The PENN model is also applied to a fire safety engineering problem: predicting the height of the thermal interface, one of the indicators of the fire safety level of a fire compartment. The data samples were collected from a real experiment and are noisy in nature. The results confirm the superior performance of the PENN model in a noisy environment, and the predictions are found to be acceptable according to industrial requirements.
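The abstract's prediction step, evaluating the expected conditional mean of the output given an input vector under a Parzen joint-density estimate, can be illustrated with a minimal sketch. Note this is not the PENN model itself (the incremental kernel growth, Bayesian clustering, and entropy minimization are not reproduced here); with isotropic Gaussian kernels placed on the raw training samples, the conditional mean reduces to the classical Nadaraya-Watson estimator. The function name and bandwidth `h` are illustrative choices, not from the paper.

```python
import numpy as np

def parzen_conditional_mean(x_train, y_train, x_query, h=0.5):
    """Predict E[y | x] from a Parzen (kernel) estimate of the joint density.

    With one isotropic Gaussian kernel per sample, the conditional mean
    integrates to a weight-normalized average of the training outputs,
    weighted by kernel distance in the input space (Nadaraya-Watson).
    """
    # Squared distances from the query point to every training input
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    # Gaussian kernel weights; the density's normalizing constants
    # cancel in the numerator/denominator ratio
    w = np.exp(-d2 / (2.0 * h ** 2))
    return np.dot(w, y_train) / np.sum(w)

# Noisy samples of y = sin(x): averaging under the kernel weights
# suppresses symmetric zero-mean noise, the effect the paper claims
# for the PENN model.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.1, size=200)
pred = parzen_conditional_mean(X, y, np.array([np.pi / 2]), h=0.3)
print(float(pred))  # close to sin(pi/2) = 1, despite the noise
```

The estimate is slightly biased toward neighbouring function values (kernel smoothing flattens the peak), which is why adaptive kernel placement and shaping, as in the PENN model, can improve on a fixed-bandwidth estimator.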


Cited By

  • A Constrained Optimization based Extreme Learning Machine for noisy data regression. Neurocomputing 171:C (2016), 1431-1443. doi:10.1016/j.neucom.2015.07.065
  • A SVR-based ensemble approach for drifting data streams with recurring patterns. Applied Soft Computing 47:C (2016), 553-564. doi:10.1016/j.asoc.2016.06.030
  • A fully autonomous kernel-based online learning neural network model and its application to building cooling load prediction. Soft Computing 18:10 (2014), 1999-2014. doi:10.1007/s00500-013-1181-9


Published In

Applied Soft Computing, Volume 11, Issue 1
January 2011, 1490 pages

Publisher

Elsevier Science Publishers B. V., Netherlands

Author Tags

1. Artificial neural network
2. Compartment fire
3. Kernel regression
