Abstract
A new dynamic tree-structured network, the Stochastic Competitive Evolutionary Neural Tree (SCENT), is introduced. The network provides a hierarchical classification of unlabelled data sets. The main advantage that SCENT offers over other hierarchical competitive networks is its ability to determine the number and structure of the competitive nodes in the network without the need for externally set parameters. The network produces stable classificatory structures by halting its growth using locally calculated, stochastically controlled heuristics. The performance of the network is analysed by comparing its results with those of a good non-hierarchical clusterer, three other hierarchical clusterers, and its non-stochastic predecessor. SCENT's classificatory capabilities are demonstrated by its ability to produce a representative hierarchical structure for a broad range of data sets.
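The abstract's central idea, a tree of competitive nodes that grows only when a stochastic, locally computed heuristic permits it, can be illustrated with a toy sketch. This is not the published SCENT algorithm: the jitter, learning rate, temperature, and the exponential growth probability below are all illustrative assumptions, chosen only to show the general shape of competitive learning plus a stochastic split decision.

```python
import math
import random

class Node:
    """A competitive node holding a prototype vector and child nodes."""
    def __init__(self, centre):
        self.centre = list(centre)
        self.children = []
        self.error = 0.0  # accumulated quantisation error of this node

def nearest(nodes, x):
    """Return the node whose centre is closest to input x (squared Euclidean)."""
    return min(nodes, key=lambda n: sum((c - xi) ** 2 for c, xi in zip(n.centre, x)))

def train_level(node, data, n_children=2, lr=0.1, epochs=20, temperature=0.5, rng=None):
    """Grow one level below `node`: spawn children, run competitive learning on
    `data`, then stochastically decide for each child whether it keeps growing.
    All parameter values are illustrative, not taken from the SCENT paper."""
    rng = rng or random.Random(0)
    # Spawn children jittered slightly around the parent's centre.
    node.children = [Node([c + rng.uniform(-0.01, 0.01) for c in node.centre])
                     for _ in range(n_children)]
    # Standard winner-take-all competitive learning at this level.
    for _ in range(epochs):
        for x in data:
            w = nearest(node.children, x)
            w.centre = [c + lr * (xi - c) for c, xi in zip(w.centre, x)]
    # Accumulate each child's residual quantisation error, then make a
    # stochastic growth decision: more residual error -> more likely to split.
    grow = []
    for child in node.children:
        child.error = sum(
            sum((c - xi) ** 2 for c, xi in zip(child.centre, x))
            for x in data if nearest(node.children, x) is child)
        p = 1.0 - math.exp(-child.error / temperature)
        grow.append(rng.random() < p)
    return grow
```

On two well-separated clusters, the two children settle near the cluster centres, and each child's growth decision then depends only on its own local error, mirroring (in spirit) the locally calculated, stochastically controlled halting described above.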
References
J. Hertz, A. Krogh, and R.G. Palmer, An Introduction to the Theory of Neural Computation, Addison-Wesley, 1991.
P.H. Sneath and R.R. Sokal, Numerical Taxonomy, the Principles and Practice of Numerical Classification, W.H. Freeman and Company: San Francisco, 1973.
T. Li, Y.Y. Fang, and L.Y. Fang, “A structure-parameter-adaptive (SPA) neural tree for the recognition of large character set,” Pattern Recognition, vol. 28, no. 4, pp. 315–329, 1995.
J. Racz and T. Klotz, “Knowledge representation by dynamic competitive learning techniques,” SPIE Applications of Artificial Neural Networks II, vol. 1469, pp. 778–783, 1991.
K. Butchart, N. Davey, and R.G. Adams, “An investigation into the performance and representations of a stochastic evolutionary neural tree,” in Proceedings of the International Conference on the Applications of Neural Networks and Genetic Algorithms (ICANNGA 97), Springer-Verlag, 1997.
R.G. Adams, N. Davey, and S.G. George, “Analysing hierarchical data using a stochastic evolutionary neural tree,” in Proceedings of the International Symposium on the Engineering of Intelligent Systems(EIS98), 1998, pp. 268–275.
H. Song and S. Lee, “A self-organising neural tree for large-set pattern classification,” IEEE Transactions on Neural Networks, vol. 9, no. 3, pp. 369–380, 1998.
T. Li, Y.Y. Tang, S.C. Suen, L.Y. Fang, and A.J. Jennings, “A structurally adaptive neural tree for recognition of a large character set,” in Proc. 11th IAPR International Joint Conference on Pattern Recognition, 1992, vol. II, pp. 187–190.
K. Butchart, “Hierarchical clustering using a dynamic self organising neural networks,” PhD Thesis, University of Hertfordshire, 1996.
N. Metropolis, A.W. Rosenbluth, M.N. Rosenbluth, A.H. Teller, and E. Teller, “Equation of state calculations by fast computing machines,” Journal of Chemical Physics, vol. 21, pp. 1087–1092, 1953.
T. Martinetz, S. Berkovich, and K. Schulten, “Neural-gas network for vector quantisation and its application to time-series prediction,” IEEE Transactions on Neural Networks, vol. 4, no. 4, 1993.
G. Bartfai, “An ART-based modular architecture for learning hierarchical clusterings,” Neurocomputing, vol. 13, pp. 31–46, 1996.
P.R. Krishnaiah and L.N. Kanal, “Classification, pattern recognition, and reduction of dimensionality,” in Handbook of Statistics, vol. 2, North Holland: Amsterdam, 1989.
K. Butchart, N. Davey, and R. Adams, “A comparative study of three neural networks that use soft competition,” in Proceedings of IWANN'95, 1995, pp. 308–314.
N. Pal, J. Bezdek, and E. Tsao, “Generalised clustering networks and Kohonen's self-organising scheme,” IEEE Transactions on Neural Networks, vol. 4, no. 4, 1993.
E. Yair, K. Zeger, and A. Gersho, “Competitive learning and soft competition for vector quantiser design,” IEEE Transactions on Signal Processing, vol. 40, no. 2, 1992.
K. Butchart, N. Davey, and R. Adams, “A comparative study of two self organising and structurally adaptive dynamic neural tree networks,” in Neural Networks and their Applications, edited by J.G. Taylor, John Wiley, pp. 93–112, 1996.
G.A. Carpenter and S. Grossberg, “A massively parallel architecture for a self-organising neural pattern recognising machine,” Computer Vision, Graphics, and Image Processing, vol. 37, pp. 54–115, 1987.
K. Butchart, N. Davey, and R. Adams, “Hierarchical classification with a stochastic competitive evolutionary neural tree,” in Proceedings of ICNN96, 1996, vol. 2, pp. 1372–1377.
T. Li, L. Fang, and K. Q-Q Li, “Hierarchical classification and vector quantisation with neural trees,” Neurocomputing, vol. 5, pp. 119–139.
T. Gale, “Perception and semantic information in human object recognition: A neuropsychological and connectionist study,” Ph.D. Thesis, University of Hertfordshire, 1997.
P.M. Murphy and D.W. Aha, “Repository of machine learning databases,” Technical Report, Department of Information and Computer Science, University of California, CA, 1992.
B.S. Everitt, Cluster Analysis, Edward Arnold: London, 1993.
J.C. Bezdek and N.R. Pal, “Two soft relatives of learning vector quantization,” Neural Networks, vol. 8, no. 5, pp. 729–743, 1995.
Davey, N., Adams, R. & George, S. The Architecture and Performance of a Stochastic Competitive Evolutionary Neural Tree Network. Applied Intelligence 12, 75–93 (2000). https://doi.org/10.1023/A:1008364004705