DOI: 10.5555/1704555.1704719
Article

Effects of widely separated clusters on lotto-type competitive learning with particle swarm features

Published: 14 June 2009

Abstract

This correspondence describes our attempts at incorporating particle swarm features into competitive learning. We first outline our reinterpretation of the symbols and notation used in particle swarm optimisation (PSO) algorithms. Three modified versions of classical frequency-sensitive competitive learning are presented, and a new contraction/expansion phenomenon is illustrated. We then examine the effect of introducing particle-swarm-like features into our lotto-type competitive learning. Experimental results indicate that, as with PSO algorithms, careful selection of the control parameter values is necessary for the particles to converge successfully. With the new modifications, we show experimentally that the modified algorithm can behave like both PSO algorithms and the original lotto-type competitive learning algorithm.
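To make the idea of "competitive learning with particle swarm features" concrete, the sketch below combines classical frequency-sensitive competitive learning (win-count-weighted winner selection) with a PSO-style velocity update applied to the winning prototype. This is only an illustrative reading of the abstract, not the authors' formulation: the function name fscl_pso_sketch, the coefficients inertia, c1 and c2, and the choice of attractors (each unit's closest position so far as a "personal best", and the current input as the social term) are assumptions made for this example.

```python
import numpy as np

def fscl_pso_sketch(data, n_units=4, epochs=50, lr=0.05,
                    inertia=0.7, c1=1.5, c2=1.5, rng=None):
    """Frequency-sensitive competitive learning with a PSO-style velocity term.

    Illustrative sketch only: the winner is picked by win-count-weighted
    distance (classical FSCL); its prototype then moves with an inertia /
    attraction velocity update loosely modelled on PSO. The attractors used
    here (personal best = closest position reached so far, social term =
    current input) are assumptions, not the paper's formulation.
    """
    rng = np.random.default_rng(rng)
    dim = data.shape[1]
    w = data[rng.choice(len(data), n_units, replace=False)].copy()  # prototypes
    v = np.zeros_like(w)           # per-prototype velocities
    wins = np.ones(n_units)        # win counts (frequency-sensitivity term)
    pbest = w.copy()               # best position reached by each unit
    pbest_err = np.full(n_units, np.inf)

    for _ in range(epochs):
        for x in rng.permutation(data):
            # FSCL winner: distance scaled by how often each unit has won
            d = np.linalg.norm(w - x, axis=1)
            j = int(np.argmin(wins * d))
            wins[j] += 1
            # Track the closest position this unit has reached so far
            if d[j] < pbest_err[j]:
                pbest_err[j], pbest[j] = d[j], w[j].copy()
            # PSO-style velocity: inertia + pull toward personal best + input
            r1, r2 = rng.random(dim), rng.random(dim)
            v[j] = inertia * v[j] + c1 * r1 * (pbest[j] - w[j]) + c2 * r2 * (x - w[j])
            w[j] += lr * v[j]
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two widely separated clusters, echoing the setting studied in the paper
    clusters = np.vstack([rng.normal(0, 0.3, (100, 2)),
                          rng.normal(10, 0.3, (100, 2))])
    print(fscl_pso_sketch(clusters, n_units=2, rng=0))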



Published In

IJCNN'09: Proceedings of the 2009 international joint conference on Neural Networks
June 2009
3570 pages
ISBN: 9781424435494

Sponsors

  • Georgia Institute of Technology
  • IEEE Computational Intelligence Society
  • International Neural Network Society

Publisher

IEEE Press
