
Quaternionic Hopfield neural networks with twin-multistate activation function

Published: 06 December 2017

Abstract

In the present work, a Hopfield neural network with a new quaternionic activation function, referred to as a twin-multistate activation function, is proposed. The multistate activation function has been used in complex-valued Hopfield neural networks (CHNNs); it is useful for representing multilevel information, and CHNNs with a multistate activation function have been applied to storing multilevel data, such as gray-scale images. A twin-multistate activation function consists of two multistate activation functions. Quaternionic Hopfield neural networks (QHNNs) with a twin-multistate activation function can take the place of CHNNs with a multistate activation function, while requiring only half the number of connection parameters. The projection rule is a fast learning algorithm suitable when computational power is restricted; however, it requires full connectivity, and sparse connections are not allowed. The projection rule is also available for QHNNs with a twin-multistate activation function. When memory resources and computational power are restricted, QHNNs with a twin-multistate activation function are therefore useful. In the present work, only the conventional network topology is considered. To take advantage of the non-commutativity of quaternions, several researchers have proposed alternative network topologies for QHNNs, such as dual connections. In future work, QHNNs with a twin-multistate activation function will be extended to such topologies.
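As an illustration of the idea described above, a twin-multistate activation can be sketched by representing each quaternion as a Cayley-Dickson pair of complex numbers, q = z1 + z2·j, and applying the standard K-state phase-quantization (multistate) activation to each complex part independently. The function names and the NumPy formulation below are illustrative assumptions, not the paper's own notation:

```python
import numpy as np

def multistate(z, K):
    """Multistate activation: quantize the phase of complex z to the
    nearest of K equally spaced states on the unit circle (the kind of
    activation used in complex-valued Hopfield networks)."""
    theta = np.angle(z) % (2 * np.pi)          # phase in [0, 2*pi)
    k = np.round(theta / (2 * np.pi / K)) % K  # nearest state index
    return np.exp(1j * 2 * np.pi * k / K)

def twin_multistate(q, K):
    """Twin-multistate activation (illustrative sketch): a quaternion is
    held as its Cayley-Dickson pair (z1, z2) with q = z1 + z2*j, and the
    multistate activation is applied to each complex part separately."""
    z1, z2 = q
    return (multistate(z1, K), multistate(z2, K))
```

Under this representation, one quaternionic neuron carries two multistate values, so N quaternionic neurons can stand in for 2N complex neurons; since the number of connections grows quadratically with the neuron count, this is consistent with the abstract's claim that the QHNN needs half the connection parameters of the corresponding CHNN.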


Cited By

  • (2022) Existence and Finite-Time Stability of Besicovitch Almost Periodic Solutions of Fractional-Order Quaternion-Valued Neural Networks with Time-Varying Delays. Neural Processing Letters 54:3, 2127-2141. DOI: 10.1007/s11063-021-10722-4. Online publication date: 1-Jun-2022.
  • (2020) Ensemble of Binary Classifiers Combined Using Recurrent Correlation Associative Memories. Intelligent Systems, 442-455. DOI: 10.1007/978-3-030-61380-8_30. Online publication date: 20-Oct-2020.
  • (2019) Almost Automorphic Solutions in Distribution Sense of Quaternion-Valued Stochastic Recurrent Neural Networks with Mixed Time-Varying Delays. Neural Processing Letters 51:2, 1353-1377. DOI: 10.1007/s11063-019-10151-4. Online publication date: 9-Nov-2019.
  • (2018) Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks. Computational Intelligence and Neuroscience 2018. DOI: 10.1155/2018/1275290. Online publication date: 1-Nov-2018.

Published In

Neurocomputing  Volume 267, Issue C
December 2017
496 pages

Publisher

Elsevier Science Publishers B. V.

Netherlands


Author Tags

  1. Complex-valued neural network
  2. Hopfield neural network
  3. Multistate neuron
  4. Projection rule
  5. Quaternion

Qualifiers

  • Research-article

