Abstract
Binary neural networks (BNNs) are valuable in many application areas: they are built from linearly separable structural units, which are simple and easy to implement in hardware. For a BNN with a single hidden layer, the problem of determining an upper bound on the number of hidden neurons has not been fully solved. This paper defines a special structure in Boolean space called the most isolated samples (MIS). We prove that if the hidden neurons of a BNN and its output neuron form an AND/OR logic structure, at least 2^(n−1) hidden neurons are needed to express the MIS logical relationship in n-dimensional Boolean space. We then show that the n-bit parity problem is equivalent to the MIS structure. Furthermore, by introducing a new concept, the restraining neuron, and using it in the hidden layer, the number of hidden neurons can be reduced to n. This result demonstrates the important role that restraining neurons can play in some cases. Finally, on the basis of the Hamming sphere and the SP function, both the restraining neuron and the n-bit parity problem are given a clear logical meaning and can be described by a series of logical expressions.
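To illustrate how n threshold neurons can suffice for n-bit parity, the sketch below implements the classic "staircase" construction: hidden neuron i fires when at least i inputs are 1, and alternating output weights (+1, −1, +1, ...) cancel active units in pairs, so the output fires exactly when an odd number of inputs are 1. This is a minimal illustration of the counting idea, not the paper's exact restraining-neuron construction; the alternating negative weights stand in for the inhibitory role the paper assigns to restraining neurons.

```python
from itertools import product

def step(z):
    # Heaviside threshold activation used by binary threshold neurons
    return 1 if z > 0 else 0

def parity_net(x):
    """Compute n-bit parity with n hidden threshold neurons.

    Hidden neuron i (i = 1..n) fires when sum(x) >= i, so exactly
    sum(x) hidden units are active. Alternating output weights
    +1, -1, +1, ... cancel active units in pairs, leaving a positive
    net input to the output neuron only when sum(x) is odd.
    """
    n = len(x)
    s = sum(x)
    hidden = [step(s - i + 0.5) for i in range(1, n + 1)]
    out = sum((-1) ** i * h for i, h in enumerate(hidden))  # weights +1, -1, ...
    return step(out - 0.5)

# Exhaustive check for n = 4: the network realizes XOR of all inputs.
for x in product([0, 1], repeat=4):
    assert parity_net(list(x)) == sum(x) % 2
```

The construction uses n hidden neurons for any n, matching the upper bound the abstract attributes to hidden layers that include inhibitory (restraining) contributions, in contrast to the 2^(n−1) neurons required under pure AND/OR logic.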
Lu, Y., Yang, J., Wang, Q. et al. The upper bound of the minimal number of hidden neurons for the parity problem in binary neural networks. Sci. China Inf. Sci. 55, 1579–1587 (2012). https://doi.org/10.1007/s11432-011-4405-6