Fixed-point feedforward deep neural network design using weights +1, 0, and −1

K. Hwang, W. Sung - 2014 IEEE Workshop on Signal Processing Systems (SiPS), 2014 - ieeexplore.ieee.org
Feedforward deep neural networks that employ multiple hidden layers show high performance in many applications, but they demand complex hardware for implementation. The hardware complexity can be lowered substantially by minimizing the word-length of weights and signals, but direct quantization for fixed-point network design does not yield good results. We optimize the fixed-point design by employing backpropagation-based retraining. The designed fixed-point networks with ternary weights (+1, 0, and -1) and 3-bit signals show only negligible performance loss when compared to the floating-point counterparts. The backpropagation for retraining uses quantized weights and fixed-point signals to compute the output, but utilizes high-precision values for adapting the networks. Character recognition and phoneme recognition examples are presented.
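The retraining scheme described in the abstract (forward pass with quantized ternary weights, gradient update applied to a high-precision weight copy) can be sketched roughly as follows. This is a minimal illustration under assumptions, not the paper's implementation: the threshold `delta`, the single linear layer, and the squared-error loss are hypothetical choices made for the sketch, and the 3-bit signal quantization is omitted.

```python
import numpy as np

def quantize_ternary(w, delta):
    """Map full-precision weights to {-1, 0, +1} using a fixed threshold delta.
    (Hypothetical quantizer; the paper's exact threshold selection is not reproduced.)"""
    q = np.zeros_like(w)
    q[w > delta] = 1.0
    q[w < -delta] = -1.0
    return q

def retrain_step(w_fp, x, target, lr=0.01, delta=0.1):
    """One retraining step for a single linear layer: the forward pass uses the
    ternary-quantized weights, while the gradient update is applied to the
    high-precision copy w_fp (the core idea of retraining-based fixed-point design)."""
    w_q = quantize_ternary(w_fp, delta)   # ternary weights used for the forward computation
    y = x @ w_q                           # output computed with quantized weights
    err = y - target                      # gradient of a simple squared-error loss
    grad = x.T @ err / x.shape[0]         # gradient with respect to the layer weights
    w_fp = w_fp - lr * grad               # update the high-precision weights
    return w_fp, float((err ** 2).mean())

# Toy usage: fit a small linear layer so that its ternary version matches a target.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 8))
target = x @ quantize_ternary(rng.standard_normal((8, 4)), 0.5)
w = 0.1 * rng.standard_normal((8, 4))
for _ in range(200):
    w, loss = retrain_step(w, x, target)
```

After retraining, the deployed network would keep only the ternary weights `quantize_ternary(w, delta)`; the high-precision copy exists solely to accumulate the small gradient updates that direct quantization alone cannot exploit.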