
Accelerated training algorithm for feedforward neural networks based on least squares method

Published in: Neural Processing Letters

Abstract

A least-squares-based training algorithm for feedforward neural networks is presented. By decomposing each neuron of the network into a linear part and a nonlinear part, the learning error can be minimized at each neuron by applying the least squares method to solve for the linear part. In all the problems investigated, the proposed algorithm achieves the required error level in a single training iteration. Compared with the conventional backpropagation algorithm and other fast training algorithms, the proposed algorithm provides a major breakthrough in speeding up the training process.
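The core idea in the abstract — splitting a neuron into a linear part and a nonlinear part, then fitting the linear part by least squares — can be illustrated for a single neuron. The sketch below is an assumption-laden illustration, not the paper's exact procedure: it assumes a tanh activation (the paper's activation may differ) and inverts it on the desired outputs to obtain target pre-activations, which reduces the weight fit to an ordinary linear least-squares problem solved in one step.

```python
import numpy as np

def fit_neuron_least_squares(X, y, eps=1e-6):
    """Fit one neuron's weights in a single step.

    Illustrative sketch of the linear/nonlinear decomposition idea:
    the neuron computes y = tanh(X @ w); inverting tanh on the desired
    outputs y gives target pre-activations d, so the weights satisfy
    the linear system X @ w ~= d, solvable by least squares.
    """
    y_clipped = np.clip(y, -1 + eps, 1 - eps)   # keep arctanh finite
    d = np.arctanh(y_clipped)                   # target pre-activations
    w, *_ = np.linalg.lstsq(X, d, rcond=None)   # linear part, one shot
    return w

# Toy usage: recover the weights of a tanh neuron from noiseless data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([0.5, -1.0, 0.25])
y = np.tanh(X @ w_true)
w_est = fit_neuron_least_squares(X, y)
```

With noiseless data the inversion is exact, so a single least-squares solve recovers the weights — one way to see how a one-iteration fit is possible when the nonlinearity is invertible on the targets.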




Cite this article

Yam, Y.F., Chow, T.W.S. Accelerated training algorithm for feedforward neural networks based on least squares method. Neural Process Lett 2, 20–25 (1995). https://doi.org/10.1007/BF02279934
