Abstract
This paper presents a novel continuous-time recurrent neural network model that performs linear fractional optimization subject to bound constraints on each of the optimization variables. The network is proved to be complete in the sense that the set of minimizers of the objective function under the bound constraints coincides with the set of equilibria of the neural network. It is also shown that the network is primal and globally convergent: its trajectory cannot escape from the feasible region, and it converges to an exact optimal solution from any initial point chosen in the feasible bound region. Simulation results further demonstrate the global convergence and good performance of the proposed neural network on linear fractional programming problems with bound constraints.
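The abstract does not specify the network dynamics, so the following is a minimal sketch under an assumption: it uses the standard projection-type design dx/dt = P(x − ∇f(x)) − x, where P is the projection onto the box [l, u], which is a common way such continuous-time networks are built. All problem data (c, c0, d, d0, l, u) are hypothetical, chosen so the denominator stays positive on the feasible box; the trajectory is integrated by forward Euler.

```python
import numpy as np

# Hypothetical problem data: minimize f(x) = (c.x + c0) / (d.x + d0)
# subject to l <= x <= u, with d.x + d0 > 0 everywhere on the box.
c, c0 = np.array([1.0, -2.0]), 3.0
d, d0 = np.array([1.0, 1.0]), 4.0
l, u = np.array([0.0, 0.0]), np.array([2.0, 2.0])

def grad_f(x):
    """Gradient of the linear fractional objective (quotient rule)."""
    num = c @ x + c0
    den = d @ x + d0
    return c / den - num * d / den**2

def project(x):
    """Projection onto the feasible box [l, u]."""
    return np.clip(x, l, u)

# Forward-Euler integration of the assumed projection dynamics
# dx/dt = P(x - grad f(x)) - x; the state never leaves the box once inside.
x = np.array([1.0, 1.0])  # initial point in the feasible region
for _ in range(20000):
    x = x + 0.01 * (project(x - grad_f(x)) - x)

print(x)                              # approximate minimizer over the box
print((c @ x + c0) / (d @ x + d0))    # objective value at the equilibrium
```

For this data the trajectory settles at the corner (0, 2), where the objective equals −1/6; the equilibrium condition x = P(x − ∇f(x)) is exactly the optimality condition for this box-constrained problem, which mirrors the completeness property claimed in the abstract.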
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Feng, F., Xia, Y., Zhang, Q. (2006). A Recurrent Neural Network for Linear Fractional Programming with Bound Constraints. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_55
Print ISBN: 978-3-540-34439-1
Online ISBN: 978-3-540-34440-7