Abstract
In this paper, the problem of the global dissipativity of high-order Hopfield bidirectional associative memory neural networks with time-varying coefficients and distributed delays is discussed. By using the Lyapunov–Krasovskii functional method, inequality techniques and linear matrix inequalities, a novel set of sufficient conditions for the global dissipativity and global exponential dissipativity of the addressed system is developed. Furthermore, estimates of the positive invariant set, the globally attractive set and the globally exponentially attractive set are derived. Finally, two examples with numerical simulations are provided to illustrate the feasibility of the theoretical findings.
References
Alimi AM, Aouiti C, Chérif F, Dridi F, M’hamdi MS (2018) Dynamics and oscillations of generalized high-order Hopfield neural networks with mixed delays. Neurocomputing 321:274–295
Aouiti C (2018) Oscillation of impulsive neutral delay generalized high-order Hopfield neural networks. Neural Comput Appl 29:477–495
Aouiti C (2016) Neutral impulsive shunting inhibitory cellular neural networks with time-varying coefficients and leakage delays. Cogn Neurodyn 10(6):573–591
Aouiti C, Coirault P, Miaadi F, Moulay E (2017) Finite time boundedness of neutral high-order Hopfield neural networks with time delay in the leakage term and mixed time delays. Neurocomputing 260:378–392
Aouiti C, Dridi F (2018) Piecewise asymptotically almost automorphic solutions for impulsive non-autonomous high-order Hopfield neural networks with mixed delays. Neural Comput Appl 31:5527–5545
Aouiti C, Dridi F \((\mu ,\nu )\)-Pseudo-almost automorphic solutions for high-order Hopfield bidirectional associative memory neural networks. Neural Comput Appl 1–22
Aouiti C, Gharbia IB, Cao J, M’hamdi MS, Alsaedi A (2018) Existence and global exponential stability of pseudo almost periodic solution for neutral delay BAM neural networks with time-varying delay in leakage terms. Chaos, Solitons Fractals 107:111–127
Aouiti C, Miaadi F (2018) Finite-time stabilization of neutral Hopfield neural networks with mixed delays. Neural Process Lett 48(3):1645–1669
Aouiti C, Miaadi F (2018) Pullback attractor for neutral Hopfield neural networks with time delay in the leakage term and mixed time delays. Neural Comput Appl 1–10
Aouiti C, M’hamdi MS, Chérif F (2016) The existence and the stability of weighted pseudo almost periodic solution of high-order Hopfield neural network. In: International conference on artificial neural networks, Springer International Publishing, Berlin, pp 478–485
Aouiti C, M’hamdi MS, Touati A (2017) Pseudo almost automorphic solutions of recurrent neural networks with time-varying coefficients and mixed delays. Neural Process Lett 45(1):121–140
Aouiti C, M’hamdi MS, Chérif F (2017) New results for impulsive recurrent neural networks with time-varying coefficients and mixed delays. Neural Process Lett 46(2):487–506
Aouiti C, M’hamdi MS, Cao J, Alsaedi A (2017) Piecewise pseudo almost periodic solution for impulsive generalised high-order Hopfield neural networks with leakage delays. Neural Process Lett 45(2):615–648
Cao J, Liang J, Lam J (2004) Exponential stability of high-order bidirectional associative memory neural networks with time delays. Physica D 199(3–4):425–436
Coban R (2013) A context layered locally recurrent neural network for dynamic system identification. Eng Appl Artif Intell 26(1):241–250
Coban R, Aksu IO (2018) Neuro-controller design by using the multifeedback layer neural network and the particle swarm optimization. Tehnički vjesnik 25(2):437–444
Coban R, Can B (2009) An expert trajectory design for control of nuclear research reactors. Expert Syst Appl 36(9):11502–11508
Fan Y, Huang X, Wang Z, Li Y (2018) Global dissipativity and quasi-synchronization of asynchronous updating fractional-order memristor-based neural networks via interval matrix method. J Franklin Inst 355(13):5998–6025
Huang T, Li C, Duan S, Starzyk JA (2012) Robust exponential stability of uncertain delayed neural networks with stochastic perturbation and impulse effects. IEEE Trans Neural Netw Learn Syst 23(6):866–875
Johansson KH (2000) The quadruple-tank process: A multivariable laboratory process with an adjustable zero. IEEE Trans Control Syst Technol 8(3):456–465
Kosko B (1987) Adaptive bidirectional associative memories. Appl Opt 26(23):4947–4960
Kosko B (1988) Bidirectional associative memories. IEEE Trans Syst Man Cybernet 18(1):49–60
Lee TH, Park JH, Kwon OM, Lee SM (2013) Stochastic sampled-data control for state estimation of time-varying delayed neural networks. Neural Netw 46:99–108
Li H, Li C, Zhang W, Xu J (2018) Global dissipativity of inertial neural networks with proportional delay via new generalized halanay inequalities. Neural Process Lett 48(3):1543–1561
Li N, Cao J (2018) Global dissipativity analysis of quaternion-valued memristor-based neural networks with proportional delay. Neurocomputing 321:103–113
Liao X, Wang J (2003) Global dissipativity of continuous-time recurrent neural networks with time delay. Phys Rev E 68(1):016118
Maharajan C, Raja R, Cao J, Rajchakit G, Tu Z, Alsaedi A (2018) LMI-based results on exponential stability of BAM-type neural networks with leakage and both time-varying delays: a non-fragile state estimation approach. Appl Math Comput 326:33–55
Maharajan C, Raja R, Cao J, Rajchakit G, Alsaedi A (2018) Impulsive Cohen–Grossberg BAM neural networks with mixed time-delays: an exponential stability analysis issue. Neurocomputing 275:2588–2602
Maharajan C, Raja R, Cao J, Rajchakit G (2018) Novel global robust exponential stability criterion for uncertain inertial-type BAM neural networks with discrete and distributed time-varying delays via Lagrange sense. J Franklin Inst 355:4727–4754
M’hamdi MS, Aouiti C, Touati A, Alimi AM, Snasel V (2016) Weighted pseudo almost-periodic solutions of shunting inhibitory cellular neural networks with mixed delays. Acta Math Sci 36(6):1662–1682
Manivannan R, Mahendrakumar G, Samidurai R, Cao J, Alsaedi A (2017) Exponential stability and extended dissipativity criteria for generalized neural networks with interval time-varying delay signals. J Franklin Inst 354(11):4353–4376
Manivannan R, Samidurai R, Cao J, Alsaedi A, Alsaadi FE (2018) Design of extended dissipativity state estimation for generalized neural networks with mixed time-varying delay signals. Inf Sci 424:175–203
Marcus CM, Westervelt RM (1989) Stability of analog neural networks with delay. Phys Rev A 39(1):347
Pu Z, Rao R (2018) Exponential stability criterion of high-order BAM neural networks with delays and impulse via fixed point approach. Neurocomputing 292:63–71
Qiu J (2010) Dynamics of high-order Hopfield neural networks with time delays. Neurocomputing 73(4–6):820–826
Rajchakit G, Saravanakumar R, Ahn CK, Karimi HR (2017) Improved exponential convergence result for generalized neural networks including interval time-varying delayed signals. Neural Netw 86:10–17
Rajivganthi C, Rihan FA, Lakshmanan S (2019) Dissipativity analysis of complex-valued BAM neural networks with time delay. Neural Comput Appl 31(1):127–137
Samidurai R, Manivannan R, Ahn CK, Karimi HR (2018) New criteria for stability of generalized neural networks including Markov jump parameters and additive time delays. IEEE Trans Syst Man Cybernet Syst 48(4):485–499
Song Q, Zhao Z (2005) Global dissipativity of neural networks with both variable and unbounded delays. Chaos, Solitons Fractals 25(2):393–401
Sowmiya C, Raja R, Cao J, Li X, Rajchakit G (2018) Discrete-time stochastic impulsive BAM neural networks with leakage and mixed time delays: an exponential stability problem. J Franklin Inst 355(10):4404–4435
Tu Z, Cao J, Alsaedi A, Alsaadi F (2017) Global dissipativity of memristor-based neutral type inertial neural networks. Neural Netw 88:125–133
Tu Z, Wang L, Zha Z, Jian J (2013) Global dissipativity of a class of BAM neural networks with time-varying and unbound delays. Commun Nonlinear Sci Numer Simul 18(9):2562–2570
Wang L, Zhang L, Ding X (2015) Global dissipativity of a class of BAM neural networks with both time-varying and continuously distributed delays. Neurocomputing 152:250–260
Willems JC (1972) Dissipative dynamical systems part I: general theory. Arch Ration Mech Anal 45(5):321–351
Willems JC (1972) Dissipative dynamical systems part II: linear systems with quadratic supply rates. Arch Ration Mech Anal 45(5):352–393
Zhang B, Xu S, Li Y, Chu Y (2007) On global exponential stability of high-order neural networks with time-varying delays. Phys Lett A 366(1–2):69–78
Zhang G, Zeng Z, Hu J (2018) New results on global exponential dissipativity analysis of memristive inertial neural networks with distributed time-varying delays. Neural Netw 97:183–191
Appendices
Appendix 1
Suppose that, under Assumption 2, system (1) possesses an equilibrium point \((x^{*},\;y^{*})=(x_{1}^{*},\;x_{2}^{*},\ldots ,x_{n}^{*};\;y_{1}^{*},\;y_{2}^{*},\ldots ,y_{m}^{*})^{T}.\)
Let \(x_{i}(t)=y_{i}(t)-y^{*}_{i}\) and \(y_{j}(t)=x_{j}(t)-x^{*}_{j}\) for \(i=1,2,\ldots ,n,\;j=1,2,\ldots ,m,\) and define
\(f_{i}(x_{i}(t))={\tilde{f}}_{i}(x_{i}(t)+y^{*}_{i})-{\tilde{f}}_{i}(y^{*}_{i})\) and \(g_{j}(y_{j}(t))={\tilde{g}}_{j}(y_{j}(t)+x^{*}_{j})-{\tilde{g}}_{j}(x^{*}_{j}),\)
so that \(f_{i}(0)={\tilde{f}}_{i}(0)=0\) and \(g_{j}(0)={\tilde{g}}_{j}(0)=0\) for \(i=1,2,\ldots ,n,\;j=1,2,\ldots ,m.\)
Hence, we have:
The transformed model is equivalent to the original one when Assumption 3 is satisfied. The system can then be described as:
where
- \(\xi _{ijk}(x_{k}(t))=\big (W_{ijk}{\tilde{f}}_{k}(y_{k}(t))+W_{ikj}{\tilde{f}}_{k}(y_{k}^{*})\big )/(W_{ijk}+W_{ikj}),\) which lies between \({\tilde{f}}_{k}(y_{k}(t))\) and \({\tilde{f}}_{k}(y_{k}^{*})\) whenever \(W_{ijk}\) and \(W_{ikj}\) share the same sign,
- \(\xi ^{1}_{jik}(y_{k}(t))=\big (W^{1}_{jik}{\tilde{g}}_{k}(x_{k}(t))+W^{1}_{jki}{\tilde{g}}_{k}(x_{k}^{*})\big )/(W^{1}_{jik}+W^{1}_{jki}),\) which lies between \({\tilde{g}}_{k}(x_{k}(t))\) and \({\tilde{g}}_{k}(x_{k}^{*})\) under the analogous sign condition,
- \(\zeta _{ijk}(x_{k}(t-\tau _{k}(t)))=\big (T_{ijk}{\tilde{f}}_{k}(y_{k}(t-\tau _{k}(t)))+T_{ikj}{\tilde{f}}_{k}(y_{k}^{*})\big )/(T_{ijk}+T_{ikj}),\) which lies between \({\tilde{f}}_{k}(y_{k}(t-\tau _{k}(t)))\) and \({\tilde{f}}_{k}(y_{k}^{*}),\)
- \(\zeta ^{1}_{jik}(y_{k}(t-\sigma _{k}(t)))=\big (T^{1}_{jik}{\tilde{g}}_{k}(x_{k}(t-\sigma _{k}(t)))+T^{1}_{jki}{\tilde{g}}_{k}(x_{k}^{*})\big )/(T^{1}_{jik}+T^{1}_{jki}),\) which lies between \({\tilde{g}}_{k}(x_{k}(t-\sigma _{k}(t)))\) and \({\tilde{g}}_{k}(x_{k}^{*})\) (the symmetrization identity behind these intermediate values is sketched below).
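These intermediate values arise from a standard symmetrization of the second-order interaction terms (cf. Cao et al. [19]). A sketch for the \(\xi\)-terms, assuming \(W_{ijk}+W_{ikj}\ne 0\) and writing \({\tilde{f}}_{j}={\tilde{f}}_{j}(y_{j}(t)),\;{\tilde{f}}_{j}^{*}={\tilde{f}}_{j}(y_{j}^{*})\):
\[
\sum _{j=1}^{n}\sum _{k=1}^{n}W_{ijk}\big ({\tilde{f}}_{j}{\tilde{f}}_{k}-{\tilde{f}}_{j}^{*}{\tilde{f}}_{k}^{*}\big )
=\sum _{j=1}^{n}\sum _{k=1}^{n}\big [W_{ijk}{\tilde{f}}_{k}+W_{ikj}{\tilde{f}}_{k}^{*}\big ]\big ({\tilde{f}}_{j}-{\tilde{f}}_{j}^{*}\big )
=\sum _{j=1}^{n}\sum _{k=1}^{n}(W_{ijk}+W_{ikj})\,\xi _{ijk}(x_{k}(t))\,f_{j}(x_{j}(t)),
\]
where the first equality follows from the splitting \({\tilde{f}}_{j}{\tilde{f}}_{k}-{\tilde{f}}_{j}^{*}{\tilde{f}}_{k}^{*}={\tilde{f}}_{k}({\tilde{f}}_{j}-{\tilde{f}}_{j}^{*})+{\tilde{f}}_{j}^{*}({\tilde{f}}_{k}-{\tilde{f}}_{k}^{*})\) and a swap of the summation indices \(j\leftrightarrow k\) in the second sum. The \(\zeta\)-, \(\xi ^{1}\)- and \(\zeta ^{1}\)-terms are handled analogously.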
If we denote,
- \(x(\cdot )=[x_{1}(\cdot ),\;x_{2}(\cdot ),\ldots ,x_{n}(\cdot )]^{T},\)
- \(y(\cdot )=[y_{1}(\cdot ),\;y_{2}(\cdot ),\ldots ,y_{m}(\cdot )]^{T},\)
- \(f(x(\cdot ))=[f_{1}(x_{1}(\cdot )),\;f_{2}(x_{2}(\cdot )),\ldots ,f_{n}(x_{n}(\cdot ))]^{T},\)
- \(g(y(\cdot ))=[g_{1}(y_{1}(\cdot )),\;g_{2}(y_{2}(\cdot )),\ldots ,g_{m}(y_{m}(\cdot ))]^{T},\)
- \(f(x(t-\tau (t)))=[f_{1}(x_{1}(t-\tau _{1}(t))),\ldots ,f_{n}(x_{n}(t-\tau _{n}(t)))]^{T},\)
- \(g(y(t-\sigma (t)))=[g_{1}(y_{1}(t-\sigma _{1}(t))),\ldots ,g_{m}(y_{m}(t-\sigma _{m}(t)))]^{T},\)
- \(A=diag\{a_{1},a_{2},\ldots ,a_{n}\},\;A^{1}=diag\{a^{1}_{1},a^{1}_{2},\ldots ,a^{1}_{m}\},\)
- \(B=(b_{ij})_{n\times n},\;B^{1}=(b^{1}_{ji})_{m\times m},\;C=(c_{ij})_{n\times n},\;C^{1}=(c^{1}_{ji})_{m\times m},\)
- \(P=(p_{ij})_{n\times n},\;P^{1}=(p^{1}_{ji})_{m\times m},\;W_{i}=(W_{ijk})_{n\times n},\;W^{1}_{j}=(W^{1}_{jik})_{m\times m},\)
- \(W=(W_{1}+W_{1}^{T},\;W_{2}+W_{2}^{T},\ldots ,W_{n}+W_{n}^{T}),\)
- \(W^{1}=(W^{1}_{1}+(W^{1}_{1})^{T},\;W^{1}_{2}+(W^{1}_{2})^{T},\ldots ,W^{1}_{m}+(W^{1}_{m})^{T}),\)
- \(T_{i}=(T_{ijk})_{n\times n},\;T^{1}_{j}=(T^{1}_{jik})_{m\times m},\)
- \(T=(T_{1}+T_{1}^{T},\;T_{2}+T_{2}^{T},\ldots ,T_{n}+T_{n}^{T}),\)
- \(T^{1}=(T^{1}_{1}+(T^{1}_{1})^{T},\;T^{1}_{2}+(T^{1}_{2})^{T},\ldots ,T^{1}_{m}+(T^{1}_{m})^{T}),\)
- \(\xi =(\xi _{1},\;\xi _{2},\ldots ,\xi _{n})^{T},\;\xi ^{1}=(\xi ^{1}_{1},\;\xi ^{1}_{2},\ldots ,\xi ^{1}_{m})^{T},\)
- \(\varLambda =diag\{\xi _{1},\;\xi _{2},\ldots ,\xi _{n}\},\;\varLambda ^{1}=diag\{\xi ^{1}_{1},\;\xi ^{1}_{2},\ldots ,\xi ^{1}_{m}\},\)
- \(\zeta =(\zeta _{1},\;\zeta _{2},\ldots ,\zeta _{n})^{T},\;\zeta ^{1}=(\zeta ^{1}_{1},\;\zeta ^{1}_{2},\ldots ,\zeta ^{1}_{m})^{T},\)
- \(\varGamma =diag\{\zeta _{1},\;\zeta _{2},\ldots ,\zeta _{n}\},\;\varGamma ^{1}=diag\{\zeta ^{1}_{1},\;\zeta ^{1}_{2},\ldots ,\zeta ^{1}_{m}\},\)
- \(U(\cdot )=\big (u_{1}(\cdot ),\;u_{2}(\cdot ),\ldots ,u_{n}(\cdot )\big )^{T},\;{\check{V}}(\cdot )=\big (v_{1}(\cdot ),\;v_{2}(\cdot ),\ldots ,v_{m}(\cdot )\big )^{T},\)
then the system (9) can be rewritten in the following vector–matrix form:
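As orientation only, the first-order part of such a rewriting has the familiar BAM shape (a sketch; the high-order terms built from \(W,\;T,\;\varLambda ,\;\varGamma\) and their counterparts, as well as the distributed-delay terms built from \(P,\;P^{1}\), enter additively in the same fashion, and this is not the authors' exact display):
\[
\dot{x}(t)=-Ax(t)+Bf(x(t))+Cf(x(t-\tau (t)))+\cdots +U(t),\qquad
\dot{y}(t)=-A^{1}y(t)+B^{1}g(y(t))+C^{1}g(y(t-\sigma (t)))+\cdots +{\check{V}}(t).
\]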
Appendix 2 (Proof of Theorem 1)
Proof
Consider the radially unbounded and positive definite Lyapunov function:
when \((x^{T}(t),\;y^{T}(t))^{T}\in {\mathbb {R}}^{m+n}\setminus \varUpsilon _{1},\) that is, \((x^{T}(t),\;y^{T}(t))^{T}\notin \varUpsilon _{1}.\) This implies that for \((\varphi ^{T},\;\phi ^{T})^{T}\in \varUpsilon _{1}\) and \(t\ge t_{0},\)
\(\big (x^{T}(t,\;t_{0},\;\varphi ),\;y^{T}(t,\;t_{0},\;\phi )\big )^{T}\in \varUpsilon _{1}\) holds, while for \((\varphi ^{T},\;\phi ^{T})^{T}\notin \varUpsilon _{1}\) there exists \(T>0\) such that
\(\big (x^{T}(t,\;t_{0},\;\varphi ),\;y^{T}(t,\;t_{0},\;\phi )\big )^{T}\in \varUpsilon _{1}\) holds for all \(t>t_{0}+T\). From Definition 1, it is concluded that the neural network model (1) is a dissipative system and that \(\varUpsilon _{1}\) is a positive invariant and globally attractive set of (1). \(\square\)
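For the reader's convenience, the property invoked here is the standard notion of dissipativity for delayed neural networks (paraphrased in the spirit of Liao and Wang [31]; the paper's own Definition 1 may differ in detail): there exists a compact set \(\varUpsilon \subset {\mathbb {R}}^{m+n}\) such that for every pair of initial functions \((\varphi ^{T},\;\phi ^{T})^{T}\) there is a finite time \(T(\varphi ,\phi )>0\) with
\[
\big (x^{T}(t,\;t_{0},\;\varphi ),\;y^{T}(t,\;t_{0},\;\phi )\big )^{T}\in \varUpsilon \qquad \text{for all }t\ge t_{0}+T(\varphi ,\phi ),
\]
and \(\varUpsilon\) is positively invariant if every trajectory starting in \(\varUpsilon\) remains in it for all later times.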
Appendix 3 (Proof of Theorem 2)
Proof
Consider the following Lyapunov functional:
Calculating the upper right-hand derivative of \(V(\cdot )\) along the positive half trajectory of system (1), we have
when \((x^{T}(t),\;y^{T}(t))^{T}\in {\mathbb {R}}^{m+n}\setminus {\tilde{\varUpsilon }}_{2}\). Integrating both sides of inequality (10) from 0 to \(t>0\), it follows that:
Thus, it is concluded that \({\tilde{\varUpsilon }}_{2}\) is a globally exponentially attractive set and also the neural network model (1) is a globally exponentially dissipative system. \(\square\)
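For concreteness, the integration step follows the standard pattern below (a sketch, under the assumption that inequality (10) provides \(D^{+}V(t)\le -\varepsilon V(t)\) for some \(\varepsilon >0\) whenever the state lies outside \({\tilde{\varUpsilon }}_{2}\)):
\[
D^{+}V(t)\le -\varepsilon V(t)\quad \Longrightarrow \quad V(t)\le V(0)\,e^{-\varepsilon t},\qquad t>0,
\]
so \(V\), and with it the distance of the state from \({\tilde{\varUpsilon }}_{2}\), decays at the exponential rate \(\varepsilon\), which is exactly the decay demanded of a globally exponentially attractive set.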
Appendix 4 (Proof of Theorem 3)
Proof
We choose the following Lyapunov–Krasovskii functional:
where:
Calculating the time derivative of \(V(\cdot )\) along any trajectory of system (1), it yields that:
Applying Lemma 1 to the above inequality yields:
It can be verified that (for more details see [46]):
Therefore, by Lemma 2, we have:
By the inequality \(2ab\le a^{2}+b^{2}\) for any \(a,\;b\in {\mathbb {R}},\) we obtain:
where
Using Lemma 3 and Eq. (4), we get:
In accordance with Eqs. (4) and (11), we obtain:
When \((x^{T}(t),\;y^{T}(t))^{T}\in {\mathbb {R}}^{n+m}\setminus \varUpsilon _{3}\), inequality (12) shows that the neural network model (1) is a dissipative system and that \(\varUpsilon _{3}\) is a positive invariant and globally attractive set of (1). \(\square\)
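The paper's two numerical examples are not reproduced in this appendix. As a minimal, self-contained illustration of how such dissipativity estimates can be checked numerically, the sketch below simulates a toy two-neuron first-order BAM system with constant delays via a fixed-step Euler scheme; every parameter value, and the script itself, is an assumption for illustration only, not one of the authors' examples.

```python
import numpy as np

# Minimal sketch (illustration only, not one of the paper's examples):
# a scalar-per-layer BAM system with constant delays,
#   x'(t) = -a  x(t) + b  g(y(t - sigma)) + u
#   y'(t) = -a1 y(t) + b1 f(x(t - tau))   + v
# integrated by fixed-step Euler with a history buffer, to observe
# trajectories entering a bounded (globally attractive) set.

a, a1 = 2.0, 2.0           # self-decay rates (assumed)
b, b1 = 0.5, 0.5           # cross-layer weights (assumed)
u, v = 1.0, -1.0           # constant external inputs (assumed)
tau, sigma = 0.5, 0.3      # constant transmission delays (assumed)
f = g = np.tanh            # bounded activations, sup|tanh| = 1

dt, T_end = 0.001, 20.0
n = int(T_end / dt)
d_tau, d_sig = int(tau / dt), int(sigma / dt)

x, y = np.empty(n), np.empty(n)
x[0], y[0] = 5.0, -4.0     # start far from the expected attractive set

for k in range(n - 1):
    x_del = x[max(k - d_tau, 0)]   # x(t - tau), constant prehistory
    y_del = y[max(k - d_sig, 0)]   # y(t - sigma)
    x[k + 1] = x[k] + dt * (-a * x[k] + b * g(y_del) + u)
    y[k + 1] = y[k] + dt * (-a1 * y[k] + b1 * f(x_del) + v)

# Crude dissipativity-style bounds |x| <= (|b| sup|g| + |u|) / a,
# |y| <= (|b1| sup|f| + |v|) / a1:
print("final state:  x =", x[-1], " y =", y[-1])
print("bound on |x|:", (abs(b) * 1.0 + abs(u)) / a)
print("bound on |y|:", (abs(b1) * 1.0 + abs(v)) / a1)
```

For this parameter choice, the terminal state settles well inside the predicted ball, mirroring the positive invariant and globally attractive set estimates of Theorems 1–3.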