
Structured diagonal Gauss–Newton method for nonlinear least squares

Published in Computational and Applied Mathematics

Abstract

This work proposes a structured diagonal Gauss–Newton algorithm for solving zero-residual nonlinear least-squares problems. The matrix defining the Gauss–Newton direction is approximated by a diagonal matrix that satisfies the structured secant condition. Using a derivative-free Armijo-type line search under appropriate conditions, we prove that the proposed algorithm converges globally. Furthermore, the algorithm achieves an R-linear convergence rate for zero-residual problems. Numerical results show that the algorithm is competitive with existing algorithms in the literature.
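To give a concrete picture of the kind of iteration the abstract describes, the following is a minimal sketch, not the authors' published algorithm: it replaces the Gauss–Newton matrix J(x)ᵀJ(x) with a safeguarded positive diagonal matrix updated componentwise from secant-type information, and accepts steps with a derivative-free Armijo-type backtracking test on the objective value. The function name `diag_gn_sketch`, the specific acceptance test, and all safeguards and parameter values are assumptions made for illustration only.

```python
import numpy as np


def diag_gn_sketch(r, J, x0, max_iter=500, tol=1e-6,
                   gamma=1e-4, rho=0.5, d_min=1e-8, d_max=1e8):
    """Illustrative diagonal Gauss-Newton-type iteration for
    minimizing f(x) = 0.5 * ||r(x)||^2.

    The Gauss-Newton matrix J^T J is replaced by a positive diagonal
    matrix D_k, refreshed componentwise from secant-type information;
    steps are accepted with a derivative-free Armijo-type test.
    All names, safeguards, and parameter values are assumptions.
    """
    x = np.asarray(x0, dtype=float)
    d = np.ones_like(x)                      # diagonal of D_0 = I
    fx = 0.5 * np.dot(r(x), r(x))
    for _ in range(max_iter):
        g = J(x).T @ r(x)                    # gradient of f
        if np.linalg.norm(g) <= tol:
            break
        p = -g / d                           # diagonal Gauss-Newton-type direction
        # derivative-free Armijo-type backtracking:
        #   accept alpha if f(x + alpha*p) <= f(x) - gamma * alpha^2 * ||p||^2
        alpha = 1.0
        while True:
            x_new = x + alpha * p
            f_new = 0.5 * np.dot(r(x_new), r(x_new))
            if f_new <= fx - gamma * alpha ** 2 * np.dot(p, p) or alpha < 1e-12:
                break
            alpha *= rho
        s = x_new - x
        y = J(x_new).T @ r(x_new) - g        # change in gradient information
        # componentwise secant-type refresh of the diagonal, safeguarded
        # so the diagonal stays positive and bounded
        mask = np.abs(s) > 1e-12
        d_new = d.copy()
        d_new[mask] = y[mask] / s[mask]
        d = np.clip(np.abs(d_new), d_min, d_max)
        x, fx = x_new, f_new
    return x


if __name__ == "__main__":
    # zero-residual test problem: Rosenbrock written as a residual vector;
    # the minimizer of this problem is x* = (1, 1)
    r = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
    J = lambda x: np.array([[-20.0 * x[0], 10.0],
                            [-1.0, 0.0]])
    print(diag_gn_sketch(r, J, np.array([-1.2, 1.0])))
```

The Rosenbrock-type test case is included only to show how such a routine would be called on a zero-residual problem; the paper itself should be consulted for the actual diagonal update, line search, and convergence analysis.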



Acknowledgements

The authors are grateful to the referees for their constructive suggestions, which improved the quality of an earlier version of this manuscript.

Author information

Corresponding author

Correspondence to Hassan Mohammad.

Additional information

Communicated by Orizon Pereira Ferreira.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Danmalam, K.U., Mohammad, H. & Waziri, M.Y. Structured diagonal Gauss–Newton method for nonlinear least squares. Comp. Appl. Math. 41, 68 (2022). https://doi.org/10.1007/s40314-022-01774-w

