
An improved Polak–Ribière–Polyak conjugate gradient method with an efficient restart direction

Computational and Applied Mathematics

Abstract

For large-scale optimization problems, we propose a new conjugate parameter by modifying the denominator of the Polak–Ribière–Polyak formula, and we also give its non-negative form. Under the weak Wolfe line search, the corresponding algorithms outperform their respective congener methods. To guarantee global convergence, we further introduce a restart condition and a restart direction to improve the proposed method. Under the usual assumptions, and with the step length generated by the strong Wolfe line search, the improved method satisfies the sufficient descent condition and is globally convergent. Numerical experiments on the improved method and its comparison methods are carried out, and the corresponding numerical results and performance profiles are reported; they show that the improved method is practicable and efficient for large-scale optimization problems.
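For context, the classical Polak–Ribière–Polyak (PRP) conjugate parameter and its standard non-negative truncation (PRP+) are

$$
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top}(g_k - g_{k-1})}{\lVert g_{k-1}\rVert^{2}}, \qquad
\beta_k^{\mathrm{PRP+}} = \max\{\beta_k^{\mathrm{PRP}},\, 0\},
$$

with search directions $d_0 = -g_0$ and $d_k = -g_k + \beta_k d_{k-1}$, where $g_k$ is the gradient at the $k$-th iterate. The method proposed in the paper modifies the denominator $\lVert g_{k-1}\rVert^{2}$ and replaces $d_k$ with a restart direction when a restart condition fires; those specifics appear in the full text, not in the abstract.

The sketch below illustrates only the overall algorithmic pattern: a PRP+-type conjugate gradient iteration driven by a strong Wolfe line search, with a steepest-descent restart safeguard. It is a minimal sketch, assuming the classical PRP+ parameter and a generic sufficient-descent test in place of the paper's modified denominator and restart direction; the function name prp_plus_restart and the constant c are illustrative choices, not the authors'.

```python
# Minimal sketch of a PRP+-type conjugate gradient method with a restart
# safeguard under a strong Wolfe line search. NOT the paper's exact method:
# the classical PRP+ parameter and a generic sufficient-descent restart test
# stand in for the modified denominator and restart direction.
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def prp_plus_restart(f, grad, x0, tol=1e-6, max_iter=10_000, c=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # d_0: steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:  # first-order stopping test
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                     # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP parameter truncated at zero (PRP+); the paper
        # modifies the denominator ||g_{k-1}||^2 instead.
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        # Restart safeguard: revert to -g_new whenever d fails the
        # sufficient descent test g^T d <= -c * ||g||^2.
        if g_new @ d > -c * (g_new @ g_new):
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the 10-dimensional Rosenbrock function.
from scipy.optimize import rosen, rosen_der
x_star = prp_plus_restart(rosen, rosen_der, np.zeros(10))
```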




References

  • Andrei N (2008) An unconstrained optimization test functions collection. Adv Model Optim 10(1):147–161

  • Andrei N (2009) Hybrid conjugate gradient algorithm for unconstrained optimization. J Optim Theory Appl 141:249–264

  • Cheng WY, Liu QF (2010) Sufficient descent nonlinear conjugate gradient methods with conjugacy condition. Numer Algorithms 53(1):113–131

  • Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320

  • Dai ZF, Wen FH (2012) Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property. Appl Math Comput 218(14):7421–7430

  • Dai YH, Yuan YX (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182

  • Dai YH, Yuan YX (2000) Nonlinear conjugate gradient methods. Shanghai Scientific and Technical Publishers, Shanghai

  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91:201–213

  • Dong XL, Han DR, Dai ZF, Li LX, Zhu JG (2018) An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition. J Optim Theory Appl 179:944–961

  • Fatemi M (2016a) A new efficient conjugate gradient method for unconstrained optimization. J Comput Appl Math 300:207–216

  • Fatemi M (2016b) An optimal parameter for Dai-Liao family of conjugate gradient methods. J Optim Theory Appl 169(2):587–605

  • Fatemi M (2017) A scaled conjugate gradient method for nonlinear unconstrained optimization. Optim Methods Softw 32(5):1095–1112

  • Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7(2):149–154

  • Gilbert JC, Nocedal J (1992) Global convergence properties of conjugate gradient methods for optimization. SIAM J Optim 2(1):21–42

  • Gould NIM, Orban D, Toint PL (2003) CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394

  • Hager WW, Zhang HC (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192

  • Hager WW, Zhang HC (2013) The limited memory conjugate gradient method. SIAM J Optim 23(4):2150–2168

  • Hestenes MR, Stiefel E (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49(6):409–436

  • Jian JB, Han L, Jiang XZ (2015) A hybrid conjugate gradient method with descent property for unconstrained optimization. Appl Math Model 39(3–4):1281–1290

  • Jiang XZ, Jian JB (2019) Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search. J Comput Appl Math 348:525–534

  • Khoshgam Z, Ashrafi A (2019) A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with non-convex objective function. Comput Appl Math 38(186):1–14

  • Kou CX, Dai YH (2015) A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization. J Optim Theory Appl 165(1):209–224

  • Liu JK, Feng YM, Zou LM (2019) A spectral conjugate gradient method for solving large-scale unconstrained optimization. Comput Math Appl 77(3):731–739

  • Moré JJ, Garbow BS, Hillstrom KE (1981) Testing unconstrained optimization software. ACM Trans Math Softw 7:17–41

  • Polak E, Ribière G (1969) Note sur la convergence de méthodes de directions conjuguées. Rev Fr Inf Rech Oper 3(1):35–43

  • Powell MJD (1984) Nonconvex minimization calculations and the conjugate gradient method. In: Lecture Notes in Mathematics, vol 1066. Springer, Berlin

  • Babaie-Kafaki S, Ghanbari R (2014) Two modified three-term conjugate gradient methods with sufficient descent property. Optim Lett 8(8):2285–2297

  • Zhu ZB, Zhang DD, Wang S (2020) Two modified DY conjugate gradient methods for unconstrained optimization problems. Appl Math Comput 373(15):125004

  • Zoutendijk G (1970) Nonlinear programming, computational methods. In: Abadie J (ed) Integer and nonlinear programming. North-Holland, Amsterdam, pp 37–86


Acknowledgements

The authors wish to thank the two anonymous referees and the editor for their constructive and pertinent suggestions for improving the presentation of the work.

Author information

Correspondence to Jinbao Jian.

Additional information

Communicated by Paulo J. S. Silva.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work was supported by NSFC (Grant no. 11771383), the Natural Science Foundation of Guangxi Province (Grant no. 2020GXNSFDA238017), the Research Project of Guangxi University for Nationalities (Grant no. 2018KJQD02) and Innovation Project of Guangxi Graduate Education (gxun-chxzs2019034).


About this article

Cite this article

Jiang, X., Jian, J., Song, D. et al. An improved Polak–Ribière–Polyak conjugate gradient method with an efficient restart direction. Comp. Appl. Math. 40, 174 (2021). https://doi.org/10.1007/s40314-021-01557-9

