Abstract
In this paper, we propose a hybrid conjugate gradient method for unconstrained optimization, obtained by a convex combination of the LS and KMD conjugate gradient parameters. An attractive property of the proposed method is that its search direction satisfies the Dai–Liao conjugacy condition and approximates the quasi-Newton direction; moreover, this property holds independently of the line search. Under a modified strong Wolfe line search, we establish the global convergence of the method. Numerical comparisons on a set of 109 unconstrained optimization test problems from the CUTEst library show that the proposed method outperforms the Liu–Storey and Hager–Zhang conjugate gradient methods.
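To make the hybridization idea concrete, the sketch below shows a generic nonlinear conjugate gradient iteration whose parameter is a convex combination of two classical formulas. This is an illustrative sketch only, not the paper's method: the Liu–Storey (LS) formula is standard, but since the KMD formula is not given in the abstract, the Fletcher–Reeves parameter is used here purely as a stand-in, and a simple backtracking Armijo line search substitutes for the modified strong Wolfe search used in the paper. The mixing weight `theta` and all function names are hypothetical.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Generic hybrid nonlinear CG sketch.

    beta is a convex combination of two classical CG parameters:
    (1 - theta) * beta_LS + theta * beta_FR, where FR is only an
    illustrative stand-in for the KMD parameter (not in the abstract).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (simple substitute for the
        # modified strong Wolfe search of the paper).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference
        # Liu-Storey parameter: g_{k+1}^T y_k / (-d_k^T g_k)
        beta_ls = (g_new @ y) / max(-(d @ g), 1e-16)
        # Fletcher-Reeves parameter (illustrative stand-in for KMD).
        beta_fr = (g_new @ g_new) / max(g @ g, 1e-16)
        # Convex combination of the two parameters.
        beta = (1.0 - theta) * beta_ls + theta * beta_fr
        d = -g_new + beta * d
        if d @ g_new >= 0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

A quick use: minimizing the convex quadratic f(x) = ½(x₁² + 10x₂²) − x₁ − x₂, whose minimizer is (1, 0.1), the iteration converges to gradient-norm tolerance from the origin.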
| Original language | English |
|---|---|
| Pages (from-to) | 1370-1383 |
| Number of pages | 14 |
| Journal | Optimization Methods and Software |
| Volume | 37 |
| Issue number | 4 |
| DOIs | |
| State | Published - 2022 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2022 Informa UK Limited, trading as Taylor & Francis Group.
Keywords
- CUTEst
- Dai–Liao conjugacy
- Quasi-Newton direction
- Unconstrained optimization
- conjugate gradient method
- hybrid conjugate gradient method
ASJC Scopus subject areas
- Software
- Control and Optimization
- Applied Mathematics