A New Conjugate Gradient for Efficient Unconstrained Optimization with Robust Descent Guarantees
DOI: https://doi.org/10.31185/wjcms.358

Keywords: Unconstrained optimization, Descent and sufficient descent condition, Global convergence

Abstract
The Conjugate Gradient method is a powerful iterative algorithm that seeks the minimum of a function by searching along conjugate directions. This work presents a nonlinear conjugate gradient method for unconstrained optimization, derived from the solution of a novel optimization problem. The theoretical framework of the proposed method is discussed, including its satisfaction of the descent condition. To evaluate its performance, numerical experiments were conducted comparing the proposed method against the established () and (Liu and Storey) methods. The results demonstrate that the new method is not only more efficient but also significantly outperforms the () and (Liu and Storey) methods in optimization effectiveness. These findings suggest that the proposed approach offers a competitive and promising alternative for solving unconstrained optimization problems.
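The paper's specific conjugate gradient coefficient is not given on this page, so as a minimal sketch of the general scheme the abstract describes, the following uses the classical Fletcher–Reeves coefficient [2] with a backtracking Armijo line search; the function names and parameters are illustrative assumptions, not the authors' method.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by nonlinear conjugate gradient.

    Illustrative sketch: uses the Fletcher-Reeves beta, NOT the
    new coefficient proposed in the paper (which is not stated here).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: ||g_{k+1}||^2 / ||g_k||^2
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d  # new conjugate direction
        x, g = x_new, g_new
    return x

# Example: minimize a simple convex quadratic f(x) = ||x||^2.
x_star = nonlinear_cg(lambda x: x @ x, lambda x: 2 * x, [3.0, -4.0])
```

Competing methods in this family differ only in the choice of the scalar beta; swapping in another formula (e.g. the Liu–Storey coefficient [11]) changes one line.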
References
[1] Hestenes, M. R., & Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems (Vol. 49, No. 1). Washington, DC: NBS. DOI: https://doi.org/10.6028/jres.049.044
[2] Fletcher, R., & Reeves, C. M. (1964). Function minimization by conjugate gradients. The computer journal, 7(2), 149-154. DOI: https://doi.org/10.1093/comjnl/7.2.149
[3] Polak, E., & Ribiere, G. (1969). Note sur la convergence de méthodes de directions conjuguées. Revue française d'informatique et de recherche opérationnelle. Série rouge, 3(16), 35-43. DOI: https://doi.org/10.1051/m2an/196903R100351
[4] Dai, Y. H., & Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10(1), 177-182. DOI: https://doi.org/10.1137/S1052623497318992
[5] Jahwar, B. H., Ajeel, S. M., & Shareef, S. G. (2024). Two new classes of conjugate gradient method based on logistic mapping. TELKOMNIKA (Telecommunication Computing Electronics and Control), 22(1), 86-94. DOI: https://doi.org/10.12928/telkomnika.v22i1.25264
[6] Khatab, H. A., & Shareef, S. G. (2024). Two new limited-memory preconditioned conjugate gradient algorithms for nonlinear optimization problems. Journal of Intelligent & Fuzzy Systems, (Preprint), 1-14. DOI: https://doi.org/10.3233/JIFS-233081
[7] Shareef, S. G. (2022). A new conjugate gradient with global converges for nonlinear problems. Journal of Duhok University, 25(2), 573-578. DOI: https://doi.org/10.26682/sjuod.2022.25.2.51
[8] Dwail, H. H., Mahdi, M. M., & Shiker, M. A. (2022). CG method with modifying βk for solving unconstrained optimization problems. Journal of Interdisciplinary Mathematics, 25(5), 1347-1355. DOI: https://doi.org/10.1080/09720502.2022.2040854
[9] Ibrahim, A. L., Sadiq, M. A., & Shareef, S. G. (2019). A New Conjugate Gradient Coefficient for Unconstrained Optimization Based On Dai-Liao. Science Journal of University of Zakho, 7(1), 34-36. DOI: https://doi.org/10.25271/sjuoz.2019.7.1.525
[10] Nocedal, J., & Wright, S. J. (Eds.). (1999). Numerical optimization. New York, NY: Springer New York. DOI: https://doi.org/10.1007/b98874
[11] Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: theory. Journal of optimization theory and applications, 69, 129-137. DOI: https://doi.org/10.1007/BF00940464
[12] Powell, M. J. (1978). Algorithms for nonlinear constraints that use Lagrangian functions. Mathematical Programming, 14, 224-248. DOI: https://doi.org/10.1007/BF01588967
License
Copyright (c) 2025 Hussein Saleem Ahmed

This work is licensed under a Creative Commons Attribution 4.0 International License.