A modified three-term conjugate gradient method for large-scale optimization
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Tikrit University, 2020-03-01 |
| Series: | Tikrit Journal of Pure Science |
| Subjects: | |
| Online Access: | https://tjpsj.org/index.php/tjps/article/view/258 |
| Summary: | In this paper we propose a three-term conjugate gradient method. The basic idea is to exploit the good properties of the BFGS update; since quasi-Newton methods are numerically efficient, we base the new method on the BFGS update. The descent condition and global convergence are proven under the Wolfe conditions. The new algorithm is effective for solving large-scale unconstrained optimization problems. |
|---|---|
| ISSN: | 1813-1662 2415-1726 |
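The summary describes a three-term conjugate gradient direction combined with a Wolfe-condition line search, but the record does not give the paper's specific update formula. As a hedged illustration only, the sketch below implements a generic three-term CG loop using the well-known Zhang–Zhou–Li PRP-type three-term direction (a stand-in, not necessarily the authors' formula) together with a simple bisection-style weak Wolfe line search; the function names and test problem are our own.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Bisection-style search for a step length satisfying the weak Wolfe conditions."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx, dg = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * dg:     # Armijo (sufficient decrease) fails: shrink
            hi = t
        elif grad(x + t * d) @ d < c2 * dg:     # curvature condition fails: grow
            lo = t
            if hi == np.inf:
                t = 2.0 * lo
                continue
        else:
            return t
        t = 0.5 * (lo + hi)
    return t

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG loop (ZZL PRP-type direction, used here as an illustration)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg       # PRP-type scalar
        theta = (g_new @ d) / gg
        # three-term direction: satisfies d^T g_new = -||g_new||^2 exactly,
        # i.e. the sufficient descent condition holds independent of the line search
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# demo on a small convex quadratic f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))   # expect approximately [1.0, 0.1]
```

The third term `-theta * y` is what distinguishes this family from classical two-term CG: it cancels the non-descent components of `beta * d`, so the descent condition holds by construction rather than relying on the line search, which is the structural property the abstract emphasizes.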