Abstract
This research proposes and investigates improvements to gradient descent iterations that can be applied to solving systems of nonlinear equations (SNE). In the available literature, such methods are termed improved gradient descent methods. We build on the verified advantages of various accelerated double-direction and double-step-size gradient methods for solving single scalar equations. Our strategy is to control the convergence speed of gradient methods through a step size value defined using additional parameters. As a result, efficient minimization schemes for solving SNE are introduced. Linear global convergence of the proposed iterative method is confirmed by theoretical analysis under standard assumptions. Numerical experiments confirm the significant computational efficiency of the proposed methods compared to traditional gradient descent methods for solving SNE.
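The paper's full method is not reproduced here, but the baseline it improves upon can be sketched: a SNE F(x) = 0 is commonly recast as the minimization of the merit function f(x) = ½‖F(x)‖², whose gradient is Jᵀ(x)F(x) with J the Jacobian, and then solved by gradient descent. The fixed step size, test system, and function names below are illustrative assumptions, not the authors' accelerated scheme.

```python
import numpy as np

def gradient_descent_sne(F, J, x0, step=1e-2, tol=1e-8, max_iter=10000):
    """Solve F(x) = 0 by minimizing f(x) = 0.5*||F(x)||^2 with plain
    gradient descent. The gradient of the merit function is J(x)^T F(x).
    A fixed step size is used here; the paper's improved methods instead
    adapt the step size through additional parameters."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = J(x).T @ F(x)               # gradient of the merit function
        if np.linalg.norm(g) < tol:     # stationary point reached
            break
        x = x - step * g                # steepest-descent update
    return x

# Illustrative 2x2 system: x0^2 + x1^2 = 1 and x0 = x1,
# with a root at (1/sqrt(2), 1/sqrt(2)).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]],
                        [1.0, -1.0]])
root = gradient_descent_sne(F, J, np.array([0.8, 0.6]), step=0.1)
```

Starting near the root, the iteration converges linearly, which matches the convergence class the paper analyzes; the proposed improvements target exactly the step-size choice hard-coded above.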
| Original language | English |
|---|---|
| Article number | 64 |
| Journal | Algorithms |
| Volume | 16 |
| Issue number | 2 |
| DOIs | |
| State | Published - Feb 2023 |
Bibliographical note
Publisher Copyright: © 2023 by the authors.
Keywords
- Jacobian
- gradient descent methods
- nonlinear equations
- nonlinear programming
ASJC Scopus subject areas
- Theoretical Computer Science
- Numerical Analysis
- Computational Theory and Mathematics
- Computational Mathematics