Abstract
In this research, we propose a novel fractional gradient descent-based learning algorithm (FGD) for radial basis function neural networks (RBF-NN). The proposed FGD is a convex combination of the conventional gradient descent method and a modified Riemann–Liouville derivative-based fractional gradient descent method. The proposed FGD method is analyzed for an optimal solution in a system identification problem, and a closed-form Wiener solution of the least-squares problem is obtained. Using the FGD, the weight update rule for the proposed fractional RBF-NN (FRBF-NN) is derived. The proposed FRBF-NN method is shown to outperform the conventional RBF-NN on four major estimation problems, namely nonlinear system identification, pattern classification, time series prediction, and function approximation.
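The abstract describes the FGD update as a convex combination of a conventional gradient step and a modified Riemann–Liouville fractional gradient step. As a rough illustrative sketch only (the paper's exact update rule is not reproduced here), the following Python snippet applies that idea to a Gaussian-kernel RBF network; the names `eta` (learning rate), `alpha` (fractional order), and `lam` (mixing weight) are assumed, and the factor $D^{\alpha} w = w^{1-\alpha}/\Gamma(2-\alpha)$ is the standard modified Riemann–Liouville power-function rule for a linear term.

```python
import numpy as np
from scipy.special import gamma

def rbf_features(x, centers, sigma):
    """Gaussian RBF hidden-layer activations for input x (assumed kernel)."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fgd_update(w, x, d, centers, sigma, eta=0.05, alpha=0.9, lam=0.5):
    """One sketched FGD step on the squared error e^2 = (d - w.phi)^2.

    The step mixes the conventional gradient with a fractional term
    built from |w|^(1-alpha) / Gamma(2-alpha); `lam` is a hypothetical
    convex mixing weight, not a parameter taken from the paper.
    """
    phi = rbf_features(x, centers, sigma)
    e = d - w @ phi                      # instantaneous output error
    grad = -e * phi                      # conventional gradient of e^2 / 2
    # fractional chain-rule factor; |w| guards against negative bases
    frac = np.abs(w) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - eta * ((1.0 - lam) * grad + lam * grad * frac)

# toy usage: identify d = sin(3x) from streaming samples
rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(10, 1))
w = rng.normal(scale=0.1, size=10)
for _ in range(200):
    x = rng.uniform(-1, 1, size=1)
    d = np.sin(3 * x[0])                 # plant to identify
    w = fgd_update(w, x, d, centers, sigma=0.3)
```

Setting `lam = 0` in this sketch recovers plain stochastic gradient descent, while `lam = 1` gives a purely fractional step, mirroring the convex-combination structure the abstract attributes to FGD.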
| Original language | English |
|---|---|
| Pages (from-to) | 5311-5332 |
| Number of pages | 22 |
| Journal | Circuits, Systems, and Signal Processing |
| Volume | 37 |
| Issue number | 12 |
| DOIs | |
| State | Published - 1 Dec 2018 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- Artificial neural networks
- Fractional-order calculus
- Function approximation
- Kernel function
- Nonlinear system identification
- Radial basis function
- Time series prediction
- Wiener solution
- Modified Riemann–Liouville derivative
ASJC Scopus subject areas
- Signal Processing
- Applied Mathematics