A Fractional Gradient Descent-Based RBF Neural Network

  • Shujaat Khan
  • Imran Naseem*
  • Muhammad Ammar Malik
  • Roberto Togneri
  • Mohammed Bennamoun

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

67 Scopus citations

Abstract

In this research, we propose a novel fractional gradient descent-based learning algorithm (FGD) for radial basis function neural networks (RBF-NN). The proposed FGD is a convex combination of the conventional gradient descent and the modified Riemann–Liouville derivative-based fractional gradient descent methods. The proposed FGD method is analyzed for an optimal solution in a system identification problem, and a closed-form Wiener solution of a least-squares problem is obtained. Using the FGD, the weight update rule for the proposed fractional RBF-NN (FRBF-NN) is derived. The proposed FRBF-NN is shown to outperform the conventional RBF-NN on four major estimation problems, namely nonlinear system identification, pattern classification, time series prediction and function approximation.
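The abstract does not reproduce the derived update rule, but the idea of combining a conventional gradient step with a fractional one can be sketched. The snippet below is a minimal, illustrative fractional-LMS-style update for a Gaussian RBF network, assuming the commonly used form in which the modified Riemann–Liouville derivative contributes a factor |w|^(1-α)/Γ(2-α) to each weight's step; the centers, step sizes `mu1`/`mu2`, fractional order `alpha`, the use of |w| to keep the fractional power real, and the sine-approximation task are all assumptions for illustration, not the paper's actual benchmarks or constants.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

def rbf_features(x, centers, sigma=1.0):
    """Gaussian RBF activations for a scalar input x."""
    return np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))

# Hypothetical setup: learn y = sin(x) on [-3, 3] (illustrative, not the paper's test)
centers = np.linspace(-3, 3, 10)
w = rng.normal(scale=0.1, size=10)
alpha, mu1, mu2 = 0.5, 0.05, 0.05  # fractional order and step sizes (assumed values)

for _ in range(2000):
    x = rng.uniform(-3, 3)
    phi = rbf_features(x, centers)
    e = np.sin(x) - w @ phi  # instantaneous output error
    # Fractional factor from the modified Riemann-Liouville derivative of w;
    # |w| is used so the fractional power stays real (an implementation choice).
    frac = np.abs(w) ** (1 - alpha) / gamma(2 - alpha)
    # Combined update: conventional gradient step plus fractional gradient step
    w += mu1 * e * phi + mu2 * e * phi * frac

# Mean squared error of the trained network on a test grid
err = np.mean([(np.sin(x) - w @ rbf_features(x, centers)) ** 2
               for x in np.linspace(-3, 3, 50)])
print(err)
```

With both terms active, the effective step size grows with |w|^(1-α), which is the intuition behind the faster convergence such fractional updates are reported to give over a plain gradient step.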

Original language: English
Pages (from-to): 5311-5332
Number of pages: 22
Journal: Circuits, Systems, and Signal Processing
Volume: 37
Issue number: 12
DOIs
State: Published - 1 Dec 2018
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2018, Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Artificial neural networks
  • Fractional-order calculus
  • Function approximation
  • Kernel function
  • Nonlinear system identification
  • Radial basis function
  • Time series prediction
  • Wiener solution
  • Modified Riemann–Liouville derivative

ASJC Scopus subject areas

  • Signal Processing
  • Applied Mathematics
