A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

Shujaat Khan, Jawwad Ahmad, Imran Naseem*, Muhammad Moinuddin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

45 Scopus citations

Abstract

In this research, we propose a novel algorithm for the learning of recurrent neural networks, called fractional back-propagation through time (FBPTT). Exploiting the potential of fractional calculus, we use a fractional-calculus-based gradient descent method to derive the FBPTT algorithm. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
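The abstract only names the method; for intuition, below is a minimal sketch of how a fractional-order gradient term is commonly folded into a gradient-descent weight update. The function name, the two learning rates, and the Caputo-type power-function approximation are illustrative assumptions, not the paper's exact FBPTT derivation.

```python
# Sketch of a fractional gradient-descent update on a single weight,
# the core idea behind fractional learning rules such as FBPTT.
# Assumed, not taken from the paper: names nu, lr, lr_frac and the
# Caputo-derivative approximation d^nu/dw^nu [w] = w**(1-nu) / Gamma(2-nu).
from math import gamma

def fractional_update(w, grad, nu=0.9, lr=0.01, lr_frac=0.01):
    """One combined integer- and fractional-order gradient step.

    grad is dE/dw from ordinary back-propagation; the fractional term
    rescales it by w**(1-nu) / Gamma(2-nu). abs() keeps the fractional
    power real for negative weights, a common practical convention in
    fractional LMS-style algorithms.
    """
    frac_grad = grad * abs(w) ** (1.0 - nu) / gamma(2.0 - nu)
    return w - lr * grad - lr_frac * frac_grad

# Example: one update of a weight whose loss gradient is 0.5
w = fractional_update(w=0.3, grad=0.5)
print(w)
```

Setting nu = 1 makes the fractional term collapse to the ordinary gradient (Gamma(1) = 1), so the rule reduces to plain gradient descent with a combined step size; nu thus acts as an extra tuning knob on the update dynamics.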

Original language: English
Pages (from-to): 593-612
Number of pages: 20
Journal: Circuits, Systems, and Signal Processing
Volume: 37
Issue number: 2
DOIs
State: Published - 1 Feb 2018
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2017, Springer Science+Business Media New York.

Keywords

  • Back-propagation through time (BPTT)
  • Fractional calculus
  • Gradient descent
  • Mackey–Glass chaotic time series
  • Minimum redundancy and maximum relevance (mRMR)
  • Recurrent neural network (RNN)

ASJC Scopus subject areas

  • Signal Processing
  • Applied Mathematics

