Abstract
In this research, we propose a novel algorithm, termed fractional back-propagation through time (FBPTT), for training recurrent neural networks. Exploiting the potential of fractional calculus, we use a fractional-calculus-based gradient descent method to derive the FBPTT algorithm. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
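To give a flavor of the idea behind the abstract, the following is a minimal sketch of fractional-order gradient descent on a toy quadratic loss. It is not the authors' FBPTT algorithm; it only illustrates a commonly used approximation in which the fractional derivative of order α scales the ordinary gradient by a factor |w − c|^(1−α) / Γ(2−α), where c is an assumed lower terminal and α is the (assumed) fractional order. All names and parameter values here are illustrative.

```python
from math import gamma

def fractional_gradient_descent(grad, w0, alpha=0.9, lr=0.05, c=0.0, steps=200):
    """Minimize a scalar loss via a fractional-order update.

    Approximates the order-alpha fractional gradient of the loss at w as
        D^alpha J(w) ~ J'(w) * |w - c|**(1 - alpha) / Gamma(2 - alpha),
    which reduces to the ordinary gradient when alpha -> 1.
    """
    w = w0
    scale = 1.0 / gamma(2.0 - alpha)
    for _ in range(steps):
        frac_grad = grad(w) * abs(w - c) ** (1.0 - alpha) * scale
        w -= lr * frac_grad  # standard descent step on the fractional gradient
    return w

# Toy loss J(w) = (w - 2)^2 with gradient J'(w) = 2 (w - 2); minimum at w = 2.
w_star = fractional_gradient_descent(grad=lambda w: 2.0 * (w - 2.0), w0=5.0)
```

With α = 1 the scale factor reduces to 1/Γ(1) = 1 and the update is ordinary gradient descent; smaller α reshapes the step size depending on the distance from the terminal c, which is the lever the fractional approach exploits.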
| Original language | English |
|---|---|
| Pages (from-to) | 593-612 |
| Number of pages | 20 |
| Journal | Circuits, Systems, and Signal Processing |
| Volume | 37 |
| Issue number | 2 |
| DOIs | |
| State | Published - 1 Feb 2018 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2017, Springer Science+Business Media New York.
Keywords
- Back-propagation through time (BPTT)
- Fractional calculus
- Gradient descent
- Mackey–Glass chaotic time series
- Minimum redundancy and maximum relevance (mRMR)
- Recurrent neural network (RNN)
ASJC Scopus subject areas
- Signal Processing
- Applied Mathematics