A new scheme for training feed-forward neural networks

Osama Abdel-Wahhab*, M. A. Sid-Ahmed

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

In this paper we present a new algorithm for training feed-forward neural networks that is orders of magnitude faster than the delta rule. It substantially improves on the method of Scalero and Tepedelenlioglu (IEEE Trans. Signal Process. 40(1) (1992)) in both training time and numerical stability: it combines their modified back-propagation algorithm with a faster training scheme while achieving better numerical stability. The algorithm is tested against other methods, and results are presented.
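For context, the delta (Widrow-Hoff) rule that the abstract uses as its speed baseline is the classic per-sample gradient update for a linear unit. The sketch below is a minimal, assumed implementation of that standard baseline, not of the paper's algorithm (whose details are not given here); the learning rate, epoch count, and toy data are illustrative choices.

```python
import numpy as np

def delta_rule_train(X, t, eta=0.1, epochs=500):
    """Train a single linear unit with the delta (Widrow-Hoff) rule.

    Per-sample update: w <- w + eta * (t - y) * x, where y = w . x.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = w @ x                      # linear output of the unit
            w += eta * (target - y) * x    # correct weights toward the target
    return w

# Illustrative data: learn y = 2*x1 - x2, with the bias folded in
# as a constant third input (its true weight is 0 here).
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0],
              [2.0, 1.0, 1.0]])
t = np.array([2.0, -1.0, 1.0, 3.0])
w = delta_rule_train(X, t)
```

Because each update touches only one sample at a time, convergence along poorly excited weight directions is slow, which is the kind of behavior faster schemes such as the one described in this paper aim to avoid.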

Original language: English
Pages (from-to): 519-524
Number of pages: 6
Journal: Pattern Recognition
Volume: 30
Issue number: 3
DOIs
State: Published - Mar 1997

Keywords

  • Arabic fonts
  • Back propagation
  • Delta rule
  • Feed-forward neural network
  • Kalman filter
  • Moment invariants
  • Training

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
