AdaBoost-based artificial neural network learning

Mirza M. Baig, Mian M. Awais*, El Sayed M. El-Alfy

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

81 Scopus citations

Abstract

A boosting-based method is presented for learning a feed-forward artificial neural network (ANN) with a single hidden layer and a single output neuron. First, an algorithm called Boostron is described that learns a single-layer perceptron using AdaBoost and decision stumps. Boostron is then extended to learn the weights of a neural network with a single hidden layer of linear neurons. Finally, a novel method is introduced to incorporate non-linear activation functions into the learning procedure. The proposed method approximates the non-linearity of the activation functions by a series representation, learns the coefficients of the non-linear terms using AdaBoost, and adapts the network parameters through a layer-wise iterative traversal of neurons with an appropriate reduction of the learning problem. A detailed performance comparison is provided between neural network models learned with the proposed methods and those learned with least mean squares (LMS) and resilient back-propagation (RPROP). Several favorable results are reported on 17 synthetic and real-world datasets of varying difficulty, covering both binary and multi-class problems.
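The abstract does not give Boostron's exact formulation, only that it learns a perceptron via AdaBoost with decision stumps. A minimal sketch of that underlying idea, in which the weighted vote over single-feature stumps plays the role of a perceptron's weighted sum, might look like the following (all names and the toy dataset are illustrative, not from the paper):

```python
import math

def stump_predict(x, feature, threshold, polarity):
    # Single-feature decision stump: vote +polarity above the threshold,
    # -polarity below it (polarity is +1 or -1).
    return polarity if x[feature] >= threshold else -polarity

def train_adaboost(X, y, n_rounds=3):
    # Standard discrete AdaBoost over exhaustively searched stumps;
    # a simplified stand-in for the stump-based learning the abstract describes.
    n = len(X)
    w = [1.0 / n] * n  # uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, float("inf")
        # search every (feature, threshold, polarity) stump
        for f in range(len(X[0])):
            for t in sorted(set(x[f] for x in X)):
                for pol in (1, -1):
                    err = sum(w[i] for i in range(n)
                              if stump_predict(X[i], f, t, pol) != y[i])
                    if err < best_err:
                        best_err, best = err, (f, t, pol)
        err = min(max(best_err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)     # stump's vote weight
        f, t, pol = best
        # reweight examples: misclassified points gain weight
        for i in range(n):
            w[i] *= math.exp(-alpha * y[i] * stump_predict(X[i], f, t, pol))
        z = sum(w)
        w = [wi / z for wi in w]
        ensemble.append((alpha, f, t, pol))
    return ensemble

def predict(ensemble, x):
    # The weighted vote is analogous to a perceptron's weighted sum.
    score = sum(a * stump_predict(x, f, t, p) for a, f, t, p in ensemble)
    return 1 if score >= 0 else -1

# toy 1-D data, separable by a single threshold
X = [(0.0,), (1.0,), (2.0,), (3.0,)]
y = [-1, -1, 1, 1]
model = train_adaboost(X, y, n_rounds=3)
print([predict(model, x) for x in X])  # → [-1, -1, 1, 1]
```

The paper's further steps, extending this to hidden-layer weights and to series-approximated non-linear activations, are not reproduced here.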

Original language: English
Pages (from-to): 120-126
Number of pages: 7
Journal: Neurocomputing
Volume: 248
State: Published - 26 Jul 2017

Bibliographical note

Publisher Copyright:
© 2017

Keywords

  • AdaBoost
  • Artificial neural network
  • Boostron
  • Ensemble learning
  • Perceptron

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
