Multi-category bioinformatics dataset classification using extreme learning machine

Tarek Helmy*, Zeehasham Rasheed

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-review

30 Scopus citations

Abstract

This paper presents a recently introduced learning algorithm, the Extreme Learning Machine (ELM), for Single-hidden Layer Feed-forward Neural networks (SLFNs), which randomly chooses the hidden nodes and analytically determines the output weights of the SLFN. ELM avoids problems such as local minima, improper learning rates, and overfitting that are commonly faced by iterative learning methods, and it completes training very fast. We have evaluated the multi-category classification performance of ELM on five bioinformatics-related data sets: the Breast Cancer Wisconsin data set, the Pima Diabetes data set, the Heart-Statlog data set, the Hepatitis data set, and the Hypothyroid data set. A detailed analysis of different activation functions with varying numbers of neurons is also carried out, which concludes that the Algebraic Sigmoid function outperforms all other activation functions on these data sets. The evaluation results indicate that ELM produces better classification accuracy with reduced training time and implementation complexity compared to previously implemented models.
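For context, a minimal sketch of the ELM training procedure summarized above (random hidden-layer parameters, output weights computed analytically via the Moore-Penrose pseudoinverse) might look like the following. The function names, the NumPy implementation, and the particular form of the algebraic sigmoid are illustrative assumptions, not the authors' code.

```python
import numpy as np

def algebraic_sigmoid(x):
    # Assumed form of the algebraic sigmoid: x / sqrt(1 + x^2);
    # the paper does not define it in this abstract.
    return x / np.sqrt(1.0 + x ** 2)

def elm_train(X, Y, n_hidden=50, activation=algebraic_sigmoid, seed=0):
    """Train a single-hidden-layer feed-forward network with ELM.

    Hidden-layer weights and biases are drawn at random and never updated;
    only the output weights are computed, via the pseudoinverse of the
    hidden-layer output matrix H.
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = activation(X @ W + b)                                # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y                             # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta, activation=algebraic_sigmoid):
    return activation(X @ W + b) @ beta
```

For multi-category classification of the kind evaluated in the paper, `Y` would be a one-hot label matrix and the predicted class the argmax of the network output per sample.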

Original language: English
Title of host publication: 2009 IEEE Congress on Evolutionary Computation, CEC 2009
Pages: 3234-3240
Number of pages: 7
DOIs
State: Published - 2009

Publication series

Name: 2009 IEEE Congress on Evolutionary Computation, CEC 2009

Keywords

  • Bayesian network
  • Bioinformatics
  • Classification
  • Decision tree
  • Extreme learning machine
  • SVM

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Theoretical Computer Science
