First-Order Sparse TSK Nonstationary Fuzzy Neural Network Based on the Mean Shift Algorithm and the Group Lasso Regularization

Bingjie Zhang, Jian Wang*, Xiaoling Gong, Zhanglei Shi, Chao Zhang*, Kai Zhang, El Sayed M. El-Alfy, Sergey V. Ablameyko

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Nonstationary fuzzy inference systems (NFIS) are able to handle uncertainties while avoiding the difficulty of the type-reduction operation. Combining NFIS with neural networks, this paper proposes a first-order sparse TSK nonstationary fuzzy neural network (SNFNN-1) to improve the interpretability/translatability of neural networks and the self-learning ability of fuzzy rules/sets. The overall architecture of SNFNN-1 can be viewed as an integrated model of multiple sub-networks, each with a variation in center, width, or noise; it is therefore able to model both “intra-expert” and “inter-expert” variability. Two techniques are adopted in this network: Mean Shift-based fuzzy partition and Group Lasso-based rule selection, which adaptively generate a suitable number of clusters and select important fuzzy rules, respectively. Quantitative experiments on six UCI datasets demonstrate the effectiveness and robustness of the proposed model.
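To make the two techniques named in the abstract concrete, the following is a minimal, hedged sketch (not the authors' implementation): it assumes scikit-learn's MeanShift for the fuzzy partition, Gaussian membership functions for the antecedents, and a per-rule group penalty on first-order TSK consequent parameters. The helper names (firing_strengths, group_lasso_penalty) and the fixed membership widths are hypothetical choices for illustration only.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

# Toy data standing in for a UCI-style feature matrix (n_samples x n_features).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(50, 2)) for m in (-2.0, 0.0, 2.0)])

# 1) Mean Shift-based fuzzy partition: each discovered cluster center becomes the
#    center of a Gaussian membership function, i.e. one fuzzy rule per cluster.
bandwidth = estimate_bandwidth(X, quantile=0.2)
ms = MeanShift(bandwidth=bandwidth).fit(X)
centers = ms.cluster_centers_                      # (n_rules, n_features)
n_rules, n_features = centers.shape
widths = np.full_like(centers, 0.5)                # placeholder widths (sigmas)

def firing_strengths(x):
    """Normalized product of Gaussian memberships per rule (standard TSK antecedent)."""
    mu = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))   # (n_rules, n_features)
    w = mu.prod(axis=1)
    return w / (w.sum() + 1e-12)

# 2) Group Lasso-based rule selection: first-order TSK consequents are
#    y_k(x) = p_k^T [1, x]; all parameters of rule k form one group, so the
#    penalty sum_k ||p_k||_2 can drive entire rules to zero.
P = rng.normal(size=(n_rules, n_features + 1))     # consequent parameters per rule

def group_lasso_penalty(P, lam=0.1):
    return lam * np.sum(np.linalg.norm(P, axis=1))

x = X[0]
y_hat = firing_strengths(x) @ (P @ np.concatenate(([1.0], x)))
print(f"{n_rules} rules found, output {y_hat:.3f}, penalty {group_lasso_penalty(P):.3f}")
```

In this sketch, Mean Shift fixes the number of rules from the data rather than requiring it in advance, and the group structure of the penalty is what removes whole rules instead of individual coefficients.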

Original language: English
Article number: 120
Journal: Mathematics
Volume: 12
Issue number: 1
State: Published - Jan 2024

Bibliographical note

Publisher Copyright:
© 2023 by the authors.

Keywords

  • group lasso
  • mean shift
  • nonstationary neuro-fuzzy network
  • rule reduction

ASJC Scopus subject areas

  • Computer Science (miscellaneous)
  • General Mathematics
  • Engineering (miscellaneous)
