
PANFIS: a novel incremental learning machine

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

PANFIS: a novel incremental learning machine. / Pratama, Mahardhika; Anavatti, Sreenatha G.; Angelov, Plamen et al.
In: IEEE Transactions on Neural Networks and Learning Systems, Vol. 25, No. 1, 01.2014, p. 55-68.


Harvard

Pratama, M, Anavatti, SG, Angelov, P & Lughofer, E 2014, 'PANFIS: a novel incremental learning machine', IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 1, pp. 55-68. https://doi.org/10.1109/TNNLS.2013.2271933

APA

Pratama, M., Anavatti, S. G., Angelov, P., & Lughofer, E. (2014). PANFIS: a novel incremental learning machine. IEEE Transactions on Neural Networks and Learning Systems, 25(1), 55-68. https://doi.org/10.1109/TNNLS.2013.2271933

Vancouver

Pratama M, Anavatti SG, Angelov P, Lughofer E. PANFIS: a novel incremental learning machine. IEEE Transactions on Neural Networks and Learning Systems. 2014 Jan;25(1):55-68. doi: 10.1109/TNNLS.2013.2271933

Author

Pratama, Mahardhika ; Anavatti, Sreenatha G. ; Angelov, Plamen et al. / PANFIS : a novel incremental learning machine. In: IEEE Transactions on Neural Networks and Learning Systems. 2014 ; Vol. 25, No. 1. pp. 55-68.

Bibtex

@article{5750842e9605405890d6079338408a32,
title = "PANFIS: a novel incremental learning machine",
abstract = "Most real-world system dynamics involve shifts and drifts, which conventional neuro-fuzzy systems struggle to cope with. Learning in nonstationary environments therefore calls for a highly flexible system that can assemble its rule base autonomously according to the degree of nonlinearity in the system. In practice, rule growing and pruning should rely on only a small snapshot of the complete training data, so that computational load and memory demand remain low. To this end, a novel algorithm, the parsimonious network based on fuzzy inference system (PANFIS), is presented. PANFIS can commence its learning process from scratch with an empty rule base; fuzzy rules are then added and removed according to their statistical contributions as new data arrive. Highly similar fuzzy sets may be merged into a single fuzzy set, yielding a more transparent rule base and enhancing human interpretability. The learning and modeling performance of PANFIS is numerically validated on several benchmark problems drawn from real-world and synthetic datasets. Comparisons with state-of-the-art evolving neuro-fuzzy methods show that the new method matches, and in some cases outperforms, these approaches in terms of predictive fidelity and model complexity.",
keywords = "Evolving neuro-fuzzy systems (ENFSs), incremental learning, sample-wise training",
author = "Mahardhika Pratama and Sreenatha G. Anavatti and Plamen Angelov and Edwin Lughofer",
year = "2014",
month = jan,
doi = "10.1109/TNNLS.2013.2271933",
language = "English",
volume = "25",
pages = "55--68",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "1",

}
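The abstract describes a sample-wise evolving rule base: start empty, grow a rule when no existing rule covers an incoming sample well, prune rules whose statistical contribution stays negligible, and merge near-identical fuzzy sets. The toy sketch below illustrates that loop in plain Python; the class name `EvolvingRuleBase` and the thresholds `grow_thresh`, `prune_share`, and `merge_dist` are invented for illustration and are not PANFIS's actual growing, pruning, or merging criteria.

```python
import math

class EvolvingRuleBase:
    """Toy sample-wise evolving rule base (illustrative, not PANFIS itself):
    grow a rule when no rule fires strongly, prune rules with a negligible
    share of firings, and merge rules whose centers nearly coincide."""

    def __init__(self, grow_thresh=0.3, prune_share=0.05,
                 merge_dist=0.1, width=0.5):
        self.centers = []       # one Gaussian center per fuzzy rule
        self.fire_counts = []   # how often each rule won a sample
        self.seen = 0
        self.grow_thresh = grow_thresh
        self.prune_share = prune_share
        self.merge_dist = merge_dist
        self.width = width

    def _membership(self, center, x):
        # Gaussian membership of sample x in a rule's fuzzy set
        return math.exp(-((x - center) / self.width) ** 2)

    def update(self, x):
        """Process one sample: grow, reinforce, prune, merge."""
        self.seen += 1
        if not self.centers:
            self.centers.append(x)
            self.fire_counts.append(1)
            return
        best = max(range(len(self.centers)),
                   key=lambda i: self._membership(self.centers[i], x))
        if self._membership(self.centers[best], x) < self.grow_thresh:
            self.centers.append(x)        # grow: sample poorly covered
            self.fire_counts.append(1)
        else:
            self.fire_counts[best] += 1   # reinforce the winning rule
        self._prune()
        self._merge()

    def _prune(self):
        # drop rules whose share of firings is negligible (needs enough data)
        if self.seen < 20:
            return
        keep = [i for i, c in enumerate(self.fire_counts)
                if c / self.seen >= self.prune_share]
        self.centers = [self.centers[i] for i in keep]
        self.fire_counts = [self.fire_counts[i] for i in keep]

    def _merge(self):
        # merge rules whose centers are nearly identical (weighted average)
        i = 0
        while i < len(self.centers):
            j = i + 1
            while j < len(self.centers):
                if abs(self.centers[i] - self.centers[j]) < self.merge_dist:
                    total = self.fire_counts[i] + self.fire_counts[j]
                    self.centers[i] = (self.centers[i] * self.fire_counts[i] +
                                       self.centers[j] * self.fire_counts[j]) / total
                    self.fire_counts[i] = total
                    del self.centers[j], self.fire_counts[j]
                else:
                    j += 1
            i += 1
```

Streaming samples clustered around two operating points (e.g. near 0 and near 5) leaves exactly two rules: the first sample of each cluster triggers growth, later samples only reinforce, and neither pruning nor merging fires for two well-separated, well-used rules.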

RIS

TY - JOUR

T1 - PANFIS

T2 - a novel incremental learning machine

AU - Pratama, Mahardhika

AU - Anavatti, Sreenatha G.

AU - Angelov, Plamen

AU - Lughofer, Edwin

PY - 2014/1

Y1 - 2014/1

N2 - Most real-world system dynamics involve shifts and drifts, which conventional neuro-fuzzy systems struggle to cope with. Learning in nonstationary environments therefore calls for a highly flexible system that can assemble its rule base autonomously according to the degree of nonlinearity in the system. In practice, rule growing and pruning should rely on only a small snapshot of the complete training data, so that computational load and memory demand remain low. To this end, a novel algorithm, the parsimonious network based on fuzzy inference system (PANFIS), is presented. PANFIS can commence its learning process from scratch with an empty rule base; fuzzy rules are then added and removed according to their statistical contributions as new data arrive. Highly similar fuzzy sets may be merged into a single fuzzy set, yielding a more transparent rule base and enhancing human interpretability. The learning and modeling performance of PANFIS is numerically validated on several benchmark problems drawn from real-world and synthetic datasets. Comparisons with state-of-the-art evolving neuro-fuzzy methods show that the new method matches, and in some cases outperforms, these approaches in terms of predictive fidelity and model complexity.

AB - Most real-world system dynamics involve shifts and drifts, which conventional neuro-fuzzy systems struggle to cope with. Learning in nonstationary environments therefore calls for a highly flexible system that can assemble its rule base autonomously according to the degree of nonlinearity in the system. In practice, rule growing and pruning should rely on only a small snapshot of the complete training data, so that computational load and memory demand remain low. To this end, a novel algorithm, the parsimonious network based on fuzzy inference system (PANFIS), is presented. PANFIS can commence its learning process from scratch with an empty rule base; fuzzy rules are then added and removed according to their statistical contributions as new data arrive. Highly similar fuzzy sets may be merged into a single fuzzy set, yielding a more transparent rule base and enhancing human interpretability. The learning and modeling performance of PANFIS is numerically validated on several benchmark problems drawn from real-world and synthetic datasets. Comparisons with state-of-the-art evolving neuro-fuzzy methods show that the new method matches, and in some cases outperforms, these approaches in terms of predictive fidelity and model complexity.

KW - Evolving neuro-fuzzy systems (ENFSs)

KW - incremental learning

KW - sample-wise training

U2 - 10.1109/TNNLS.2013.2271933

DO - 10.1109/TNNLS.2013.2271933

M3 - Journal article

VL - 25

SP - 55

EP - 68

JO - IEEE Transactions on Neural Networks and Learning Systems

JF - IEEE Transactions on Neural Networks and Learning Systems

SN - 2162-237X

IS - 1

ER -