Electronic data

  • Tovar&Westermann-Cognition2022-preprint

    Rights statement: This is the author’s version of a work that was accepted for publication in Cognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Cognition, 230, 2023 DOI: 10.1016/j.cognition.2022.105176

    Accepted author manuscript, 715 KB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI: 10.1016/j.cognition.2022.105176

No Need to Forget, Just Keep the Balance: Hebbian Neural Networks for Statistical Learning

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

No Need to Forget, Just Keep the Balance: Hebbian Neural Networks for Statistical Learning. / Tovar, Angel E.; Westermann, Gert.
In: Cognition, Vol. 230, 105176, 31.01.2023.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Vancouver

Tovar AE, Westermann G. No Need to Forget, Just Keep the Balance: Hebbian Neural Networks for Statistical Learning. Cognition. 2023 Jan 31;230:105176. Epub 2022 Nov 25. doi: 10.1016/j.cognition.2022.105176

Bibtex

@article{77a05de810574c2596143a3fccfca00a,
title = "No Need to Forget, Just Keep the Balance: Hebbian Neural Networks for Statistical Learning",
abstract = "Language processing in humans has long been proposed to rely on sophisticated learning abilities including statistical learning. Endress and Johnson (E&J, 2021) recently presented a neural network model for statistical learning based on Hebbian learning principles. This model accounts for word segmentation tasks, one primary paradigm in statistical learning. In this discussion paper we review this model and compare it with the Hebbian model previously presented by Tovar and Westermann (T&W, 2017a; 2017b; 2018) that has accounted for serial reaction time tasks, cross-situational learning, and categorization paradigms, all relevant in the study of statistical learning. We discuss the similarities and differences between both models, and their key findings. From our analysis, we question the concept of “forgetting” in the model of E&J and their suggestion of considering forgetting as the critical ingredient for successful statistical learning. We instead suggest that a set of simple but well-balanced mechanisms including spreading activation, activation persistence, and synaptic weight decay, all based on biologically grounded principles, allow modeling statistical learning in Hebbian neural networks, as demonstrated in the T&W model which successfully covers learning of nonadjacent dependencies and accounts for differences between typical and atypical populations, both aspects that have not been fully demonstrated in the E&J model. We outline the main computational and theoretical differences between the E&J and T&W approaches, present new simulation results, and discuss implications for the development of a computational cognitive theory of statistical learning.",
keywords = "Statistical learning, Hebbian learning, Artificial neural networks, Language processing, Computational modeling",
author = "Tovar, {Angel E.} and Gert Westermann",
note = "This is the author{\textquoteright}s version of a work that was accepted for publication in Cognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Cognition, 230, 2023 DOI: 10.1016/j.cognition.2022.105176",
year = "2023",
month = jan,
day = "31",
doi = "10.1016/j.cognition.2022.105176",
language = "English",
volume = "230",
journal = "Cognition",
issn = "0010-0277",
publisher = "Elsevier",

}

RIS

TY - JOUR

T1 - No Need to Forget, Just Keep the Balance

T2 - Hebbian Neural Networks for Statistical Learning

AU - Tovar, Angel E.

AU - Westermann, Gert

N1 - This is the author’s version of a work that was accepted for publication in Cognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Cognition, 230, 2023 DOI: 10.1016/j.cognition.2022.105176

PY - 2023/1/31

Y1 - 2023/1/31

N2 - Language processing in humans has long been proposed to rely on sophisticated learning abilities including statistical learning. Endress and Johnson (E&J, 2021) recently presented a neural network model for statistical learning based on Hebbian learning principles. This model accounts for word segmentation tasks, one primary paradigm in statistical learning. In this discussion paper we review this model and compare it with the Hebbian model previously presented by Tovar and Westermann (T&W, 2017a; 2017b; 2018) that has accounted for serial reaction time tasks, cross-situational learning, and categorization paradigms, all relevant in the study of statistical learning. We discuss the similarities and differences between both models, and their key findings. From our analysis, we question the concept of “forgetting” in the model of E&J and their suggestion of considering forgetting as the critical ingredient for successful statistical learning. We instead suggest that a set of simple but well-balanced mechanisms including spreading activation, activation persistence, and synaptic weight decay, all based on biologically grounded principles, allow modeling statistical learning in Hebbian neural networks, as demonstrated in the T&W model which successfully covers learning of nonadjacent dependencies and accounts for differences between typical and atypical populations, both aspects that have not been fully demonstrated in the E&J model. We outline the main computational and theoretical differences between the E&J and T&W approaches, present new simulation results, and discuss implications for the development of a computational cognitive theory of statistical learning.

AB - Language processing in humans has long been proposed to rely on sophisticated learning abilities including statistical learning. Endress and Johnson (E&J, 2021) recently presented a neural network model for statistical learning based on Hebbian learning principles. This model accounts for word segmentation tasks, one primary paradigm in statistical learning. In this discussion paper we review this model and compare it with the Hebbian model previously presented by Tovar and Westermann (T&W, 2017a; 2017b; 2018) that has accounted for serial reaction time tasks, cross-situational learning, and categorization paradigms, all relevant in the study of statistical learning. We discuss the similarities and differences between both models, and their key findings. From our analysis, we question the concept of “forgetting” in the model of E&J and their suggestion of considering forgetting as the critical ingredient for successful statistical learning. We instead suggest that a set of simple but well-balanced mechanisms including spreading activation, activation persistence, and synaptic weight decay, all based on biologically grounded principles, allow modeling statistical learning in Hebbian neural networks, as demonstrated in the T&W model which successfully covers learning of nonadjacent dependencies and accounts for differences between typical and atypical populations, both aspects that have not been fully demonstrated in the E&J model. We outline the main computational and theoretical differences between the E&J and T&W approaches, present new simulation results, and discuss implications for the development of a computational cognitive theory of statistical learning.

KW - Statistical learning

KW - Hebbian learning

KW - Artificial neural networks

KW - Language processing

KW - Computational modeling

U2 - 10.1016/j.cognition.2022.105176

DO - 10.1016/j.cognition.2022.105176

M3 - Journal article

VL - 230

JO - Cognition

JF - Cognition

SN - 0010-0277

M1 - 105176

ER -
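
Illustrative code sketch

The abstract above contrasts "forgetting" with a balance of Hebbian learning, spreading activation, activation persistence, and synaptic weight decay. The following is a minimal Python sketch of how those mechanisms can interact in a small associative network. It is a toy illustration only, not the published Tovar & Westermann model: the class name, the parameter names, and all numeric values (learning_rate, decay, persistence) are assumptions made for this example.

import numpy as np


class HebbianNetwork:
    """Toy associative network with the mechanisms named in the abstract."""

    def __init__(self, n_units, learning_rate=0.1, decay=0.01, persistence=0.5):
        self.w = np.zeros((n_units, n_units))  # symmetric association weights, no self-connections
        self.act = np.zeros(n_units)           # current unit activations
        self.learning_rate = learning_rate     # Hebbian learning rate (assumed value)
        self.decay = decay                     # synaptic weight decay per step (assumed value)
        self.persistence = persistence         # fraction of activation carried into the next step (assumed value)

    def step(self, stimulus_units):
        """Present one stimulus (a list of unit indices) and update the weights."""
        # Activation persistence: earlier activation partly survives, so
        # temporally adjacent stimuli are co-active and can become associated.
        self.act *= self.persistence
        self.act[list(stimulus_units)] = 1.0

        # Spreading activation: active units pass activation on through learned weights.
        self.act = np.clip(self.act + self.w @ self.act, 0.0, 1.0)

        # Hebbian update: units that are active together strengthen their connection.
        self.w += self.learning_rate * np.outer(self.act, self.act)
        np.fill_diagonal(self.w, 0.0)

        # Synaptic weight decay: every weight shrinks slightly on every step,
        # balancing Hebbian growth rather than erasing specific memories.
        self.w *= 1.0 - self.decay
        return self.act


# Toy usage: units 0 and 1 always co-occur, unit 2 appears alone in between.
net = HebbianNetwork(n_units=3)
for _ in range(20):
    net.step([0, 1])
    net.step([2])
print(np.round(net.w, 3))  # the 0-1 weight ends up largest

Running the toy shows the intended balance: the 0-1 connection, reinforced on every co-occurrence, ends up stronger than the connections involving unit 2, while the global weight decay keeps all weights bounded instead of wiping them out.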