TTL: transformer-based two-phase transfer learning for cross-lingual news event detection

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

TTL: transformer-based two-phase transfer learning for cross-lingual news event detection. / Hettiarachchi, Hansi; Adedoyin-Olowe, Mariam; Bhogal, Jagdev et al.
In: International Journal of Machine Learning and Cybernetics, Vol. 14, 08.03.2023, pp. 2739–2760.


Harvard

Hettiarachchi, H, Adedoyin-Olowe, M, Bhogal, J & Gaber, MM 2023, 'TTL: transformer-based two-phase transfer learning for cross-lingual news event detection', International Journal of Machine Learning and Cybernetics, vol. 14, pp. 2739–2760. https://doi.org/10.1007/s13042-023-01795-9

APA

Hettiarachchi, H., Adedoyin-Olowe, M., Bhogal, J., & Gaber, M. M. (2023). TTL: transformer-based two-phase transfer learning for cross-lingual news event detection. International Journal of Machine Learning and Cybernetics, 14, 2739–2760. https://doi.org/10.1007/s13042-023-01795-9

Vancouver

Hettiarachchi H, Adedoyin-Olowe M, Bhogal J, Gaber MM. TTL: transformer-based two-phase transfer learning for cross-lingual news event detection. International Journal of Machine Learning and Cybernetics. 2023 Mar 8;14:2739–2760. doi: 10.1007/s13042-023-01795-9

Author

Hettiarachchi, Hansi ; Adedoyin-Olowe, Mariam ; Bhogal, Jagdev et al. / TTL: transformer-based two-phase transfer learning for cross-lingual news event detection. In: International Journal of Machine Learning and Cybernetics. 2023 ; Vol. 14. pp. 2739–2760.

Bibtex

@article{49acdb47deb247cfad885ca514161798,
title = "TTL: transformer-based two-phase transfer learning for cross-lingual news event detection",
abstract = "Today, we have access to a vast data amount, especially on the internet. Online news agencies play a vital role in this data generation, but most of their data is unstructured, requiring an enormous effort to extract important information. Thus, automated intelligent event detection mechanisms are invaluable to the community. In this research, we focus on identifying event details at the sentence and token levels from news articles, considering their fine granularity. Previous research has proposed various approaches ranging from traditional machine learning to deep learning, targeting event detection at these levels. Among these approaches, transformer-based approaches performed best, utilising transformers{\textquoteright} transferability and context awareness, and achieved state-of-the-art results. However, they considered sentence and token level tasks as separate tasks even though their interconnections can be utilised for mutual task improvements. To fill this gap, we propose a novel learning strategy named Two-phase Transfer Learning (TTL) based on transformers, which allows the model to utilise the knowledge from a task at a particular data granularity for another task at different data granularity, and evaluate its performance in sentence and token level event detection. Also, we empirically evaluate how the event detection performance can be improved for different languages (high- and low-resource), involving monolingual and multilingual pre-trained transformers and language-based learning strategies along with the proposed learning strategy. Our findings mainly indicate the effectiveness of multilingual models in low-resource language event detection. Also, TTL can further improve model performance, depending on the involved tasks{\textquoteright} learning order and their relatedness concerning final predictions.",
author = "Hansi Hettiarachchi and Mariam Adedoyin-Olowe and Jagdev Bhogal and Gaber, {Mohamed Medhat}",
year = "2023",
month = mar,
day = "8",
doi = "10.1007/s13042-023-01795-9",
language = "English",
volume = "14",
pages = "2739--2760",
journal = "International Journal of Machine Learning and Cybernetics",
}
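The two-phase transfer learning idea described in the abstract — fine-tune a shared transformer encoder on a task at one data granularity, then reuse that encoder for a task at another granularity — can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: `TinyEncoder` is a stand-in for the pre-trained monolingual/multilingual transformers the paper uses, the heads and toy data are invented for illustration, and the training loop is reduced to a single step per phase.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Stand-in for a pre-trained transformer encoder (e.g. a BERT-style model)."""
    def __init__(self, vocab_size=100, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, ids):                     # ids: (batch, seq_len)
        return self.encoder(self.embed(ids))    # (batch, seq_len, dim)

def train_step(encoder, head, ids, labels, pool):
    """One fine-tuning step; pool=True gives one prediction per sentence,
    pool=False gives one prediction per token."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
    hidden = encoder(ids)
    logits = head(hidden.mean(dim=1)) if pool else head(hidden)
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), labels.reshape(-1))
    loss.backward()
    opt.step()
    opt.zero_grad()
    return loss.item()

encoder = TinyEncoder()
ids = torch.randint(0, 100, (8, 12))            # toy batch of 8 "sentences"

# Phase 1: sentence-level event detection (one event/non-event label per sentence).
sent_head = nn.Linear(32, 2)
sent_labels = torch.randint(0, 2, (8,))
train_step(encoder, sent_head, ids, sent_labels, pool=True)

# Phase 2: the SAME encoder is carried over to token-level event detection,
# transferring knowledge from the coarser task to the finer-grained one.
tok_head = nn.Linear(32, 3)                     # e.g. 3 token tags
tok_labels = torch.randint(0, 3, (8, 12))
loss = train_step(encoder, tok_head, ids, tok_labels, pool=False)
```

Per the abstract, the benefit of this ordering depends on how related the two tasks are and which is learned first; the sketch only shows the mechanics of sharing the encoder across phases.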

RIS

TY - JOUR

T1 - TTL: transformer-based two-phase transfer learning for cross-lingual news event detection

AU - Hettiarachchi, Hansi

AU - Adedoyin-Olowe, Mariam

AU - Bhogal, Jagdev

AU - Gaber, Mohamed Medhat

PY - 2023/3/8

Y1 - 2023/3/8

N2 - Today, we have access to a vast data amount, especially on the internet. Online news agencies play a vital role in this data generation, but most of their data is unstructured, requiring an enormous effort to extract important information. Thus, automated intelligent event detection mechanisms are invaluable to the community. In this research, we focus on identifying event details at the sentence and token levels from news articles, considering their fine granularity. Previous research has proposed various approaches ranging from traditional machine learning to deep learning, targeting event detection at these levels. Among these approaches, transformer-based approaches performed best, utilising transformers’ transferability and context awareness, and achieved state-of-the-art results. However, they considered sentence and token level tasks as separate tasks even though their interconnections can be utilised for mutual task improvements. To fill this gap, we propose a novel learning strategy named Two-phase Transfer Learning (TTL) based on transformers, which allows the model to utilise the knowledge from a task at a particular data granularity for another task at different data granularity, and evaluate its performance in sentence and token level event detection. Also, we empirically evaluate how the event detection performance can be improved for different languages (high- and low-resource), involving monolingual and multilingual pre-trained transformers and language-based learning strategies along with the proposed learning strategy. Our findings mainly indicate the effectiveness of multilingual models in low-resource language event detection. Also, TTL can further improve model performance, depending on the involved tasks’ learning order and their relatedness concerning final predictions.

AB - Today, we have access to a vast data amount, especially on the internet. Online news agencies play a vital role in this data generation, but most of their data is unstructured, requiring an enormous effort to extract important information. Thus, automated intelligent event detection mechanisms are invaluable to the community. In this research, we focus on identifying event details at the sentence and token levels from news articles, considering their fine granularity. Previous research has proposed various approaches ranging from traditional machine learning to deep learning, targeting event detection at these levels. Among these approaches, transformer-based approaches performed best, utilising transformers’ transferability and context awareness, and achieved state-of-the-art results. However, they considered sentence and token level tasks as separate tasks even though their interconnections can be utilised for mutual task improvements. To fill this gap, we propose a novel learning strategy named Two-phase Transfer Learning (TTL) based on transformers, which allows the model to utilise the knowledge from a task at a particular data granularity for another task at different data granularity, and evaluate its performance in sentence and token level event detection. Also, we empirically evaluate how the event detection performance can be improved for different languages (high- and low-resource), involving monolingual and multilingual pre-trained transformers and language-based learning strategies along with the proposed learning strategy. Our findings mainly indicate the effectiveness of multilingual models in low-resource language event detection. Also, TTL can further improve model performance, depending on the involved tasks’ learning order and their relatedness concerning final predictions.

U2 - 10.1007/s13042-023-01795-9

DO - 10.1007/s13042-023-01795-9

M3 - Journal article

VL - 14

SP - 2739

EP - 2760

JO - International Journal of Machine Learning and Cybernetics

JF - International Journal of Machine Learning and Cybernetics

ER -