
IgboBERT Models: Building and Training Transformer Models for the Igbo Language

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 20/06/2022
Host publication: LREC 2022 Conference Proceedings
Editors: Nicoletta Calzolari
Place of Publication: Paris
Publisher: European Language Resources Association (ELRA)
Pages: 5114–5122
Number of pages: 9
ISBN (electronic): 9781095546726
Original language: English
Event: 13th Language Resources and Evaluation Conference - Marseille, France
Duration: 20/06/2022 – 25/06/2022
https://lrec2022.lrec-conf.org/en/

Conference

Conference: 13th Language Resources and Evaluation Conference
Abbreviated title: LREC 2022
Country/Territory: France
City: Marseille
Period: 20/06/22 – 25/06/22
Internet address: https://lrec2022.lrec-conf.org/en/

Abstract

This work presents a standard Igbo named entity recognition (IgboNER) dataset as well as the results from training and fine-tuning state-of-the-art transformer models for IgboNER. We discuss the dataset creation process: data collection, annotation, and quality checking. We also present the experimental process of building an IgboBERT language model from scratch, and of fine-tuning it, along with other non-Igbo pre-trained models, on the downstream IgboNER task. Our results show that, although the IgboNER task benefits greatly from fine-tuning large transformer models, fine-tuning a transformer model built from scratch on comparatively little Igbo text data still yields quite decent results. This work will contribute to IgboNLP in particular, as well as to the wider African and low-resource NLP effort.
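Fine-tuning a BERT-style model for NER, as the abstract describes, requires aligning word-level entity labels with the subword tokens the model actually sees. The sketch below illustrates that alignment step in plain Python; the toy tokenizer, example sentence, and label set are illustrative assumptions, not taken from the paper, and the `-100` ignore-label follows the common Hugging Face convention rather than anything the authors specify.

```python
# Sketch: align word-level BIO labels to subword tokens, a standard
# preprocessing step when fine-tuning transformers for token
# classification (e.g. an IgboNER setup). Details are assumptions.

def align_labels(words, labels, tokenize):
    """Expand word-level BIO labels to the subword level.

    Only the first subword of each word keeps its label; continuation
    subwords get -100 so the loss function ignores them.
    """
    subword_tokens, subword_labels = [], []
    for word, label in zip(words, labels):
        pieces = tokenize(word)  # e.g. a WordPiece split
        subword_tokens.extend(pieces)
        subword_labels.extend([label] + [-100] * (len(pieces) - 1))
    return subword_tokens, subword_labels


# Toy "tokenizer" standing in for a trained WordPiece model.
def toy_tokenize(word):
    return [word[:3], "##" + word[3:]] if len(word) > 3 else [word]


# Hypothetical Igbo example ("Chinwe lives in Enugu").
tokens, labels = align_labels(
    ["Chinwe", "bi", "na", "Enugu"],
    ["B-PER", "O", "O", "B-LOC"],
    toy_tokenize,
)
```

In a real fine-tuning run, `toy_tokenize` would be replaced by the trained tokenizer of the pre-trained (or from-scratch) model, and the aligned label sequence would be fed to a token-classification head.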