
Electronic data

  • RANLP 2023 Saxena

    Accepted author manuscript, 214 KB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models

Research output: Contribution to conference - Without ISBN/ISSN. Conference paper, peer-reviewed.

Published

Standard

Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models. / Saxena, Parth; El-Haj, Mahmoud.
2023. Paper presented at 14th Conference on Recent Advances in Natural Language Processing, Varna, Bulgaria.


Harvard

Saxena, P & El-Haj, M 2023, 'Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models', Paper presented at 14th Conference on Recent Advances in Natural Language Processing, Varna, Bulgaria, 4/09/23 - 6/09/23.

APA

Saxena, P., & El-Haj, M. (2023). Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models. Paper presented at 14th Conference on Recent Advances in Natural Language Processing, Varna, Bulgaria.

Vancouver

Saxena P, El-Haj M. Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models. 2023. Paper presented at 14th Conference on Recent Advances in Natural Language Processing, Varna, Bulgaria.

Author

Saxena, Parth ; El-Haj, Mahmoud. / Exploring Abstractive Text Summarisation for Podcasts : A Comparative Study of BART and T5 Models. Paper presented at 14th Conference on Recent Advances in Natural Language Processing, Varna, Bulgaria. 11 p.

Bibtex

@conference{c6489137fb4f4ee9a46147575a89aec8,
title = "Exploring Abstractive Text Summarisation for Podcasts: A Comparative Study of BART and T5 Models",
abstract = "Podcasts have become increasingly popular in recent years, resulting in a massive amount of audio content being produced every day. Efficient summarisation of podcast episodes can enable better content management and discovery for users. In this paper, we explore the use of abstractive text summarisation methods to generate high-quality summaries of podcast episodes. We use the pre-trained models BART and T5, fine-tuning them on a dataset of Spotify's 100K podcasts. We evaluate our models using automated metrics and human evaluation, and find that the BART model fine-tuned on the podcast dataset achieved higher ROUGE-1 and ROUGE-L scores than the other models, while the T5 model performed better in terms of semantic meaning. The human evaluation indicates that both models produced high-quality summaries that were well received by participants. Our study demonstrates the effectiveness of abstractive summarisation methods for podcast episodes and offers insights for improving the summarisation of audio content.",
author = "Parth Saxena and Mahmoud El-Haj",
year = "2023",
month = sep,
day = "6",
language = "English",
note = "14th Conference on Recent Advances in Natural Language Processing, RANLP 2023 ; Conference date: 04-09-2023 Through 06-09-2023",
url = "http://ranlp.org/ranlp2023/",

}

RIS

TY - CONF

T1 - Exploring Abstractive Text Summarisation for Podcasts

T2 - 14th Conference on Recent Advances in Natural Language Processing

AU - Saxena, Parth

AU - El-Haj, Mahmoud

PY - 2023/9/6

Y1 - 2023/9/6

N2 - Podcasts have become increasingly popular in recent years, resulting in a massive amount of audio content being produced every day. Efficient summarisation of podcast episodes can enable better content management and discovery for users. In this paper, we explore the use of abstractive text summarisation methods to generate high-quality summaries of podcast episodes. We use the pre-trained models BART and T5, fine-tuning them on a dataset of Spotify's 100K podcasts. We evaluate our models using automated metrics and human evaluation, and find that the BART model fine-tuned on the podcast dataset achieved higher ROUGE-1 and ROUGE-L scores than the other models, while the T5 model performed better in terms of semantic meaning. The human evaluation indicates that both models produced high-quality summaries that were well received by participants. Our study demonstrates the effectiveness of abstractive summarisation methods for podcast episodes and offers insights for improving the summarisation of audio content.

AB - Podcasts have become increasingly popular in recent years, resulting in a massive amount of audio content being produced every day. Efficient summarisation of podcast episodes can enable better content management and discovery for users. In this paper, we explore the use of abstractive text summarisation methods to generate high-quality summaries of podcast episodes. We use the pre-trained models BART and T5, fine-tuning them on a dataset of Spotify's 100K podcasts. We evaluate our models using automated metrics and human evaluation, and find that the BART model fine-tuned on the podcast dataset achieved higher ROUGE-1 and ROUGE-L scores than the other models, while the T5 model performed better in terms of semantic meaning. The human evaluation indicates that both models produced high-quality summaries that were well received by participants. Our study demonstrates the effectiveness of abstractive summarisation methods for podcast episodes and offers insights for improving the summarisation of audio content.

M3 - Conference paper

Y2 - 4 September 2023 through 6 September 2023

ER -
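
The abstract reports ROUGE-1 and ROUGE-L comparisons between the fine-tuned BART and T5 models. As a hedged illustration of what those two metrics measure (a minimal from-scratch sketch, not the authors' evaluation code; published studies typically use a packaged implementation such as Google's `rouge-score` library), the F1 variants can be computed as follows:

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram overlap between candidate and reference summaries."""
    cand, ref = candidate.split(), reference.split()
    if not cand or not ref:
        return 0.0
    # Clipped overlap: each reference unigram counts at most as often as it appears.
    overlap = sum((Counter(cand) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(cand), overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

def rouge_l(candidate: str, reference: str) -> float:
    """ROUGE-L F1: based on the longest common subsequence (LCS) of tokens."""
    cand, ref = candidate.split(), reference.split()
    m, n = len(cand), len(ref)
    if m == 0 or n == 0:
        return 0.0
    # Standard dynamic-programming LCS table.
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if cand[i] == ref[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    lcs = dp[m][n]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / m, lcs / n
    return 2 * precision * recall / (precision + recall)
```

For example, scoring the candidate "the cat sat" against the reference "the cat sat on the mat" rewards recalled reference tokens while penalising length mismatch; both metrics here use the balanced F1 combination of precision and recall.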