
Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out. / Williams, Ben; Balvanera, Santiago M.; Sethi, Sarab S. et al.
In: PLoS Computational Biology, Vol. 21, No. 4, e1013029, 28.04.2025.


Harvard

Williams, B, Balvanera, SM, Sethi, SS, Lamont, TAC, Jompa, J, Prasetya, M, Richardson, L, Chapuis, L, Weschke, E, Hoey, A, Beldade, R, Mills, SC, Haguenauer, A, Zuberer, F, Simpson, SD, Curnick, D, Jones, KE & Papin, JA (ed.) 2025, 'Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out', PLoS Computational Biology, vol. 21, no. 4, e1013029. https://doi.org/10.1371/journal.pcbi.1013029

APA

Williams, B., Balvanera, S. M., Sethi, S. S., Lamont, T. A. C., Jompa, J., Prasetya, M., Richardson, L., Chapuis, L., Weschke, E., Hoey, A., Beldade, R., Mills, S. C., Haguenauer, A., Zuberer, F., Simpson, S. D., Curnick, D., Jones, K. E., & Papin, J. A. (Ed.) (2025). Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out. PLoS Computational Biology, 21(4), Article e1013029. https://doi.org/10.1371/journal.pcbi.1013029

Vancouver

Williams B, Balvanera SM, Sethi SS, Lamont TAC, Jompa J, Prasetya M et al. Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out. PLoS Computational Biology. 2025 Apr 28;21(4):e1013029. doi: 10.1371/journal.pcbi.1013029

Author

Williams, Ben ; Balvanera, Santiago M. ; Sethi, Sarab S. et al. / Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out. In: PLoS Computational Biology. 2025 ; Vol. 21, No. 4.

Bibtex

@article{c50762dc095042f9bcbcbd01d35df4cd,
title = "Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out",
abstract = "Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three different approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data as it requires orders of magnitude less computational resources whilst achieving near-comparable performance to the T-CNN, with significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat.",
author = "Ben Williams and Balvanera, {Santiago M.} and Sethi, {Sarab S.} and Lamont, {Timothy A.C.} and Jamaluddin Jompa and Mochyudho Prasetya and Laura Richardson and Lucille Chapuis and Emma Weschke and Andrew Hoey and Ricardo Beldade and Mills, {Suzanne C.} and Anne Haguenauer and Frederic Zuberer and Simpson, {Stephen D.} and David Curnick and Jones, {Kate E.}",
editor = "Papin, {Jason A.}",
year = "2025",
month = apr,
day = "28",
doi = "10.1371/journal.pcbi.1013029",
language = "English",
volume = "21",
journal = "PLoS Computational Biology",
issn = "1553-734X",
publisher = "Public Library of Science",
number = "4",
pages = "e1013029",
}

RIS

TY - JOUR

T1 - Unlocking the soundscape of coral reefs with artificial intelligence

T2 - pretrained networks and unsupervised learning win out

AU - Williams, Ben

AU - Balvanera, Santiago M.

AU - Sethi, Sarab S.

AU - Lamont, Timothy A.C.

AU - Jompa, Jamaluddin

AU - Prasetya, Mochyudho

AU - Richardson, Laura

AU - Chapuis, Lucille

AU - Weschke, Emma

AU - Hoey, Andrew

AU - Beldade, Ricardo

AU - Mills, Suzanne C.

AU - Haguenauer, Anne

AU - Zuberer, Frederic

AU - Simpson, Stephen D.

AU - Curnick, David

AU - Jones, Kate E.

A2 - Papin, Jason A.

PY - 2025/4/28

Y1 - 2025/4/28

N2 - Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three different approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data as it requires orders of magnitude less computational resources whilst achieving near-comparable performance to the T-CNN, with significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat.

AB - Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three different approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data as it requires orders of magnitude less computational resources whilst achieving near-comparable performance to the T-CNN, with significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat.

U2 - 10.1371/journal.pcbi.1013029

DO - 10.1371/journal.pcbi.1013029

M3 - Journal article

VL - 21

JO - PLoS Computational Biology

JF - PLoS Computational Biology

SN - 1553-734X

IS - 4

M1 - e1013029

ER -
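
The abstract above describes feeding soundscape feature embeddings into both supervised classifiers and unsupervised clustering. The sketch below illustrates that general workflow only: synthetic Gaussian vectors stand in for real P-CNN embeddings, the two-class setup mimics a task such as high vs. low coral cover, and `LogisticRegression`/`KMeans` are placeholder models of my choosing, not the methods the paper itself reports.

```python
# Illustrative sketch: embeddings -> (a) supervised classification,
# (b) unsupervised clustering. Synthetic 128-d vectors stand in for
# pretrained-CNN (P-CNN) features extracted from reef recordings.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, adjusted_rand_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Two ecological classes (e.g. high vs. low coral cover), 200 "recordings" each.
n_per_class, dim = 200, 128
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(n_per_class, dim)),
    rng.normal(loc=1.5, scale=1.0, size=(n_per_class, dim)),
])
y = np.repeat([0, 1], n_per_class)

# (a) Supervised: train a linear classifier on the embeddings.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))

# (b) Unsupervised: cluster the same embeddings without using labels,
# then check how well clusters recover the classes (adjusted Rand index).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
ari = adjusted_rand_score(y, labels)

print(f"supervised accuracy: {acc:.2f}")
print(f"clustering ARI:      {ari:.2f}")
```

In practice the embedding step (acoustic indices, a pretrained network, or a task-trained CNN) is where the approaches compared in the paper differ; the downstream classifier/clustering machinery stays the same.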