

Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation. / Cassidy, Bill; McBride, Christian; Kendrick, Connah et al.
In: Scientific Reports, Vol. 15, No. 1, 18810, 29.05.2025.

Harvard

Cassidy, B, McBride, C, Kendrick, C, Reeves, ND, Pappachan, JM, Raad, S & Yap, MH 2025, 'Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation', Scientific Reports, vol. 15, no. 1, 18810. https://doi.org/10.1038/s41598-025-03393-x

APA

Cassidy, B., McBride, C., Kendrick, C., Reeves, N. D., Pappachan, J. M., Raad, S., & Yap, M. H. (2025). Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation. Scientific Reports, 15(1), Article 18810. https://doi.org/10.1038/s41598-025-03393-x

Vancouver

Cassidy B, McBride C, Kendrick C, Reeves ND, Pappachan JM, Raad S et al. Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation. Scientific Reports. 2025 May 29;15(1):18810. doi: 10.1038/s41598-025-03393-x

Author

Cassidy, Bill ; McBride, Christian ; Kendrick, Connah et al. / Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation. In: Scientific Reports. 2025 ; Vol. 15, No. 1.

BibTeX

@article{e3989b55d6c7448f9df888142b4d458c,
title = "Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation",
abstract = "Growing rates of chronic wound occurrence, especially in patients with diabetes, has become a recent concerning trend. Chronic wounds are difficult and costly to treat, and have become a serious burden on health care systems worldwide. Innovative deep learning methods for the detection and monitoring of such wounds have the potential to reduce the impact to patients and clinicians. We present a novel multimodal segmentation method which allows for the introduction of patient metadata into the training workflow whereby the patient data are expressed as Gaussian random fields. Our results indicate that the proposed method improved performance when utilising multiple models, each trained on different metadata categories. Using the Diabetic Foot Ulcer Challenge 2022 test set, when compared to the baseline results (intersection over union = 0.4670, Dice similarity coefficient = 0.5908) we demonstrate improvements of +0.0220 and +0.0229 for intersection over union and Dice similarity coefficient respectively. This paper presents the first study to focus on integrating patient data into a chronic wound segmentation workflow. Our results show significant performance gains when training individual models using specific metadata categories, followed by average merging of prediction masks using distance transforms. All source code for this study is available at: https://github.com/mmu-dermatology-research/multimodal-grf",
author = "Bill Cassidy and Christian McBride and Connah Kendrick and Reeves, {Neil D.} and Pappachan, {Joseph M.} and Shaghayegh Raad and Yap, {Moi Hoon}",
year = "2025",
month = may,
day = "29",
doi = "10.1038/s41598-025-03393-x",
language = "English",
volume = "15",
journal = "Scientific Reports",
issn = "2045-2322",
publisher = "Nature Publishing Group",
number = "1",

}
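
The abstract above states that patient metadata are expressed as Gaussian random fields before being introduced into the segmentation training workflow. As a rough, illustrative sketch of that idea only, the Python snippet below maps an encoded metadata value to a smoothed-noise Gaussian random field the size of an input image; the function name metadata_to_grf, the integer encoding, and the use of the value to set the seed and correlation length are assumptions made here for illustration, not details taken from the authors' released code at https://github.com/mmu-dermatology-research/multimodal-grf.

import numpy as np
from scipy.ndimage import gaussian_filter

def metadata_to_grf(category_value, shape=(480, 640), base_sigma=4.0, seed=0):
    # Hypothetical mapping: an encoded metadata value -> a 2D Gaussian random field.
    # Deterministic per value, so the same metadata always yields the same field.
    rng = np.random.default_rng(seed + category_value)
    noise = rng.standard_normal(shape)                      # white Gaussian noise
    sigma = base_sigma * (1 + category_value)               # value controls smoothness (assumed)
    field = gaussian_filter(noise, sigma=sigma)             # spatially correlated Gaussian field
    field = (field - field.mean()) / (field.std() + 1e-8)   # zero mean, unit variance
    return field.astype(np.float32)

# Hypothetical usage: build an extra input channel for a wound image from one metadata code.
grf_channel = metadata_to_grf(category_value=3)

Under this reading, the resulting field could be concatenated with the RGB wound image as an additional input channel, which is one simple way a segmentation network could consume non-imaging patient data.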

RIS

TY - JOUR

T1 - Gaussian random fields as an abstract representation of patient metadata for multimodal medical image segmentation

AU - Cassidy, Bill

AU - McBride, Christian

AU - Kendrick, Connah

AU - Reeves, Neil D.

AU - Pappachan, Joseph M.

AU - Raad, Shaghayegh

AU - Yap, Moi Hoon

PY - 2025/5/29

Y1 - 2025/5/29

N2 - Growing rates of chronic wound occurrence, especially in patients with diabetes, have become a concerning recent trend. Chronic wounds are difficult and costly to treat, and have become a serious burden on health care systems worldwide. Innovative deep learning methods for the detection and monitoring of such wounds have the potential to reduce the impact on patients and clinicians. We present a novel multimodal segmentation method which allows for the introduction of patient metadata into the training workflow, whereby the patient data are expressed as Gaussian random fields. Our results indicate that the proposed method improved performance when utilising multiple models, each trained on different metadata categories. Using the Diabetic Foot Ulcer Challenge 2022 test set, when compared to the baseline results (intersection over union = 0.4670, Dice similarity coefficient = 0.5908), we demonstrate improvements of +0.0220 and +0.0229 for intersection over union and Dice similarity coefficient, respectively. This paper presents the first study to focus on integrating patient data into a chronic wound segmentation workflow. Our results show significant performance gains when training individual models using specific metadata categories, followed by average merging of prediction masks using distance transforms. All source code for this study is available at: https://github.com/mmu-dermatology-research/multimodal-grf

AB - Growing rates of chronic wound occurrence, especially in patients with diabetes, have become a concerning recent trend. Chronic wounds are difficult and costly to treat, and have become a serious burden on health care systems worldwide. Innovative deep learning methods for the detection and monitoring of such wounds have the potential to reduce the impact on patients and clinicians. We present a novel multimodal segmentation method which allows for the introduction of patient metadata into the training workflow, whereby the patient data are expressed as Gaussian random fields. Our results indicate that the proposed method improved performance when utilising multiple models, each trained on different metadata categories. Using the Diabetic Foot Ulcer Challenge 2022 test set, when compared to the baseline results (intersection over union = 0.4670, Dice similarity coefficient = 0.5908), we demonstrate improvements of +0.0220 and +0.0229 for intersection over union and Dice similarity coefficient, respectively. This paper presents the first study to focus on integrating patient data into a chronic wound segmentation workflow. Our results show significant performance gains when training individual models using specific metadata categories, followed by average merging of prediction masks using distance transforms. All source code for this study is available at: https://github.com/mmu-dermatology-research/multimodal-grf

U2 - 10.1038/s41598-025-03393-x

DO - 10.1038/s41598-025-03393-x

M3 - Journal article

VL - 15

JO - Scientific Reports

JF - Scientific Reports

SN - 2045-2322

IS - 1

M1 - 18810

ER -
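
The abstract also describes average merging of prediction masks using distance transforms, applied to the outputs of individual models trained on specific metadata categories. The sketch below shows one plausible interpretation of that step, averaging signed Euclidean distance maps of the binary predictions and re-thresholding the result; it is an assumed reconstruction for illustration, not the published implementation.

import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    # Positive inside the predicted wound region, negative outside it.
    mask = mask.astype(bool)
    inside = distance_transform_edt(mask)    # distance to the nearest background pixel
    outside = distance_transform_edt(~mask)  # distance to the nearest foreground pixel
    return inside - outside

def merge_masks(masks):
    # Average the signed distance maps from all models, then recover a binary mask.
    average = np.mean([signed_distance(m) for m in masks], axis=0)
    return (average > 0).astype(np.uint8)

# Hypothetical usage: fuse predictions from three per-metadata-category models.
rng = np.random.default_rng(0)
predictions = [(rng.random((64, 64)) > 0.5).astype(np.uint8) for _ in range(3)]
fused = merge_masks(predictions)

Averaging in signed-distance space, rather than voting on raw pixels, tends to produce smoother fused boundaries because each model's prediction contributes shape information beyond its binary labels.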