Splitting stump forests: tree ensemble compression for edge devices (extended version)

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

Splitting stump forests: tree ensemble compression for edge devices (extended version). / Alkhoury, Fouad; Buschjäger, Sebastian; Welke, Pascal.
In: Machine Learning, Vol. 114, No. 10, 219, 31.10.2025.

Harvard

Alkhoury, F., Buschjäger, S. and Welke, P. (2025) 'Splitting stump forests: tree ensemble compression for edge devices (extended version)', Machine Learning, 114(10), 219. doi: 10.1007/s10994-025-06866-2

APA

Alkhoury, F., Buschjäger, S., & Welke, P. (2025). Splitting stump forests: tree ensemble compression for edge devices (extended version). Machine Learning, 114(10), Article 219. Advance online publication. https://doi.org/10.1007/s10994-025-06866-2

Vancouver

Alkhoury F, Buschjäger S, Welke P. Splitting stump forests: tree ensemble compression for edge devices (extended version). Machine Learning. 2025 Oct 31;114(10):219. Epub 2025 Aug 27. doi: 10.1007/s10994-025-06866-2

Author

Alkhoury, Fouad; Buschjäger, Sebastian; Welke, Pascal. / Splitting stump forests: tree ensemble compression for edge devices (extended version). In: Machine Learning. 2025; Vol. 114, No. 10.

Bibtex

@article{3ef349dd6e374c14b91ef71d33146723,
title = "Splitting stump forests: tree ensemble compression for edge devices (extended version)",
abstract = "We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.",
keywords = "Edge devices, Random forests, Ensemble compression",
author = "Fouad Alkhoury and Sebastian Buschj{\"a}ger and Pascal Welke",
year = "2025",
month = aug,
day = "27",
doi = "10.1007/s10994-025-06866-2",
language = "English",
volume = "114",
journal = "Machine Learning",
issn = "0885-6125",
publisher = "Springer Netherlands",
number = "10",
}
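The abstract describes the method in two steps: select forest nodes that split the arriving training data evenly, then fit a linear model on the binary representation those "splitting stumps" induce. The following is a minimal sketch of that idea, not the authors' implementation; the balance threshold (0.4), dataset, and model choices are illustrative assumptions.

```python
# Hedged sketch of the splitting-stump-forest idea from the abstract:
# 1) pick well-balanced split nodes from a trained random forest,
# 2) fit a linear model on the resulting binary stump representation.
# The 0.4 balance threshold and synthetic data are assumptions, not the paper's settings.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
forest = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

stumps = []  # (feature index, threshold) of each selected node
for tree in forest.estimators_:
    t = tree.tree_
    for node in range(t.node_count):
        if t.children_left[node] == -1:  # leaf node, no split
            continue
        left = t.n_node_samples[t.children_left[node]]
        right = t.n_node_samples[t.children_right[node]]
        balance = min(left, right) / (left + right)
        if balance >= 0.4:  # node splits its data roughly evenly
            stumps.append((t.feature[node], t.threshold[node]))

def transform(X):
    """Binary representation: one indicator column per selected stump."""
    return np.column_stack([(X[:, f] <= thr).astype(float) for f, thr in stumps])

linear = LogisticRegression(max_iter=1000).fit(transform(X), y)
print(len(stumps), linear.score(transform(X), y))
```

Storing only (feature, threshold) pairs plus linear weights is far smaller than the full forest, which is the compression effect the abstract reports.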

RIS

TY - JOUR

T1 - Splitting stump forests

T2 - tree ensemble compression for edge devices (extended version)

AU - Alkhoury, Fouad

AU - Buschjäger, Sebastian

AU - Welke, Pascal

PY - 2025/8/27

Y1 - 2025/8/27

N2 - We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.

AB - We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.

KW - Edge devices

KW - Random forests

KW - Ensemble compression

U2 - 10.1007/s10994-025-06866-2

DO - 10.1007/s10994-025-06866-2

M3 - Journal article

VL - 114

JO - Machine Learning

JF - Machine Learning

SN - 0885-6125

IS - 10

M1 - 219

ER -