Final published version
Licence: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Splitting stump forests
T2 - tree ensemble compression for edge devices (extended version)
AU - Alkhoury, Fouad
AU - Buschjäger, Sebastian
AU - Welke, Pascal
PY - 2025/8/27
Y1 - 2025/8/27
N2 - We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.
AB - We introduce Splitting Stump Forests—small ensembles of weak learners extracted from a trained random forest. The high memory consumption of random forests renders them unfit for resource-constrained devices. We show empirically that we can significantly reduce the model size and inference time by selecting nodes that evenly split the arriving training data and applying a linear model on the resulting representation. Our extensive empirical evaluation indicates that Splitting Stump Forests outperform random forests and state-of-the-art compression methods on memory-limited embedded devices.
KW - Edge devices
KW - Random forests
KW - Ensemble compression
U2 - 10.1007/s10994-025-06866-2
DO - 10.1007/s10994-025-06866-2
M3 - Journal article
VL - 114
JO - Machine Learning
JF - Machine Learning
SN - 0885-6125
IS - 10
M1 - 219
ER -