Research output: Contribution to Journal/Magazine › Journal article › peer-review
| Field | Value |
| --- | --- |
| Article number | 20 |
| Journal publication date | 3/03/2021 |
| Journal | Statistics and Computing |
| Issue number | 3 |
| Volume | 31 |
| Number of pages | 13 |
| Pages (from-to) | 1-13 |
| Publication status | Published |
| Original language | English |
Bayesian additive regression trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART places regularisation priors on a set of trees that act as weak learners, and it is highly flexible for prediction in the presence of nonlinearities and high-order interactions. In this paper, we introduce an extension of BART, called model trees BART (MOTR-BART), that fits piecewise linear functions at the terminal nodes instead of piecewise constants. In MOTR-BART, rather than a single predicted value per terminal node, a linear predictor is estimated using the covariates that have been used as split variables in the corresponding tree. In our approach, local linearities are captured more efficiently, and fewer trees are required to achieve performance equal to or better than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code for the MOTR-BART implementation is available at https://github.com/ebprado/MOTR-BART.
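To illustrate the core idea of the abstract, the minimal Python sketch below contrasts a BART-style terminal node (a single constant prediction) with a MOTR-BART-style terminal node (a linear predictor in the covariate used as the split variable), for one split of one tree. This is not the authors' implementation (which is in R, performs Bayesian inference, and uses an ensemble of trees); the function names and the toy data are hypothetical and chosen only to show why a linear leaf captures local linearity with fewer nodes.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (simple linear regression)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b  # intercept, slope


def predict_constant(ys_leaf):
    # BART-style leaf: one constant value (here, the leaf mean).
    return sum(ys_leaf) / len(ys_leaf)


def predict_linear(xs_leaf, ys_leaf, x_new):
    # MOTR-BART-style leaf: a linear predictor in the split covariate.
    a, b = fit_line(xs_leaf, ys_leaf)
    return a + b * x_new


# Toy data for one leaf (observations falling left of a split on x):
# y is exactly linear in x here, so the linear leaf recovers it,
# while the constant leaf can only return the mean.
xs = [0.0, 0.1, 0.2, 0.3, 0.4]
ys = [2 * x for x in xs]  # true relationship in this leaf: y = 2x

print(predict_constant(ys))            # leaf mean: 0.4
print(predict_linear(xs, ys, 0.25))    # linear leaf at x = 0.25: 0.5
```

In the full method, each terminal node's linear predictor uses only the covariates that appeared as split variables along the path to that node, so capturing a locally linear trend needs one linear leaf rather than a staircase of many constant leaves.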