

Bayesian additive regression trees with model trees

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Article number: 20
Journal publication date: 3/03/2021
Journal: Statistics and Computing
Issue number: 3
Volume: 31
Number of pages: 13
Pages (from-to): 1-13
Publication status: Published
Original language: English

Abstract

Bayesian additive regression trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART places regularisation priors on a set of trees that act as weak learners, making it highly flexible for prediction in the presence of nonlinearity and high-order interactions. In this paper, we introduce an extension of BART, called model trees BART (MOTR-BART), which uses piecewise linear functions at the terminal nodes instead of piecewise constants. In MOTR-BART, rather than predicting a single value at each terminal node, a linear predictor is estimated from the covariates that have been used as split variables in the corresponding tree. In our approach, local linearities are captured more efficiently, and fewer trees are required to achieve performance equal to or better than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code for the MOTR-BART implementation is available at https://github.com/ebprado/MOTR-BART.
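The core idea of the abstract — replacing a constant prediction at each terminal node with a linear predictor in the node's split covariate — can be illustrated with a minimal sketch. This is a hypothetical Python analogue for a single tree with one split (the paper's actual implementation is in R, uses many trees, and places priors on the linear coefficients; none of that is reproduced here). The split point and data-generating process below are assumptions for illustration only.

```python
import numpy as np

def fit_leaf_constant(y):
    # BART-style leaf: a single constant (the node mean) per terminal node
    return np.mean(y)

def fit_leaf_linear(x, y):
    # MOTR-BART-style leaf: intercept + slope on the split covariate,
    # here estimated by ordinary least squares (the paper uses a Bayesian prior)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # (intercept, slope)

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = 2.0 * x + rng.normal(0.0, 0.1, 200)  # a locally linear signal

split = 0.0  # hypothetical split point of a single stump
left, right = x < split, x >= split

# Constant leaves need many further splits to track a slope;
# linear leaves capture it with a single split.
sse_const = sum(np.sum((y[m] - fit_leaf_constant(y[m])) ** 2)
                for m in (left, right))
sse_lin = 0.0
for m in (left, right):
    b = fit_leaf_linear(x[m], y[m])
    sse_lin += np.sum((y[m] - (b[0] + b[1] * x[m])) ** 2)

print(sse_lin < sse_const)
```

On this linear signal the linear leaves leave far less residual error than the constant leaves, which is the intuition behind needing fewer trees in MOTR-BART.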