Bayesian additive regression trees with model trees

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Bayesian additive regression trees with model trees. / Batista Do Prado, Estevao; Parnell, Andrew C; de Andrade Moral, Rafael.
In: Statistics and Computing, Vol. 31, No. 3, 20, 03.03.2021, p. 1-13.

Harvard

Batista Do Prado, E, Parnell, AC & de Andrade Moral, R 2021, 'Bayesian additive regression trees with model trees', Statistics and Computing, vol. 31, no. 3, 20, pp. 1-13. https://doi.org/10.1007/s11222-021-09997-3

APA

Batista Do Prado, E., Parnell, A. C., & de Andrade Moral, R. (2021). Bayesian additive regression trees with model trees. Statistics and Computing, 31(3), 1-13. Article 20. https://doi.org/10.1007/s11222-021-09997-3

Vancouver

Batista Do Prado E, Parnell AC, de Andrade Moral R. Bayesian additive regression trees with model trees. Statistics and Computing. 2021 Mar 3;31(3):1-13. 20. doi: 10.1007/s11222-021-09997-3

Author

Batista Do Prado, Estevao; Parnell, Andrew C; de Andrade Moral, Rafael. / Bayesian additive regression trees with model trees. In: Statistics and Computing. 2021; Vol. 31, No. 3. pp. 1-13.

Bibtex

@article{f552188facab4055a639f5be51c34c22,
title = "Bayesian additive regression trees with model trees",
abstract = "Bayesian additive regression trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART assumes regularisation priors on a set of trees that work as weak learners and is very flexible for predicting in the presence of nonlinearity and high-order interactions. In this paper, we introduce an extension of BART, called model trees BART (MOTR-BART), that considers piecewise linear functions at node levels instead of piecewise constants. In MOTR-BART, rather than having a single value at node level for the prediction, a linear predictor is estimated considering the covariates that have been used as the split variables in the corresponding tree. In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code implementing MOTR-BART is available at https://github.com/ebprado/MOTR-BART.",
keywords = "Bayesian non-parametric, Bayesian additive regression trees, Markov chain Monte Carlo (MCMC)",
author = "{Batista Do Prado}, Estevao and Parnell, {Andrew C} and {de Andrade Moral}, Rafael",
year = "2021",
month = mar,
day = "3",
doi = "10.1007/s11222-021-09997-3",
language = "English",
volume = "31",
pages = "1--13",
journal = "Statistics and Computing",
issn = "0960-3174",
publisher = "Springer Netherlands",
number = "3",
}

RIS

TY - JOUR

T1 - Bayesian additive regression trees with model trees

AU - Batista Do Prado, Estevao

AU - Parnell, Andrew C

AU - de Andrade Moral, Rafael

PY - 2021/3/3

Y1 - 2021/3/3

N2 - Bayesian additive regression trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART assumes regularisation priors on a set of trees that work as weak learners and is very flexible for predicting in the presence of nonlinearity and high-order interactions. In this paper, we introduce an extension of BART, called model trees BART (MOTR-BART), that considers piecewise linear functions at node levels instead of piecewise constants. In MOTR-BART, rather than having a single value at node level for the prediction, a linear predictor is estimated considering the covariates that have been used as the split variables in the corresponding tree. In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code implementing MOTR-BART is available at https://github.com/ebprado/MOTR-BART.

AB - Bayesian additive regression trees (BART) is a tree-based machine learning method that has been successfully applied to regression and classification problems. BART assumes regularisation priors on a set of trees that work as weak learners and is very flexible for predicting in the presence of nonlinearity and high-order interactions. In this paper, we introduce an extension of BART, called model trees BART (MOTR-BART), that considers piecewise linear functions at node levels instead of piecewise constants. In MOTR-BART, rather than having a single value at node level for the prediction, a linear predictor is estimated considering the covariates that have been used as the split variables in the corresponding tree. In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART. Via simulation studies and real data applications, we compare MOTR-BART to its main competitors. R code implementing MOTR-BART is available at https://github.com/ebprado/MOTR-BART.

KW - Bayesian non-parametric

KW - Bayesian additive regression trees

KW - Markov chain Monte Carlo (MCMC)

U2 - 10.1007/s11222-021-09997-3

DO - 10.1007/s11222-021-09997-3

M3 - Journal article

VL - 31

SP - 1

EP - 13

JO - Statistics and Computing

JF - Statistics and Computing

SN - 0960-3174

IS - 3

M1 - 20

ER -
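
The abstract describes the core MOTR-BART idea: a leaf predicts with a linear function of the split variables rather than a constant. The following is a minimal, self-contained Python sketch of that contrast for a single split; all function and variable names are hypothetical, and it is not the authors' R implementation (available at https://github.com/ebprado/MOTR-BART).

```python
# Sketch (not the authors' implementation): contrast a standard
# regression-tree leaf, which predicts a constant (the leaf mean),
# with a MOTR-BART-style leaf that fits a linear predictor in the
# split variable. Data are piecewise linear, so linear leaves should
# fit (near-)exactly while constant leaves leave residual error.

def ols_1d(xs, ys):
    """Least-squares fit of y ~ a + b*x for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b  # (intercept, slope)

def sse(preds, ys):
    return sum((p - y) ** 2 for p, y in zip(preds, ys))

# Piecewise-linear data: y = 2x on [0, 0.5), y = 3 - 2x on [0.5, 1].
xs = [i / 20 for i in range(21)]
ys = [2 * x if x < 0.5 else 3 - 2 * x for x in xs]

# A single split at x = 0.5 defines two leaves.
leaves = [[(x, y) for x, y in zip(xs, ys) if x < 0.5],
          [(x, y) for x, y in zip(xs, ys) if x >= 0.5]]

sse_const = sse_linear = 0.0
for leaf in leaves:
    lx, ly = zip(*leaf)
    mean = sum(ly) / len(ly)                  # BART-style constant leaf
    sse_const += sse([mean] * len(ly), ly)
    a, b = ols_1d(lx, ly)                     # MOTR-BART-style linear leaf
    sse_linear += sse([a + b * x for x in lx], ly)

print(f"constant leaves SSE: {sse_const:.4f}")
print(f"linear leaves SSE:   {sse_linear:.4f}")
```

With piecewise-linear data, the linear leaves recover each segment exactly, which is the sense in which the paper argues fewer trees are needed when local linearities are captured at the leaf level.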