
Optimal Projections for Classification with Naive Bayes

Research output: Working paper › Preprint

Published

Standard

Optimal Projections for Classification with Naive Bayes. / Hofmeyr, David; Kamper, Francois; Melonas, Michail C.
arXiv, 2024.

Research output: Working paper › Preprint

Vancouver

Hofmeyr D, Kamper F, Melonas MC. Optimal Projections for Classification with Naive Bayes. arXiv. 2024.

Author

Hofmeyr, David ; Kamper, Francois ; Melonas, Michail C. / Optimal Projections for Classification with Naive Bayes. arXiv, 2024.

Bibtex

@techreport{04fe1e9317bb48f6b9b77c748c99f45e,
title = "Optimal Projections for Classification with Naive Bayes",
abstract = "In the Naive Bayes classification model the class conditional densities are estimated as the products of their marginal densities along the cardinal basis directions. We study the problem of obtaining an alternative basis for this factorisation with the objective of enhancing the discriminatory power of the associated classification model. We formulate the problem as a projection pursuit to find the optimal linear projection on which to perform classification. Optimality is determined based on the multinomial likelihood within which probabilities are estimated using the Naive Bayes factorisation of the projected data. Projection pursuit offers the added benefits of dimension reduction and visualisation. We discuss an intuitive connection with class conditional independent components analysis, and show how this is realised visually in practical applications. The performance of the resulting classification models is investigated using a large collection of (162) publicly available benchmark data sets and in comparison with relevant alternatives. We find that the proposed approach substantially outperforms other popular probabilistic discriminant analysis models and is highly competitive with Support Vector Machines.",
author = "David Hofmeyr and Francois Kamper and Melonas, {Michail C.}",
year = "2024",
language = "English",
publisher = "Arxiv",
type = "WorkingPaper",
institution = "Arxiv",

}

RIS

TY - UNPB

T1 - Optimal Projections for Classification with Naive Bayes

AU - Hofmeyr, David

AU - Kamper, Francois

AU - Melonas, Michail C.

PY - 2024

Y1 - 2024

N2 - In the Naive Bayes classification model the class conditional densities are estimated as the products of their marginal densities along the cardinal basis directions. We study the problem of obtaining an alternative basis for this factorisation with the objective of enhancing the discriminatory power of the associated classification model. We formulate the problem as a projection pursuit to find the optimal linear projection on which to perform classification. Optimality is determined based on the multinomial likelihood within which probabilities are estimated using the Naive Bayes factorisation of the projected data. Projection pursuit offers the added benefits of dimension reduction and visualisation. We discuss an intuitive connection with class conditional independent components analysis, and show how this is realised visually in practical applications. The performance of the resulting classification models is investigated using a large collection of (162) publicly available benchmark data sets and in comparison with relevant alternatives. We find that the proposed approach substantially outperforms other popular probabilistic discriminant analysis models and is highly competitive with Support Vector Machines.

AB - In the Naive Bayes classification model the class conditional densities are estimated as the products of their marginal densities along the cardinal basis directions. We study the problem of obtaining an alternative basis for this factorisation with the objective of enhancing the discriminatory power of the associated classification model. We formulate the problem as a projection pursuit to find the optimal linear projection on which to perform classification. Optimality is determined based on the multinomial likelihood within which probabilities are estimated using the Naive Bayes factorisation of the projected data. Projection pursuit offers the added benefits of dimension reduction and visualisation. We discuss an intuitive connection with class conditional independent components analysis, and show how this is realised visually in practical applications. The performance of the resulting classification models is investigated using a large collection of (162) publicly available benchmark data sets and in comparison with relevant alternatives. We find that the proposed approach substantially outperforms other popular probabilistic discriminant analysis models and is highly competitive with Support Vector Machines.

M3 - Preprint

BT - Optimal Projections for Classification with Naive Bayes

PB - arXiv

CY - arXiv

ER -
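
Illustrative sketch

The abstract describes a projection pursuit: a linear projection is sought on which to perform Naive Bayes classification, with optimality judged by the multinomial likelihood of the class labels under probabilities obtained from the Naive Bayes factorisation of the projected data. The Python sketch below illustrates this idea under simplifying assumptions that are not taken from the paper: Gaussian marginals on each projected dimension and a generic numerical optimiser (scipy.optimize.minimize). Function names such as fit_projection and nb_log_posteriors are illustrative only, not the authors' implementation.

import numpy as np
from scipy.optimize import minimize

def nb_log_posteriors(Z, y, classes, eps=1e-9):
    """Log posterior of each class for projected data Z, using Gaussian marginals (assumption)."""
    log_joint = np.zeros((Z.shape[0], len(classes)))
    for j, c in enumerate(classes):
        Zc = Z[y == c]
        mu, sd = Zc.mean(axis=0), Zc.std(axis=0) + eps
        # Naive Bayes factorisation: sum of marginal log densities along the projected axes
        log_marg = -0.5 * ((Z - mu) / sd) ** 2 - np.log(sd) - 0.5 * np.log(2 * np.pi)
        log_joint[:, j] = log_marg.sum(axis=1) + np.log(len(Zc) / len(Z))
    # Normalise the joint log probabilities to log posteriors
    return log_joint - np.logaddexp.reduce(log_joint, axis=1, keepdims=True)

def negative_multinomial_loglik(w_flat, X, y, classes, n_dims):
    """Objective: negative log-likelihood of the observed labels under the NB posteriors."""
    W = w_flat.reshape(X.shape[1], n_dims)
    Q, _ = np.linalg.qr(W)           # keep the projection orthonormal
    Z = X @ Q                        # projected data
    log_post = nb_log_posteriors(Z, y, classes)
    idx = np.searchsorted(classes, y)
    return -log_post[np.arange(len(y)), idx].sum()

def fit_projection(X, y, n_dims=2, seed=0):
    """Projection pursuit: optimise the projection matrix with a generic numerical optimiser."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    w0 = rng.standard_normal(X.shape[1] * n_dims)
    res = minimize(negative_multinomial_loglik, w0,
                   args=(X, y, classes, n_dims), method="L-BFGS-B")
    Q, _ = np.linalg.qr(res.x.reshape(X.shape[1], n_dims))
    return Q

if __name__ == "__main__":
    # Toy usage: two Gaussian classes in 5 dimensions, projected to 2 dimensions.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(1, 1, (100, 5))])
    y = np.array([0] * 100 + [1] * 100)
    Q = fit_projection(X, y, n_dims=2)
    print("Learned projection shape:", Q.shape)

The projected data X @ Q can then be visualised or passed to any Naive Bayes classifier, which reflects the dimension-reduction and visualisation benefits mentioned in the abstract; the paper's own estimation and optimisation details differ from this simplified Gaussian sketch.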