Final published version, 446 KB, PDF document
Available under license: CC BY: Creative Commons Attribution 4.0 International License
Research output: Working paper › Preprint
TY - UNPB
T1 - Semi-Supervised Learning guided by the Generalized Bayes Rule under Soft Revision
AU - Dietrich, Stefan
AU - Rodemann, Julian
AU - Jansen, Christoph
N1 - Accepted at the 11th International Conference on Soft Methods in Probability and Statistics (SMPS) 2024
PY - 2024/5/24
Y1 - 2024/5/24
N2 - We provide a theoretical and computational investigation of the Gamma-Maximin method with soft revision, which was recently proposed as a robust criterion for pseudo-label selection (PLS) in semi-supervised learning. In contrast to traditional methods for PLS, we use credal sets of priors ("generalized Bayes") to represent epistemic modeling uncertainty. These credal sets are then updated by the Gamma-Maximin method with soft revision, and we select the pseudo-labeled data that are most likely in light of the least favorable distribution from the updated credal set. We formalize the task of finding optimal pseudo-labeled data w.r.t. the Gamma-Maximin method with soft revision as an optimization problem. A concrete implementation for the class of logistic models then allows us to compare the predictive power of the method with competing approaches. We observe that the Gamma-Maximin method with soft revision can achieve very promising results, especially when the proportion of labeled data is low.
AB - We provide a theoretical and computational investigation of the Gamma-Maximin method with soft revision, which was recently proposed as a robust criterion for pseudo-label selection (PLS) in semi-supervised learning. In contrast to traditional methods for PLS, we use credal sets of priors ("generalized Bayes") to represent epistemic modeling uncertainty. These credal sets are then updated by the Gamma-Maximin method with soft revision, and we select the pseudo-labeled data that are most likely in light of the least favorable distribution from the updated credal set. We formalize the task of finding optimal pseudo-labeled data w.r.t. the Gamma-Maximin method with soft revision as an optimization problem. A concrete implementation for the class of logistic models then allows us to compare the predictive power of the method with competing approaches. We observe that the Gamma-Maximin method with soft revision can achieve very promising results, especially when the proportion of labeled data is low.
KW - stat.ML
KW - cs.AI
KW - cs.LG
KW - math.ST
KW - stat.ME
KW - stat.TH
KW - 62C12 62C10
KW - I.2.6; G.3
M3 - Preprint
BT - Semi-Supervised Learning guided by the Generalized Bayes Rule under Soft Revision
PB - arXiv
ER -
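
The abstract above describes selecting pseudo-labeled data that are most likely under the least favorable distribution from a credal set of priors. The following is a minimal, illustrative sketch of that Gamma-maximin selection idea for logistic models; it is not the authors' implementation and omits the soft-revision update. The finite grid candidate_priors (encoded here as L2 penalty strengths standing in for Gaussian priors on the weights), the helper pseudo_label_scores, and the use of predicted class probabilities in place of the paper's pseudo-posterior criterion are all simplifying assumptions for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_scores(X_lab, y_lab, X_unlab, candidate_priors=(0.1, 1.0, 10.0)):
    """For each unlabeled point, return the least-favorable (minimum over the
    finite credal set of priors) probability of its best label, plus that label."""
    # One logistic model per prior in the (finite) credal set; the regularization
    # strength C is a crude stand-in for a prior on the coefficients (assumption).
    models = [LogisticRegression(C=c).fit(X_lab, y_lab) for c in candidate_priors]
    scores = np.zeros(len(X_unlab))
    labels = np.zeros(len(X_unlab), dtype=int)
    for i, x in enumerate(X_unlab):
        probs = np.array([m.predict_proba(x.reshape(1, -1))[0] for m in models])
        # Lower envelope over priors per class, then maximin: pick the label
        # whose worst-case probability is largest.
        lower = probs.min(axis=0)
        labels[i] = lower.argmax()
        scores[i] = lower.max()
    return scores, labels

if __name__ == "__main__":
    # Tiny synthetic example: greedily select the unlabeled point with the
    # highest maximin score as the next pseudo-labeled data point.
    rng = np.random.default_rng(0)
    X_lab = rng.normal(size=(20, 2))
    y_lab = (X_lab[:, 0] > 0).astype(int)
    X_unlab = rng.normal(size=(100, 2))
    scores, labels = pseudo_label_scores(X_lab, y_lab, X_unlab)
    best = scores.argmax()
    print(f"Select point {best} with pseudo-label {labels[best]} (score {scores[best]:.3f})")

In this sketch the "credal set" is just a handful of regularization settings, whereas the paper works with generalized Bayesian updating under soft revision; the sketch only conveys the maximin flavor of the selection criterion.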