Empirical Approach—Introduction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Chapter (peer-reviewed)

Published
Publication date: 1/01/2019
Host publication: Empirical Approach to Machine Learning
Editors: Plamen Angelov, Xiaowei Gu
Place of Publication: Cham
Publisher: Springer-Verlag
Pages: 103-133
Number of pages: 31
ISBN (Electronic): 9783030023843
ISBN (Print): 9783030023836
Original language: English

Publication series

Name: Studies in Computational Intelligence
Publisher: Springer
Volume: 800
ISSN (Print): 1860-949X
ISSN (Electronic): 1860-9503

Abstract

In this chapter, we describe the fundamentals of the proposed new "empirical" approach as a systematic methodology whose nonparametric quantities are derived entirely from the actual data, with no subjective and/or problem-specific assumptions. It has the potential to be a powerful extension of (and/or alternative to) the traditional probability theory, statistical learning and computational intelligence methods. The nonparametric quantities of the proposed empirical approach are: (1) the cumulative proximity; (2) the eccentricity and the standardized eccentricity; (3) the data density; and (4) the typicality. They can be updated recursively on a sample-by-sample basis, and they have unimodal and multimodal, discrete and continuous forms/versions. These quantities are based on ensemble properties of the data and are not limited by prior restrictive assumptions. The discrete unimodal typicality resembles the probability density function, but in a discrete form; the discrete multimodal typicality resembles the probability mass function.
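To make the abstract's quantities concrete, the following is a minimal sketch of how they can be computed, assuming squared Euclidean distance and the definitions commonly used in the empirical data analytics literature (cumulative proximity as the sum of squared distances to all samples; eccentricity as the normalized cumulative proximity; data density as the inverse standardized eccentricity; typicality as the normalized density). The exact formulations in the chapter may differ in detail, so treat this as an illustration, not the chapter's definitive algorithm. The `RecursiveDensity` class illustrates the "recursively updated on a sample-by-sample basis" property via running estimates of the mean and mean squared norm.

```python
import numpy as np


def empirical_quantities(X):
    """Batch computation of the nonparametric quantities for data X (n x d).

    Sketched definitions (squared Euclidean distance assumed):
      cumulative proximity  pi_i  = sum_j d^2(x_i, x_j)
      eccentricity          xi_i  = 2 * pi_i / sum_j pi_j      (sums to 2)
      standardized ecc.     eps_i = n * xi_i                   (>= 1)
      data density          D_i   = 1 / eps_i
      typicality            tau_i = D_i / sum_j D_j            (sums to 1)
    """
    n = len(X)
    # pairwise squared Euclidean distances via broadcasting
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    pi = sq.sum(axis=1)            # cumulative proximity
    xi = 2.0 * pi / pi.sum()       # eccentricity
    eps = n * xi                   # standardized eccentricity
    D = 1.0 / eps                  # data density
    tau = D / D.sum()              # typicality
    return pi, xi, eps, D, tau


class RecursiveDensity:
    """Sample-by-sample update of the data density without storing past data.

    Keeps only the running mean and the running mean of squared norms; for
    squared Euclidean distance, the standardized eccentricity reduces to
    1 + ||x - mu||^2 / var, so the density of each new sample is available
    in O(d) per update.
    """

    def __init__(self, dim):
        self.k = 0
        self.mu = np.zeros(dim)    # running mean of samples
        self.msq = 0.0             # running mean of ||x||^2

    def update(self, x):
        self.k += 1
        self.mu += (x - self.mu) / self.k
        self.msq += (x @ x - self.msq) / self.k
        var = self.msq - self.mu @ self.mu   # scatter of samples seen so far
        if var <= 0.0:                       # all samples identical so far
            return 1.0
        eps = 1.0 + (x - self.mu) @ (x - self.mu) / var
        return 1.0 / eps                     # data density of the new sample
```

With squared Euclidean distance, the batch and recursive forms agree: the final `update` call returns the same density the batch routine assigns to the last sample, and the densest sample is the one closest to the data mean.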