
EEG-based affective state recognition from human brain signals by using Hjorth-activity

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

EEG-based affective state recognition from human brain signals by using Hjorth-activity. / Mehmood, Raja Majid; Bilal, Muhammad; Vimal, S. et al.
In: Measurement: Journal of the International Measurement Confederation, Vol. 202, 111738, 31.10.2022.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Mehmood, RM, Bilal, M, Vimal, S & Lee, SW 2022, 'EEG-based affective state recognition from human brain signals by using Hjorth-activity', Measurement: Journal of the International Measurement Confederation, vol. 202, 111738. https://doi.org/10.1016/j.measurement.2022.111738

APA

Mehmood, R. M., Bilal, M., Vimal, S., & Lee, S. W. (2022). EEG-based affective state recognition from human brain signals by using Hjorth-activity. Measurement: Journal of the International Measurement Confederation, 202, Article 111738. https://doi.org/10.1016/j.measurement.2022.111738

Vancouver

Mehmood RM, Bilal M, Vimal S, Lee SW. EEG-based affective state recognition from human brain signals by using Hjorth-activity. Measurement: Journal of the International Measurement Confederation. 2022 Oct 31;202:111738. Epub 2022 Aug 23. doi: 10.1016/j.measurement.2022.111738

Author

Mehmood, Raja Majid ; Bilal, Muhammad ; Vimal, S. et al. / EEG-based affective state recognition from human brain signals by using Hjorth-activity. In: Measurement: Journal of the International Measurement Confederation. 2022 ; Vol. 202.

Bibtex

@article{601307cbcc3e4b3fbdaee7e4f667ce9b,
title = "EEG-based affective state recognition from human brain signals by using Hjorth-activity",
abstract = "EEG-based emotion recognition enables investigation of human brain activity, which is recognized as an important factor in brain-computer interfaces. In recent years, several methods have been studied to find optimal features from brain signals. The main limitation of existing studies is that they either consider very few emotion classes or employ a large feature set. To overcome these issues, we propose a novel Hjorth-feature-based emotion recognition model. Unlike other methods, our proposed method explores a wider set of emotion classes in the arousal-valence domain. To reduce the dimension of the feature set, we employ Hjorth parameters (HPs) and analyze them in the frequency domain. At the same time, our study focused on maintaining the accuracy of emotion recognition across four emotional classes. The average accuracy was approximately 69%, 76%, 85%, 59%, and 87% for DEAP, SEED-IV, DREAMER, SELEMO, and ASCERTAIN, respectively. Results show that features from HP activity with a random forest classifier outperform all the classic methods of EEG-based emotion recognition.",
keywords = "Affective state, ASCERTAIN, DEAP, DREAMER, EEG, Emotion recognition, SEED-IV, SELEMO",
author = "Mehmood, {Raja Majid} and Muhammad Bilal and S. Vimal and Lee, {Seong Whan}",
year = "2022",
month = oct,
day = "31",
doi = "10.1016/j.measurement.2022.111738",
language = "English",
volume = "202",
journal = "Measurement: Journal of the International Measurement Confederation",
issn = "0263-2241",
publisher = "Elsevier",
}

RIS

TY - JOUR

T1 - EEG-based affective state recognition from human brain signals by using Hjorth-activity

AU - Mehmood, Raja Majid

AU - Bilal, Muhammad

AU - Vimal, S.

AU - Lee, Seong Whan

PY - 2022/10/31

Y1 - 2022/10/31

N2 - EEG-based emotion recognition enables investigation of human brain activity, which is recognized as an important factor in brain-computer interfaces. In recent years, several methods have been studied to find optimal features from brain signals. The main limitation of existing studies is that they either consider very few emotion classes or employ a large feature set. To overcome these issues, we propose a novel Hjorth-feature-based emotion recognition model. Unlike other methods, our proposed method explores a wider set of emotion classes in the arousal-valence domain. To reduce the dimension of the feature set, we employ Hjorth parameters (HPs) and analyze them in the frequency domain. At the same time, our study focused on maintaining the accuracy of emotion recognition across four emotional classes. The average accuracy was approximately 69%, 76%, 85%, 59%, and 87% for DEAP, SEED-IV, DREAMER, SELEMO, and ASCERTAIN, respectively. Results show that features from HP activity with a random forest classifier outperform all the classic methods of EEG-based emotion recognition.

AB - EEG-based emotion recognition enables investigation of human brain activity, which is recognized as an important factor in brain-computer interfaces. In recent years, several methods have been studied to find optimal features from brain signals. The main limitation of existing studies is that they either consider very few emotion classes or employ a large feature set. To overcome these issues, we propose a novel Hjorth-feature-based emotion recognition model. Unlike other methods, our proposed method explores a wider set of emotion classes in the arousal-valence domain. To reduce the dimension of the feature set, we employ Hjorth parameters (HPs) and analyze them in the frequency domain. At the same time, our study focused on maintaining the accuracy of emotion recognition across four emotional classes. The average accuracy was approximately 69%, 76%, 85%, 59%, and 87% for DEAP, SEED-IV, DREAMER, SELEMO, and ASCERTAIN, respectively. Results show that features from HP activity with a random forest classifier outperform all the classic methods of EEG-based emotion recognition.

KW - Affective state

KW - ASCERTAIN

KW - DEAP

KW - DREAMER

KW - EEG

KW - Emotion recognition

KW - SEED-IV

KW - SELEMO

U2 - 10.1016/j.measurement.2022.111738

DO - 10.1016/j.measurement.2022.111738

M3 - Journal article

AN - SCOPUS:85138441095

VL - 202

JO - Measurement: Journal of the International Measurement Confederation

JF - Measurement: Journal of the International Measurement Confederation

SN - 0263-2241

M1 - 111738

ER -
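
Illustrative sketch

The abstract describes computing Hjorth parameters from EEG signals as a compact feature set and classifying arousal-valence emotion classes with a random forest. The sketch below is a minimal, hypothetical illustration of that kind of pipeline, not the authors' implementation: the sampling rate, frequency bands, channel count, trial length, and the choice to keep only the Hjorth activity term are assumptions made for demonstration, and scikit-learn's RandomForestClassifier stands in for whatever classifier configuration the paper actually used.

# Hypothetical sketch: band-wise Hjorth activity features from EEG trials,
# classified with a random forest. All numeric settings below are assumptions,
# not values taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier

FS = 128  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def hjorth_parameters(x):
    """Return (activity, mobility, complexity) for a 1-D signal x."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)                       # Hjorth activity = signal variance
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def bandpass(x, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def trial_features(trial):
    """trial: (n_channels, n_samples) array -> flat vector of per-band
    Hjorth activity for every channel (activity term only, per the paper's focus)."""
    feats = []
    for channel in trial:
        for low, high in BANDS.values():
            filtered = bandpass(channel, low, high)
            feats.append(hjorth_parameters(filtered)[0])
    return np.asarray(feats)

# Toy usage: random data stands in for EEG trials; labels are the four
# arousal-valence quadrants (0..3).
rng = np.random.default_rng(0)
X = np.stack([trial_features(rng.standard_normal((32, FS * 5))) for _ in range(40)])
y = rng.integers(0, 4, size=40)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))

The point of the sketch is the shape of the feature set: one Hjorth-activity value per channel per frequency band keeps the dimensionality far smaller than raw-signal or spectrogram features, which is the reduction the abstract refers to.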