Accepted author manuscript, 1.24 MB, PDF document
Available under license: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation
AU - Hu, Shenggang
AU - Al-Ani, Jabir Alshehabi
AU - Hughes, Karen D.
AU - Denier, Nicole
AU - Konnikov, Alla
AU - Ding, Lei
AU - Xie, Jinhan
AU - Hu, Yang
AU - Tarafdar, Monideepa
AU - Jiang, Bei
AU - Kong, Linglong
AU - Dai, Hongsheng
PY - 2022/2/18
Y1 - 2022/2/18
N2 - Despite progress towards gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which may selectively attract potential job candidates to apply for a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem as certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. However, it is not always possible to find neutral alternatives for these traits, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on the detection of the presence of gender biases in job advertisements without providing a solution to how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and provides guidance on how the text should be debiased by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, ranging from the development of job advertisements to algorithm-assisted screening of job applications.
AB - Despite progress towards gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which may selectively attract potential job candidates to apply for a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem as certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. However, it is not always possible to find neutral alternatives for these traits, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on the detection of the presence of gender biases in job advertisements without providing a solution to how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and provides guidance on how the text should be debiased by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, ranging from the development of job advertisements to algorithm-assisted screening of job applications.
KW - bias evaluation
KW - bias mitigation
KW - constrained sampling
KW - gender bias
KW - importance sampling
U2 - 10.3389/fdata.2022.805713
DO - 10.3389/fdata.2022.805713
M3 - Journal article
VL - 5
JO - Frontiers in Big Data
JF - Frontiers in Big Data
SN - 2624-909X
M1 - 805713
ER -