
Electronic data

  • Bias_mitigation_in_job_posts

    Accepted author manuscript, 1.24 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

  • fdata-05-805713

    Final published version, 0.98 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Links

Text available via DOI: https://doi.org/10.3389/fdata.2022.805713


Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation. / Hu, Shenggang; Al-Ani, Jabir Alshehabi; Hughes, Karen D. et al.
In: Frontiers in Big Data, Vol. 5, 805713, 18.02.2022.

Harvard

Hu, S, Al-Ani, JA, Hughes, KD, Denier, N, Konnikov, A, Ding, L, Xie, J, Hu, Y, Tarafdar, M, Jiang, B, Kong, L & Dai, H 2022, 'Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation', Frontiers in Big Data, vol. 5, 805713. https://doi.org/10.3389/fdata.2022.805713

APA

Hu, S., Al-Ani, J. A., Hughes, K. D., Denier, N., Konnikov, A., Ding, L., Xie, J., Hu, Y., Tarafdar, M., Jiang, B., Kong, L., & Dai, H. (2022). Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation. Frontiers in Big Data, 5, Article 805713. https://doi.org/10.3389/fdata.2022.805713

Vancouver

Hu S, Al-Ani JA, Hughes KD, Denier N, Konnikov A, Ding L et al. Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation. Frontiers in Big Data. 2022 Feb 18;5:805713. doi: 10.3389/fdata.2022.805713

Author

Hu, Shenggang ; Al-Ani, Jabir Alshehabi ; Hughes, Karen D. et al. / Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation. In: Frontiers in Big Data. 2022 ; Vol. 5.

Bibtex

@article{e929ab807fe74b4a9d87ac71001fc133,
title = "Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation",
abstract = "Despite progress towards gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which may selectively attract potential job candidates to apply for a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem as certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. However, it is not always possible to find neutral alternatives for these traits, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on the detection of the presence of gender biases in job advertisements without providing a solution to how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and provides guidance on how the text should be debiased by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, ranging from the development of job advertisements to algorithm-assisted screening of job applications.",
keywords = "bias evaluation, bias mitigation, constrained sampling, gender bias, importance sampling",
author = "Shenggang Hu and Al-Ani, {Jabir Alshehabi} and Hughes, {Karen D.} and Nicole Denier and Alla Konnikov and Lei Ding and Jinhan Xie and Yang Hu and Monideepa Tarafdar and Bei Jiang and Linglong Kong and Hongsheng Dai",
year = "2022",
month = feb,
day = "18",
doi = "10.3389/fdata.2022.805713",
language = "English",
volume = "5",
pages = "805713",
journal = "Frontiers in Big Data",
issn = "2624-909X",
publisher = "Frontiers Media S.A.",
}

RIS

TY - JOUR

T1 - Balancing Gender Bias in Job Advertisements with Text-Level Bias Mitigation

AU - Hu, Shenggang

AU - Al-Ani, Jabir Alshehabi

AU - Hughes, Karen D.

AU - Denier, Nicole

AU - Konnikov, Alla

AU - Ding, Lei

AU - Xie, Jinhan

AU - Hu, Yang

AU - Tarafdar, Monideepa

AU - Jiang, Bei

AU - Kong, Linglong

AU - Dai, Hongsheng

PY - 2022/2/18

Y1 - 2022/2/18

N2 - Despite progress towards gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which may selectively attract potential job candidates to apply for a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem as certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. However, it is not always possible to find neutral alternatives for these traits, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on the detection of the presence of gender biases in job advertisements without providing a solution to how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and provides guidance on how the text should be debiased by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, ranging from the development of job advertisements to algorithm-assisted screening of job applications.

AB - Despite progress towards gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which may selectively attract potential job candidates to apply for a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem as certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. However, it is not always possible to find neutral alternatives for these traits, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on the detection of the presence of gender biases in job advertisements without providing a solution to how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and provides guidance on how the text should be debiased by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, ranging from the development of job advertisements to algorithm-assisted screening of job applications.

KW - bias evaluation

KW - bias mitigation

KW - constrained sampling

KW - gender bias

KW - importance sampling

U2 - 10.3389/fdata.2022.805713

DO - 10.3389/fdata.2022.805713

M3 - Journal article

VL - 5

JO - Frontiers in Big Data

JF - Frontiers in Big Data

SN - 2624-909X

M1 - 805713

ER -