Mitigating Demographic Bias in AI-based Resume Filtering

With increasing diversity in the labor market and the workforce, employers receive resumes from an increasingly diverse population. However, studies and field experiments have confirmed the presence of bias in the labor market based on gender, race, and ethnicity. Many employers use automated resume screening to narrow the large pool of candidates. Depending on how the screening algorithm is trained, it can exhibit bias toward a particular population by favoring certain socio-linguistic characteristics. Resume writing style and socio-linguistic features are a potential source of bias, as they correlate with protected characteristics such as ethnicity. A biased dataset often translates into biased AI algorithms, and de-biasing techniques are being contemplated. In this work, we study the effects of socio-linguistic bias on resume to job description matching algorithms. We develop a simple technique, called fair-tf-idf, to match resumes with job descriptions in a fair way by mitigating the socio-linguistic bias.
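The abstract does not spell out how fair-tf-idf reweights terms, but the baseline it modifies is standard tf-idf matching. The following is an illustrative sketch of that baseline, assuming plain tf-idf weights and cosine similarity between a tokenized job description and tokenized resumes; all function names and the tokenization are our own, not the paper's.

```python
# Illustrative baseline only: plain tf-idf cosine matching between resumes
# and a job description. The paper's fair-tf-idf variant additionally
# mitigates socio-linguistic bias; that adjustment is not reproduced here.
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Compute a tf-idf weight dict for each tokenized document in docs."""
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vec = {t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf}
        vectors.append(vec)
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_resumes(job_text, resume_texts):
    """Rank resumes by tf-idf cosine similarity to the job description."""
    tokenized = [t.lower().split() for t in resume_texts + [job_text]]
    vecs = tf_idf_vectors(tokenized)
    job_vec = vecs[-1]
    scores = [cosine(job_vec, v) for v in vecs[:-1]]
    return sorted(range(len(resume_texts)), key=lambda i: -scores[i]), scores
```

A fairness-aware variant would alter the term weights (for example, discounting terms that correlate with protected attributes) before computing similarity, which is the general idea the abstract attributes to fair-tf-idf.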

Focus: Bias
Source: UMAP 2020
Readability: Expert
Type: PDF Article
Open Source: No
Keywords: N/A
Learn Tags: AI and Machine Learning; Bias; Employment
Summary: The paper explores how linguistic and social patterns can lead to discrimination against certain demographic groups in algorithms that match resumes to job ads. The authors propose a technique to mitigate this bias.