Can AI prevent gender bias?

Gender bias: the number one enemy of a fair interview. But could AI come to the rescue? New research by Phai Labs, the R&D department of PredictiveHire, suggests that scenario may soon be on the table.

Jasper Spanjaart, April 05, 2022 · Average reading time: 3 min

Since the infamous Amazon case, the AI discussion in recruitment flares up every once in a while. Can algorithms based on artificial intelligence really prevent prejudice? Or do such algorithms only amplify existing bias, as Amazon’s case neatly illustrated? New research, to be presented at the annual conference of the Society for Industrial and Organizational Psychology (SIOP) in Seattle at the end of April, points to a cautiously positive conclusion.

The research, conducted by Phai Labs, the R&D department of Australian AI company PredictiveHire, shows that if you ask candidates not to submit a resume, but instead to answer five work-related questions in writing, gender bias in the final selection is significantly reduced. “I’m happy to share this work on how we can create a fairer playing field for everyone applying for a job”, said Chief Data Scientist Dr. Buddhi Jayatilleke, who led the research.

5 million answers


PredictiveHire has previously pushed back against the skepticism about AI in recruitment. On earlier occasions, the company cited research suggesting that algorithms can actually provide a better experience for candidates: AI can collect feedback and subsequently learn what candidates want, and because algorithms can make assessment criteria more transparent, they help candidates understand recruitment decisions. Moreover, Natural Language Processing makes it possible to give every candidate good, personal feedback, according to that research.

Because algorithms can make assessment criteria more transparent, they help candidates understand recruitment decisions.

With its newest scientific paper, entitled Identifying and Mitigating Gender Bias in Structured Interview Responses, the company addresses a common objection to algorithms: whether they can be gender-neutral. “We’re in an incredibly privileged position at PredictiveHire to be able to do research like this given our large dataset of over 5 million unstructured answers to questions in job interviews – all without any identifying information.”

‘Don’t apply anonymously’

Currently, the company’s proprietary dataset of clean, written responses from job candidates stands at 630 million words. It is expected to reach 1 billion by mid-year, which would make it the largest dataset of its kind. The study shows that when assessments are made using only interview data and a set of well-defined scoring dimensions, there is still a significant reduction in gender bias, even when the responses carry higher levels of gender information.

The recorded effect size on gender is less than 0.1 when scoring algorithms developed by PredictiveHire are used.

According to PredictiveHire, this works better than so-called ‘classic’ methods to combat bias, such as applying anonymously. “Despite the popularity of ‘blind’ resume screening, it’s long been established that gender can be determined from the resume data and still leads to biased hiring by either interviewers or AI”, the report adds. The recorded effect size on gender is less than 0.1 when PredictiveHire’s scoring algorithms are used, the report states.
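For readers unfamiliar with the statistic: an effect size here is a standardized measure of how far the average scores of two groups sit apart, and a value below 0.1 is conventionally treated as negligible. The sketch below is a minimal illustration only, assuming the metric is Cohen's d and using made-up score arrays rather than PredictiveHire's data or method.

```python
import numpy as np

def cohens_d(scores_a, scores_b):
    """Standardized mean difference (Cohen's d) between two groups of scores."""
    a = np.asarray(scores_a, dtype=float)
    b = np.asarray(scores_b, dtype=float)
    # Pooled standard deviation across both groups
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical interview scores for two gender groups (illustrative numbers only)
rng = np.random.default_rng(0)
scores_group_1 = rng.normal(loc=70.0, scale=10.0, size=1000)
scores_group_2 = rng.normal(loc=70.5, scale=10.0, size=1000)

d = cohens_d(scores_group_1, scores_group_2)
print(f"Effect size (Cohen's d): {d:.3f}")  # an absolute value below 0.1 counts as negligible
```

In this reading, a reported effect size under 0.1 would mean the scoring algorithm's average outputs for men and women differ by less than a tenth of a standard deviation.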

