Since the infamous Amazon case, the AI discussion in recruitment flares up every once in a while. Can algorithms based on artificial intelligence really prevent prejudice? Or do such algorithms only amplify existing bias — as Amazon’s case neatly illustrated? New research, to be presented at the annual conference of the Society for Industrial and Organizational Psychology (SIOP) in Seattle at the end of April, appears to yield a cautiously positive conclusion.
The research, conducted by Phai Labs, the R&D department of Australian AI company PredictiveHire, shows that if you ask candidates not to submit a resume but instead to answer 5 work-related questions via text, gender bias is significantly reduced in final selection procedures. “I’m happy to share this work on how we can create a fairer playing field for everyone applying for a job”, said Chief Data Scientist Dr. Buddhi Jayatilleke, who led the research.
5 million answers
PredictiveHire has previously voiced clear opposition to the widespread skepticism about A.I. in recruiting. On previous occasions, the company cited research suggesting that algorithms can actually provide a better experience for candidates: A.I. can collect feedback and subsequently learn what candidates want, and because algorithms can make your assessment criteria more transparent, they can help candidates understand recruitment choices. Moreover, through the use of Natural Language Processing you can give every candidate good, personal feedback, according to their research.
Because algorithms can make your assessment criteria more transparent, that will help candidates understand recruitment choices.
With its newest scientific paper, entitled Identifying and Mitigating Gender Bias in Structured Interview Responses, the company addresses a common objection to algorithms: gender neutrality. “We’re in an incredibly privileged position at PredictiveHire to be able to do research like this given our large dataset of over 5 million unstructured answers to questions in job interviews – all without any identifying information.”
‘Don’t apply anonymously’
Currently, the company’s proprietary set of clean data from candidates’ written responses stands at 630 million words. It is expected to reach 1 billion by mid-year, making it the largest dataset of its kind. The study shows that when assessments are made using only interview data and a set of well-defined scoring dimensions, there is still a significant reduction in gender bias, even when responses carry higher levels of gender information.
The recorded effect size on gender comes out below 0.1 when scoring algorithms developed by PredictiveHire are used.
According to PredictiveHire, this works better than so-called ‘classic’ methods to combat bias, such as applying anonymously. “Despite the popularity of ‘blind’ resume screening, it’s long been established that gender can be determined from the resume data and still leads to biased hiring by either interviewers or AI”, the report adds. The recorded effect size on gender comes out below 0.1 when scoring algorithms developed by PredictiveHire are used, the report states.
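To give a sense of what an effect size below 0.1 means in practice, here is a minimal sketch of Cohen’s d, the standard measure of the standardized difference between two groups’ mean scores. The scores below are fabricated for illustration only; the paper does not publish its raw data, and this is not PredictiveHire’s actual methodology:

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    n_a, n_b = len(group_a), len(group_b)
    # Pooled standard deviation across both groups
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Fabricated interview scores for two gender groups (illustration only)
scores_group_1 = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]
scores_group_2 = [3.2, 2.9, 3.3, 2.9, 3.0, 3.2]

d = cohens_d(scores_group_1, scores_group_2)
print(abs(d) < 0.1)  # |d| below ~0.2 is conventionally considered a negligible effect
```

By convention, a |d| of 0.2 is considered “small” and 0.8 “large”, so a value under 0.1 means the scoring algorithm’s outputs are, on this measure, nearly indistinguishable between genders.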
Read more:
- How Martyn Redstone helped AI specialists find jobs without earning a penny
- What will happen to the regulation landscape of artificial intelligence in recruitment?