From the outside looking in, PredictiveHire’s promises sound almost too good to be true. Founded in Melbourne, Australia, in 2013, the company describes itself as a combined effort of data scientists, engineers, HR professionals, programmers — and rock climbers. It has built its business around the idea of empowering all parties — all for the greater good of fair decisions. “We believe that using data, and ideally actual performance data, is the best way to deliver fairness and better decision-making”, they say.
Phai is the only conversational interview platform with 99% candidate satisfaction feedback.
Their ideas come together in their newest invention: a chatbot called Phai. Rather than you spending countless hours on initial candidate interviews, Phai will do the top-of-funnel interviews for you. According to Phai’s parents, it is the only conversational interview platform with 99% candidate satisfaction feedback. Moreover, the company reports a 95% completion rate.
It’s no surprise that humans are prone to unconscious bias, and that’s what the company wants to tackle with Phai. “When a recruiter first screens a resume, they do so for +/- 6 seconds. So what is it that they are seeking?”, they ask. Their answer to unconscious bias is simple: data. “Only clean data, like the answers to specific job-related questions, can give us a true bias-free outcome.”
While PredictiveHire has been shortlisted for several tech and AI-based awards, there have been some critical notes too. MIT Technology Review writer Karen Hao labelled the hiring firm’s claims ‘misleading’ and ‘troubling’, and argued that its tools deserve greater scrutiny for labour issues beyond discrimination.
“Job hopping, or the threat of job hopping, is one of the main ways that workers are able to increase their income.”
Hao quotes Solon Barocas, an assistant professor at Cornell University and principal researcher at Microsoft Research. Barocas, an expert in algorithmic fairness and accountability, raises a valid point in Hao’s article: the fact that Phai asks job hopping-related questions isn’t a good thing for candidates. “Job hopping, or the threat of job hopping, is one of the main ways that workers are able to increase their income.”
While AI-based systems are designed to eliminate bias, there have been multiple cases where bias has actually crept into algorithms. Amazon stopped using a hiring algorithm after finding out it favoured applicants based on words such as ‘executed’ and ‘captured’, which were far more common in men’s resumes. It proves that even when gender, race or sexual orientation are no longer part of the process, there are still ways for AI systems to discriminate.
The answer may lie in mandated transparency, according to Barocas. “If firms were more forthcoming about their practices and submitted their tools for such validation, it could help hold them accountable”, he says.
At the end of the day, and we’ve got ourselves to thank for this: AI bias may be an easier fix than human bias.
Meanwhile, it’s easy to forget why PredictiveHire came up with Phai in the first place. Just as it is easy to be overly critical of organisations that are genuinely trying to bridge the diversity and inclusion gap in hiring. At the end of the day, and we’ve got ourselves to thank for this: AI bias may be an easier fix than human bias.