Dean Sadler, CEO of TribePad, commenting on new laws to restrict the use of emotion-detecting tech, believes that as long as AI is used as a tool to help humans, we shouldn't get rid of it altogether, especially in recruiting.
Regulation always lags technology significantly, and because AI is an area of growing importance, new laws will naturally follow. As the AI Now Institute said in its report, regulation is of course important where AI makes the final decision. But we shouldn't get rid of emotion-detecting technology altogether. AI should be used to assist humans in making more informed decisions.
The use of AI in recruitment is growing at pace. Whether it's emotional predictions gathered from facial expression analysis or sentiment predictions gathered from the words a candidate speaks, organisations are adopting these tools more and more. The benefits are obvious: better hiring outcomes for the employer and the candidate, and the potential to reduce human bias and discrimination in the interview process.
At TribePad we believe AI can help deliver more information so human recruiters can make better decisions. We design our software to be transparent, so the recruiter understands the information they are being presented with. We show the video stream alongside a text transcript and the emotional analysis graphs, a bit like a musical score, so the recruiter can see the underlying information and make the best decision they can. This helps candidates to be seen in ways a two-dimensional CV cannot, especially as a CV is only a snapshot of someone's work history and doesn't typically tell you what the real person is like.
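To make the "musical score" idea concrete, here is a minimal sketch of how transcript segments and per-frame emotion scores might be aligned on a shared timeline so a recruiter can read them side by side. This is an illustrative assumption, not TribePad's actual implementation; all names and values are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class TranscriptSegment:
    start_s: float   # segment start time in the interview video, in seconds
    end_s: float     # segment end time, in seconds
    text: str        # what the candidate said in this window


@dataclass
class EmotionSample:
    time_s: float            # timestamp of the facial-analysis frame, in seconds
    scores: Dict[str, float]  # e.g. {"joy": 0.7, "surprise": 0.1}


def align_emotions_to_transcript(
    segments: List[TranscriptSegment],
    samples: List[EmotionSample],
) -> List[dict]:
    """Group emotion samples under the transcript segment they fall within,
    so the text and its emotion curve can be displayed together."""
    aligned = []
    for seg in segments:
        in_window = [s.scores for s in samples if seg.start_s <= s.time_s < seg.end_s]
        aligned.append({"text": seg.text, "emotion_track": in_window})
    return aligned


# Example usage with made-up values
segments = [TranscriptSegment(0.0, 4.5, "I led a team of five engineers.")]
samples = [EmotionSample(1.0, {"joy": 0.6}), EmotionSample(3.0, {"joy": 0.8})]
print(align_emotions_to_transcript(segments, samples))
```

The point of a structure like this is that the recruiter always sees the raw transcript next to the emotion signal, rather than a single opaque score, which is the transparency the paragraph above describes.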
We think that AI should never be a 'black box' that takes important decisions out of human hands. For this reason, regulators should look carefully at 'black-box' AI where it has major implications for people's lives.