Innovation

The bias in the machine

Is AI set to comprehensively entrench biases in hiring practices?

01 August 2025

Professor Marthinus van Staden

In 2024, researchers at the University of Washington found that next-generation LLMs used in CV screening favoured white-associated names 85% of the time and female-associated names only 11% of the time, revealing an inherent racial and gender bias in how the models ranked job applications. Another study, published on the Social Science Research Network, found that AI-enabled recruitment systems are “often found to perpetuate or even amplify existing biases”.

Hiring systems can reinforce and deepen discrimination against women, older applicants and people with disabilities. A University of Melbourne study of discrimination by recruitment algorithms found cases in which AI systems were configured to automatically reject certain demographic groups.
