Bots Prove Inadequate as Hiring Managers
Companies are increasingly turning to artificial intelligence tools to streamline the hiring process and mitigate bias. However, according to a report by the BBC, these tools may actually be making hiring worse by inadvertently excluding highly qualified candidates.
An illustrative case involves a job seeker who applied to a tutoring company named Fullmind. After being rejected initially, the applicant resubmitted their resume with a modified birthdate to appear younger. That minor adjustment was enough to land an interview, because the company's AI software had been automatically filtering out older candidates.
The incident prompted the US Equal Employment Opportunity Commission to level allegations of age discrimination against Fullmind.
In another scenario, an AI tool negatively assessed a makeup artist based on her body language, despite her exceptional skills in the craft, as reported by the BBC.
Such recruitment blunders are poised to multiply as more organizations integrate these technologies into their hiring processes.
“We have yet to witness substantial evidence proving the absence of bias in these tools, or their ability to identify the most competent candidates,” remarked Hilke Schellmann, a journalism professor at New York University and author of “Algorithm: How AI Can Hijack Your Career and Steal Your Future,” in an interview with the BBC.
The Present State of Dystopia
This concerning reality is not mere conjecture. Citing a 2023 IBM survey, the BBC reported that 42 percent of companies were already leveraging AI for critical HR functions and recruitment.
For job seekers, this landscape resembles a dystopian nightmare.
Candidates must first tailor their resumes to whatever arbitrary criteria the AI screening tools impose. Even after successfully passing multiple screening stages, they still risk failing a potentially flawed body language assessment.
Moreover, these AI screening tools rely on historical data that may already be tainted with biases and inaccuracies. In one example highlighted by the BBC, an AI model trained on the resumes of a company's male employees ended up excluding female candidates who lacked experience in baseball or basketball.
“A single biased human hiring manager can adversely impact numerous individuals in a year, which is concerning,” Schellmann emphasized. “However, an algorithm deployed across all incoming applications at a large corporation has the potential to harm hundreds of thousands of applicants.”