2/15/2024
Hilke Schellmann, author of "The Algorithm," argues that AI-based tools aren't ready for use in the hiring process. Yet, many large American companies already rely upon them.
Transcript
00:00 My name is Hilke Schellmann, I'm an investigative reporter, and I wrote a book called The Algorithm.
00:04 Over the last six years, I've come to understand that nearly all large companies in the United
00:14 States are using AI-based hiring tools somewhere in the hiring process. And so do job platforms
00:20 like LinkedIn, Monster, Indeed, ZipRecruiter, and others. They're resume screeners, AI games to find
00:27 the right personality trait for jobs, AI-operated social media background checks, and one-way video
00:33 interviews, where candidates get pre-recorded questions on their devices and record themselves
00:37 answering. The vendors of these tools sell a great idea. AI hiring will cost less money and
00:44 make hiring more efficient. AI hiring will find the most qualified candidates for the job and is
00:49 overall less biased than humans making hiring decisions. And indeed, the vendors are right about
00:55 a few things. AI tools speed up the hiring process and save companies a ton of money. And we humans
01:01 are not very good at hiring. People of color, older workers, women, and people with disabilities
01:06 have been underestimated for decades and have gotten fewer chances in the workplace compared
01:10 to white men. So I wanted to know a little bit more about how these tools work and how it feels to
01:17 be a job applicant being asked to use these tools. So I tested them. And one of the tools I tested
01:23 was marketed to companies in the West hiring call center employees in the global South. And one of
01:27 their criteria was how well the candidates speak English. I started by speaking English,
01:32 answering the questions, and I got an 8.5 out of 9, English proficient. I was very proud because
01:39 English is actually my second language. So I was like, yeah, this tool works.
01:43 Then I wanted to push the algorithm a little bit. I read a Wikipedia entry in German
01:49 that had nothing to do with work. And I sent it out to be scored, and I thought I would get an
01:55 error message back. I got an email, and it said that I scored 6 out of 9 and that my English level
02:01 was proficient, which was surprising because I didn't say a word of English from which this tool
02:06 could actually infer how well I speak English. I got in touch with the company, asked them for comment,
02:12 and talked through this, and they kind of didn't know either why I was scored this way. I kept
02:16 asking, like, why was I scored 6 out of 9? And they kept explaining that German and English
02:20 in a 5-D AI space were close together, but they couldn't actually answer how I was scored.
02:26 I talked to a couple of employment lawyers, and they shared with me that many of the
02:31 resume screeners they looked at had biased variables. One tool gave folks who had the
02:36 word "baseball" on their resume more points and folks who had the word "softball" on their resume
02:42 fewer points. The job had nothing to do with sports. In the United States, unfortunately,
02:47 this may be a case of gender discrimination because more men like and play baseball and
02:52 more women like and play softball. A lot of times, resume screeners are trained with resumes from
02:58 people who are currently in the job. So if a company has had biased hiring processes in the
03:03 past and maybe has more men in roles, the AI tool may find there's a lot of people who had the word
03:08 "baseball" on their resumes, and that was a predictor of success. So what does this all mean
03:14 for people looking for jobs today? A lot of the tools I've investigated are not ready for prime
03:20 time and shouldn't be used in high-stakes decision-making, including hiring. I'm not
03:25 saying we should go back to human hiring. What I'm saying is let's build better tools and not
03:31 take the bias that we see in humans, put it into the system, and hide it under technology.
03:37 I want HR managers and folks building these tools to steal my methods and test these tools
03:42 themselves first because we clearly need better tools, better oversight, and regulations.
03:48 [Music]
