AI Interviews Can Be Biased and Just Plain Weird, Questrom Expert Says

Despite that, says Kabrina Chang, job interviews led by robots are here to stay because of their efficiency
Some job applicants find that being interviewed by AI programs rather than humans is just weird. Photo by iStock/gremlin
To depict corporate soullessness cinematically (and comedically) four decades ago, The Survivors had Robin Williams fired from his job by a parrot. Does the 21st-century version involve being interviewed—and perhaps rejected for employment—by a robot?
Nine out of 10 companies now use technology—increasingly, interviews by artificial intelligence (AI) programs—in hiring. The process requires candidates to record themselves answering timed questions and submit the video for algorithmic analysis of their words, speaking tones, and even facial tics. Some applicants learned via email that they didn’t get the job, having never interacted with a human interviewer.

There has been blowback over the dehumanizing experience and over the human programmers’ biases embedded in the algorithms. Amazon in 2018 scrapped a computer program reviewing résumés after it was found to discriminate against women.
On the other hand, Unilever credits its AI technology with producing its most diverse hires ever, says Kabrina Chang (CAS’92), Questrom School of Business associate dean for diversity, equity, and inclusion and a clinical associate professor of business law and ethics. Chang’s employment law class includes sessions on technology in hiring, and she says more than half of her students have experienced such interviews. BU Today asked her whether we have inescapably entered a brave new world of interview-by-automaton and what the repercussions might be.
Q&A
With Kabrina Chang
BU Today: A robot hardly seems the best introduction to a workplace. Why are companies doing this?
Kabrina Chang: A number of reasons, chief among them an easier, less expensive way to comb through thousands of résumés. The companies that I’ve looked at—and from my students’ experience, it’s the same thing—it’s usually your first pass. A company might have a GPA requirement; that will eliminate a lot of people. Then you get 10,000 résumés for a job at Unilever, and Unilever doesn’t have the people power to go through those. They use an AI program to do an initial screen.
You have an interview on your phone, looking at yourself. Then the AI does what it does to weed a bunch of people out. The companies that I’ve looked at ultimately have human beings speak with applicants if you pass it. [But] you could be out without ever speaking to a human being.
BU Today: What do you teach about this in your class? Are you for or against this approach, or in between?
BU Today: Did your other students [who have had AI-interviews] have the same concerns?
Kabrina Chang: None of them loved it. Most of them said, “This is so strange. I’m having a conversation with myself on my phone.” They found that interaction very awkward. There was some worry about bias, but the most common theme was: this is very strange.
BU Today: Aren’t there downsides for companies—they may be missing out on talent because of technological discrimination?
Kabrina Chang: I looked at Unilever, one of the first companies to use this. After using it for a period of time, they said that they had hired their most diverse class ever—gender diversity, racial diversity, ethnic diversity. They had a successful experience. I wonder if [other] companies are using it and then not tracking what happens. It’s more efficient, but they might not be collecting the data about who actually is coming to work as a result of using this program.
You get a few takes [in an AI interview]. If I do my recording, my interview, I get maybe two or three tries. That’s better than a face-to-face interview, because you don’t get to redo that. But it’s also weird, my students tell me, because they are trying to become more mechanical, giving [the AI program] what they think it wants. The retake is great, but it also reinforces this weird, dehumanizing component.
BU Today: If you were head of corporate recruiting, would your take be, we need to do this, but we need good technology like Unilever, and track it?