AI in recruitment – a double-edged sword?

| Text: Fayme Alm

More and more businesses use AI – artificial intelligence – in recruitment. Is this new technology an efficient tool to find the best-suited candidate and to increase the inclusion of marginalised groups? The first study to look into this gave unexpected results. 

What happens when businesses incorporate AI in their recruitment process? So far there has been no empirical research into this. But now the study "Algorithmic evaluation in the recruitment process – does it increase diversity in organizations?" has compared the outcomes of an AI-assisted recruitment process that ran in parallel with a traditional one. It turned out the AI process amplified existing patterns.

Moa Bursell at the Institute for Futures Studies. Photo: Cato Lein

“We already know that discrimination occurs during recruitment. People with foreign-sounding names are often not chosen. We wanted to see whether automating the process might improve the pattern or make it worse, in order to gauge what social consequences this new technology might have for inclusion in the workplace,” Moa Bursell tells the Nordic Labour Journal.  

She is Associate Professor in Sociology and research leader at the Institute for Futures Studies in Stockholm. With support from the Swedish Research Council and in cooperation with Stockholm University, she is studying the use of AI in recruitment processes.

Cooperation between technology and humans

The study was carried out at one of Sweden’s largest food retail companies, which already has a high level of diversity but wants to improve its mix of staff. That is why the company wanted to find out how AI works during recruitment. 

“I mainly wanted to see how the recruitment process handled applications from people with non-European-sounding names, but the study also looked at the number of women and older people – those over 40 – who were called in for an interview,” says Moa Bursell.

The automated process used by the company took care of the actual screening phase. Applicants were asked to fill in a questionnaire with mandatory requirements, a personality test and in some cases also a problem-solving test. Then, an algorithm calculated how well the applicant met the requirements. The resulting names of qualified candidates were collected in a list which was handed to recruitment managers. 
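
To make the screening step more concrete, here is a minimal sketch in Python of the kind of logic the article describes: the mandatory requirements act as a hard filter, and the test results are combined into a single score that decides who reaches the shortlist. All field names, the scoring and the threshold are hypothetical assumptions, not details of the company's actual system.

    # Illustrative sketch only – the field names, weighting and threshold are
    # hypothetical and not taken from the system described in the article.
    from dataclasses import dataclass

    @dataclass
    class Applicant:
        name: str
        meets_mandatory_requirements: bool   # from the questionnaire
        personality_score: float             # from the personality test
        problem_solving_score: float | None  # optional test; None if not taken

    def screen(applicants: list[Applicant], threshold: float = 60.0) -> list[str]:
        """Return the names of applicants who pass the automated screening."""
        shortlist = []
        for a in applicants:
            if not a.meets_mandatory_requirements:
                continue  # mandatory requirements act as a hard filter
            scores = [a.personality_score]
            if a.problem_solving_score is not None:
                scores.append(a.problem_solving_score)
            if sum(scores) / len(scores) >= threshold:  # simple average of available tests
                shortlist.append(a.name)
        return shortlist  # this list is what the recruiting managers would receive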

Disproportional and proportional selection

HR departments list many positive effects as arguments for introducing AI in the recruitment process:

  • Saves time and reduces costs
  • Enhances productivity, certainty and control, as well as fairness and impartiality 
  • Leads to employment based on real merits rather than the recruiter’s gut feeling

But the study shows that using AI in the recruitment process does not necessarily produce the desired results: when the list of AI-selected candidates was handed to recruiters, people with non-European-sounding names were not given the same opportunity to attend interviews as other applicants, including women.

“What was most striking was that these people did not have worse test results compared to other applicants. So they were proportionately represented in the list compiled with the help of AI, while recruiters disproportionately discarded many with non-European-sounding names,” says Moa Bursell.

“This difference did not apply to women applicants. Recruiters invited them to interviews proportionate to how many of them were on the list.”
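
One way to read "proportional" here is to compare a group's share of the AI-generated shortlist with its share of those actually invited to interview. The sketch below illustrates that comparison with made-up numbers; it is not the study's method or data.

    # Hypothetical figures only – not numbers from the study.
    def share(group_count: int, total: int) -> float:
        """Fraction of a candidate pool that belongs to a given group."""
        return group_count / total

    shortlist_total, shortlist_group = 200, 60   # applicants on the AI-generated list
    invited_total, invited_group = 50, 6         # applicants invited to interview

    print(f"Share of group on the AI shortlist:  {share(shortlist_group, shortlist_total):.0%}")  # 30%
    print(f"Share of group invited to interview: {share(invited_group, invited_total):.0%}")      # 12%
    # A gap this large between the two shares is the kind of disproportionate
    # drop-off the study describes for applicants with non-European-sounding names.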

Since Moa Bursell also had access to the results from the traditional recruitment process, she could compare the outcomes of the two.

“We saw that when recruiters were in control of the entire process, diversity was greater than when they only worked from a list that had been compiled automatically. In other words, the recruiters acted in a less inclusive way with a list created by algorithms than when they owned the entire process themselves,” says Moa Bursell.

The study does not identify the reason why the recruiters discarded a disproportionate number of people with non-European-sounding names, as it was only looking at numbers. But Moa Bursell says it is possible that recruiters create diversity in their own way through the traditional process, even if this is not in the most meritocratic manner.

More research needed

The academic debate offers grounds for both optimism and pessimism regarding the possible impact of algorithms, Moa Bursell writes in her report.

She tells the Nordic Labour Journal that research on algorithms and recruitment needs to be expanded since there are more aspects to explore. 

“When you change your recruitment process, it is crucial at some point to find out what happened after the introduction of algorithms and what the results have been,” says Moa Bursell.

She suggests starting by looking at how the new technology was introduced; there is always a context in which it has to function.

It is also necessary to look at what kinds of tests applicants have to take – personality tests or cognitive tests – and whether the test is best suited for the type of staff you are looking for. In this study, the positions were mostly checkout and warehouse jobs.

Moa Bursell points out that the tests themselves might create structural obstacles. If similar tests are used on a large scale and the result is that fewer people with foreign-sounding names are hired, you risk amplifying existing patterns of discrimination.

The same applies if older people do worse on online tests because of weaker "online skills" that are not relevant for the job itself. It is also necessary to examine whether the algorithms actually work and whether they contribute to more or less fairness.

Moa Bursell hopes her own and other research will provide answers to the most pressing questions. 

“When does the change from humans to algorithms lead to better judgements and decisions? When does it have negative effects, and for whom? When we find the answers to these questions we will know which tasks to delegate to the algorithms. And whether to delegate at all,” says Moa Bursell.

The research project runs from 2020 to 2024.
