Why Technology is Limited As a Bias Killer When Hiring

Technology has helped leaders make great strides in managing candidate volume and alleviating unconscious bias when hiring. Still, the human element remains a large part of the final decision-making process. Here’s what executives need to know.

Using technology in the hiring process has helped companies manage high applicant volume and reduce unconscious bias in managers' decision-making. However, humans still ultimately decide whom to hire, which limits what technology can accomplish and creates additional challenges that leaders need to grapple with.

This challenge comes at a time when the hiring process is more automated than ever, according to a 2015 survey from Allegis Group Inc., a group of companies devoted to staffing and recruitment services. In the survey, 71 percent of respondents said the hiring process is more automated than before, and 60 percent said that automation has created unintended consequences.

Additionally, 54 percent of respondents said they've had more trouble judging cultural fit, 46 percent said judging candidates' skills has become harder and 33 percent said automation has slowed their recruiting process.

“Automation is crucial in terms of streamlining processes and handling repetitive, data-intensive tasks, but it fails to interpret a person’s qualities and potential,” said Kelly Van Aken, director of experience and technology solutions at Aerotek, a unit of Allegis Group. Therefore, it’s crucial to have an interactive, personal relationship between hiring manager and job candidate, she said.

But bias in hiring begins long before technology can play a significant role, and technology's influence remains limited as the process moves along, with the final decision often coming down to a manager's gut feeling.

To be sure, tools such as Textio can help people strip overly masculine or feminine language from job descriptions, language that has proven to be a problem in the conventional hiring process. But this isn't a catchall solution, either.

“You can’t just solve a lot of that unconscious bias with technology. There’s definitely those human interactions,” said Fara Rives, director of product development at Allegis Global Solutions. Humans must run these tools and make changes based on suggestions from the technology. Also, recruiters still need to screen first-round candidates, and hiring managers still need to meet with their potential employees before making a job offer.

Slowly but surely, hiring is becoming more objective and fair, Rives said. “I do think [technology is] helping, and I think awareness is a big piece of that.”

Training for Change

Training is a way to build awareness of this problem. Rives said Allegis did unconscious bias training for a financial services client, leading to an 87 percent diverse slate of candidates, up from 36 percent prior to training. And much of this diversity carried through to hires, she said.

Within the hiring process come the nuanced methods of identifying cultural fit. Unconscious biases influence who we think will be successful in a role based on what they look like, said Gabriela Burlacu, human capital management researcher at SAP SuccessFactors, a human capital management software company. If someone succeeded in the role previously, for instance, we expect the next hire to look the same.

Training can help here as well, Burlacu said. Interviewers should understand how to push past biases to identify performance and potential; this doesn’t require a sophisticated technology. A list of competencies and other factors of success in the job should be front and center for those interviewing a candidate. An interview guide based on the role’s required skills can help hiring managers rely less on gut feel, she said.

Nevertheless, bias in hiring is just one potential problem area. Companies can set diversity targets for the candidate pool and bring on a more diverse group of hires than the previous year, but internal barriers remain that push those employees back out, Burlacu said.

Performance management processes come with biases as well, Burlacu said, citing a study from Stanford researchers that found that male and female managers both tend to be bad at managing the performance of women. For instance, developmental feedback for men tends to be more straightforward and linked more closely to business outcomes.


This is an area of people management where technology can help, Burlacu said, but it’s ultimately up to the people involved to make the necessary culture changes.

Technology can shed light on flaws and biases in our management systems, point people to the best hire and help manage that hire effectively, all leading to a more successful, productive workforce.

“People will start realizing that their own errors in human judgment have actually been holding the organization back,” Burlacu said.

Lauren Dixon is an associate editor at Talent Economy. To comment, email editor@talenteconomy.io.