
HR{preneur} Episode 14: Is Your Hiring Process Biased?


The hiring process can be long and costly, which is why some employers are turning to data-driven algorithms that promise to deliver qualified candidates in a fraction of the time, all while eliminating human bias. But what if these algorithms inadvertently reinforce discrimination?


Speaker Info

Kara Murray is the Vice President of Sales Operations for ADP's Small Business Client Channel. Kara has been with ADP for nine years, holding various sales and sales leadership positions during that time. One of her primary goals is to educate our clients on the ever-changing HR landscape and how ADP can help them overcome everyday workplace challenges.

Kristin LaRosa is Senior Counsel for ADP's Small Business Services division. Prior to joining ADP, Kristin worked as an employment lawyer where she represented employers in litigation and provided legal advice and counseling on day-to-day employment and HR matters.

Meryl Gutterman is Counsel for ADP's Small Business Services division. Prior to joining ADP, Meryl worked as an attorney in private practice representing small businesses in employment-related matters.

Full Transcript

Kara Murray: The hiring process can be long and costly, which is why some employers are turning to data-driven algorithms that promise to deliver qualified candidates in a fraction of the time, all while eliminating human bias. But what if these algorithms inadvertently reinforce discrimination in the hiring process? I'm Kara Murray, and this is HR{preneur}, a podcast by ADP. We know you work incredibly hard to support your employees and make your business a success. More than likely, this means you wear lots of hats, and one of those might be HR professional. We're here to help you get the insight you need in order to tackle day-to-day workplace issues. This week I'm joined by Kristin LaRosa and Meryl Gutterman, both of whom work as counsel for ADP's Small Business Services. I also want to thank the ADP Client Appreciation Program for sponsoring today's episode. You can get free payroll by referring ADP; to find out more, talk to your local sales representative.

All right, for those of us who are not familiar, Kristin and Meryl, can you provide a brief overview of the role of algorithms when it comes to hiring?

Kristin LaRosa: Sure. So typically an algorithm will read, process, and analyze applicant data. In many cases, these algorithms look at traditional factors, things like education and job history, but they'll also look at non-traditional factors such as social media activity.

Meryl Gutterman: Also, to help rank candidates, the algorithm will look at everyone who has applied in the past to find common traits among those who were hired. In other words, the algorithm correlates certain traits with a successful hire.

Kristin: Yeah, and the problem is that while these algorithms may seem neutral, they can actually inherit biases from the factors behind previously biased hiring decisions.

Kara: That's interesting. Can you provide an example?

Kristin: Sure. So let's say you have a company that has an opening for a software engineer. And since the software engineering field has traditionally been dominated by men, perhaps the vast majority of the applications that the company has received in the past were from men.

Meryl: Right, and without effective controls in place, the algorithm learns to downgrade characteristics that may be associated with women, such as attending a women's college, for example, because few members of the current workforce have those traits.

Kara: Yeah, I read an article recently that discussed a similar concept. A company had built an algorithm that ended up downgrading certain résumés that included words like "women's club," and instead favored male candidates who used certain keywords.

Kristin: Exactly. So in that case, the algorithm was likely built on biased data in which men applied and were hired more often than women. In other words, for the algorithm, being male equated to being a good hire.
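To make the mechanism described above concrete, here is a minimal sketch in Python (using scikit-learn, with entirely made-up data) of how a screening model trained on historically skewed hiring outcomes can learn to penalize a term like "women's". This is illustrative only, not how any particular vendor's tool works:

```python
# A minimal sketch (hypothetical data, illustrative only) of how a resume-screening
# model trained on historically male-dominated hiring data can learn to penalize
# terms associated with women, such as "women's college".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: mostly men applied and were hired, so phrases
# that appear mainly on women's resumes end up correlated with "not hired".
resumes = [
    "software engineer java ten years experience",        # hired
    "software engineer python chess club captain",        # hired
    "software engineer c++ open source contributor",      # hired
    "software engineer python women's college graduate",  # not hired
    "software engineer java women's coding club member",  # not hired
    "software engineer javascript bootcamp graduate",     # hired
]
hired = [1, 1, 1, 0, 0, 1]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect learned weights: terms with negative weights are effectively downgraded.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
for term in ("women", "java", "python"):
    print(f"{term!r}: {weights.get(term, 0.0):+.2f}")
# The word "women" gets a negative weight purely because of who was hired in
# the past; the model has encoded the historical bias, not job ability.
```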

Meryl: Right, so ZipRecruiter has actually studied the topic of gender bias in job ads, and they found that using words like "support" or "understand," or "aggressive" and "ambitious," actually skews results. They also found that when employers use job postings with gender-neutral phrasing, they receive over 40 percent more responses.

Kristin: Yeah, and it's critical to use inclusive language and try to avoid terms that could be interpreted as excluding a protected class. So, for example, think about terms like "energetic," or "tech savvy," or "recent college grad." These types of terms could be seen as discriminatory toward older candidates. What we recommend instead is listing specific requirements, such as saying you're looking for a candidate who's proficient in HTML, or describing a position as entry level.
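As an illustration of how an employer might screen a draft posting for terms like these, here is a simple sketch; the word lists are only the examples mentioned in this episode, not a vetted or legally authoritative lexicon:

```python
# A simple illustrative job-posting scanner. The word lists below are just the
# examples mentioned in this episode, not an exhaustive or authoritative list.
import re

FLAGGED_TERMS = {
    "gender-coded": ["aggressive", "ambitious", "support", "understand"],
    "age-coded": ["energetic", "tech savvy", "recent college grad"],
}

# Rewrites suggested in the episode: name the skill or the level, not the person.
SUGGESTED_REWRITES = {
    "tech savvy": "proficient in HTML",
    "recent college grad": "entry level",
}

def review_posting(text: str) -> list[str]:
    """Return warnings for potentially exclusionary terms in a job posting."""
    findings = []
    lowered = text.lower()
    for category, terms in FLAGGED_TERMS.items():
        for term in terms:
            if re.search(r"\b" + re.escape(term) + r"\b", lowered):
                hint = SUGGESTED_REWRITES.get(term)
                note = f" (consider: '{hint}')" if hint else ""
                findings.append(f"{category}: '{term}'{note}")
    return findings

ad = "Seeking an energetic, tech savvy recent college grad who is ambitious."
for warning in review_posting(ad):
    print(warning)
```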

Meryl: Absolutely. And in addition to promoting an inclusive workplace, which helps drive productivity and morale, it's also important to make sure that you're following all of the anti-discrimination and harassment laws when you're developing your hiring criteria.

Kara: Okay. So given the potential shortcomings of algorithms, how do companies tackle bias in hiring?

Meryl: Well, that's a great question. I would say that it's complicated and will depend on leveraging the right recruiting solutions for your company. It's also important to educate your hiring managers to make sure they're avoiding potential biases.

Kristin: That's right. And employers need to recognize that what may appear on the surface to be a neutral policy could actually disproportionately impact a protected group. So, for example, requiring certain physical abilities or a particular dress code could inadvertently discriminate against applicants with disabilities or applicants whose religious beliefs require them to wear certain garb. So you really want to look closely at whether your screening criteria effectively target job-related factors only.

Kara: And what if the employer doesn't realize they're inadvertently discriminating against certain applicants?

Kristin: Right, so that's what we call unconscious bias, which may occur when employers make decisions based on an unknown or an unconscious stereotype. And it's believed that these biases are typically formed over a lifetime as we look for shortcuts to process and sort information in our minds.

Kara: That's interesting. Do you have an example of how this has played out in the hiring process?

Kristin: I do. There's actually a well-known study on the issue. It looked at symphony orchestras in the US, which had been predominantly made up of men, and found that after orchestras began holding blind auditions, in which the identities of the candidates were withheld, the proportion of women who were ultimately hired increased dramatically.

Meryl: Yeah, and one of the theories for why this happened was that the judges' prior experience was primarily with male performers, so they unconsciously associated symphony orchestra musicians with being male more than female, which then impacted decisions to hire women.

Kara: This is a perfect continuation of our discussion around gender-biased keywords in job ads. If you're not self-aware, though, how can you tackle unconscious bias?

Meryl: I think it's important to be aware of any biases that could influence your hiring decisions, such as associating a particular job with a certain gender. Make sure your job ads use gender-neutral phrasing, ask candidates the same set of interview questions, and focus only on those factors that have a direct impact on performance.

Kristin: Yeah, and going back to that study we referenced earlier, you want to look for ways to introduce "blind auditions" into your hiring process. For example, you can make it a policy to remove names from résumés and applications before giving them to the person who decides whom to call in for interviews. This can help reduce both explicit and implicit discrimination, since no one can act on a bias when the individual's identity is unknown.
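For illustration, a "blind" intake step could look something like the sketch below. The field names are hypothetical, and a real applicant-tracking system would handle this differently:

```python
# A minimal sketch of "blind" resume intake: strip the candidate's name and
# other identifying fields before the screener sees the application.
# Field names here are hypothetical.
import uuid

def blind_application(application: dict) -> dict:
    """Replace identifying fields with an anonymous ID for initial screening."""
    identifying = {"name", "email", "phone", "photo_url"}
    blinded = {k: v for k, v in application.items() if k not in identifying}
    blinded["candidate_id"] = str(uuid.uuid4())  # key back to identity later
    return blinded

application = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "education": "B.S. Computer Science",
    "experience": "5 years software engineering",
    "skills": ["HTML", "Python"],
}
print(blind_application(application))
```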

Kara: Okay. That's good advice. What are some additional steps that employers can take to help guard against discrimination by those making the hiring decisions?

Meryl: I think it's important to include a diverse group of people in the hiring process and to clearly define your screening criteria. Then, once you have those criteria, make sure you're applying them consistently to all candidates applying for a position.

Kristin: Yeah, and I think you also want to avoid relying solely on an algorithm to screen applicants. If you use technology to help screen and select candidates, make sure the data it uses comes from a diverse pool and that it's accurate, job-related, and free from any unintentional bias.

Meryl: Right, so if information from your current workforce is the only data you're using, relying on an algorithmic tool may create a barrier for groups that aren't already well-represented in your workforce.

Kara: And how can employers ensure algorithms are valid?

Meryl: That's a great question, and it's something that's difficult to do. One thing you can do is check the EEOC's website, which has guidelines to help employers assess employee selection procedures; those guidelines can serve as a starting point for assessing the validity of your algorithms. You can look at objective characteristics that have been shown to predict successful job performance in the past to see if your algorithms are on track, and you should regularly review your selection process to see whether it's screening out a protected group. If it is, look for another approach that could be equally effective at predicting job performance without excluding that protected class.
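One concrete check drawn from the EEOC's Uniform Guidelines on Employee Selection Procedures is the "four-fifths" rule of thumb: if one group's selection rate is less than 80 percent of the highest group's rate, that is generally treated as evidence of adverse impact. Here is a sketch of that arithmetic with made-up numbers:

```python
# A sketch of the "four-fifths" (80%) rule of thumb from the EEOC's Uniform
# Guidelines on Employee Selection Procedures. The numbers below are made up
# purely for illustration.

def selection_rates(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """groups maps group name -> (number selected, number of applicants)."""
    return {name: sel / apps for name, (sel, apps) in groups.items()}

def adverse_impact(groups: dict[str, tuple[int, int]]) -> list[str]:
    """Flag groups whose selection rate is below 80% of the highest rate."""
    rates = selection_rates(groups)
    top = max(rates.values())
    return [
        f"{name}: rate {rate:.0%} is {rate / top:.0%} of the highest rate"
        for name, rate in rates.items()
        if rate / top < 0.8
    ]

# Hypothetical screening results from an algorithmic tool:
results = {"men": (48, 120), "women": (15, 80)}
for flag in adverse_impact(results):
    print("Potential adverse impact:", flag)
# women: ~19% selection rate vs men's 40%; the ratio (~47%) is well below 80%.
```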

Kristin: Yeah, and I would add here that you also want to train your decision makers on how to use the results effectively and make sure that you're closely monitoring hiring decisions to spot any potential issues early.

Kara: Okay. Well, I think that's about all the time we have for today, so do you guys have any additional advice for our listeners?

Kristin: Sure. And I know we kind of touched on this throughout the conversation, but I would say that we want employers to be mindful that while the use of technology can help improve the hiring process, it's really important to exercise caution to help prevent any kind of unintentional discrimination.

Meryl: Absolutely. And if you do decide to leverage algorithms in your recruiting process, then make sure you're gathering multiple perspectives, make sure you're using reliable data, and make sure that any rules that you use to weed out candidates are valid.

Kara: Great. Well, thank you so much, Kristin and Meryl. We want to thank you all for listening to HR{preneur}. I'm Kara Murray. For all the latest episodes, subscribe on Apple Podcasts, Google Play, or wherever you listen to podcasts.

Podcast Overview

HR{preneur}, a podcast by ADP's Small Business Services, is designed to help you get the insight you need in order to tackle day-to-day workplace issues. In each episode, you'll hear from industry experts about the latest in HR, such as the #MeToo movement, evolving marijuana laws, and more. Each episode will be between 10 and 15 minutes long, but full of practical advice. Find us on Apple® Podcasts or visit the HR{preneur} podcast page on Podbean.