Human resources is about humans, and so far artificial intelligence (AI) has proven fallible in many attempts to mimic recruiting and hiring practices, from Amazon scrapping a secret recruiting tool that showed bias against women to HireVue eliminating facial expression monitoring from its assessment tool. There are arguments on both sides as to whether AI is helpful or harmful in sourcing and screening diverse candidates.
But regardless of whether you use AI, it’s still important to identify where your diversity efforts fall short—whether it’s during the sourcing process or during the screening process.
Do You Have a Diversity Sourcing Problem?
One way to tell if you have a diversity sourcing problem is to review the number of diverse applicants for a given job opening. Look at race, gender, age, veteran status, and disability, and, if possible, make sure your candidate pool includes at least two candidates from underrepresented groups.
- If your candidate pool doesn’t show diversity, review your job description to make sure it’s inclusive.
- Also review where you’ve posted your job opening and consider whether you could diversify where you post.
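The pool review above can be sketched as a simple tally. This is a minimal illustration only: the field names, group labels, and the two-candidate threshold are assumptions drawn from the guideline above, not a standard taxonomy or tool.

```python
from collections import Counter

def pool_counts(candidates, attribute):
    """Tally candidates in the pool by a single demographic attribute."""
    return Counter(c[attribute] for c in candidates)

def meets_slate_minimum(candidates, attribute, group, minimum=2):
    """Check the guideline above: at least two candidates from the group."""
    return pool_counts(candidates, attribute)[group] >= minimum

# Hypothetical candidate records for illustration only.
candidates = [
    {"name": "A", "gender": "woman", "veteran": False},
    {"name": "B", "gender": "man", "veteran": True},
    {"name": "C", "gender": "man", "veteran": False},
    {"name": "D", "gender": "woman", "veteran": False},
]

print(dict(pool_counts(candidates, "gender")))            # {'woman': 2, 'man': 2}
print(meets_slate_minimum(candidates, "gender", "woman"))  # True
```

Running a check like this per opening, before interviews begin, makes a gap visible while there is still time to re-source rather than after offers go out.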
Some companies have instituted a diverse slate hiring policy, a concept popularized in 2003 by the National Football League (NFL) when it introduced the Rooney Rule, and again by Facebook in 2015. This policy essentially requires recruiters or hiring managers to recruit from a diverse pool of applicants. Facebook said, “The more people you interview who don’t look or think like you, the more likely you are to hire someone from a diverse background.”
With this policy in place, the NFL increased the percentage of Black head coaches from 6% to 22% in just four years. However, the diverse slate hiring policy works only when there is more than one diverse applicant in the pipeline, and it addresses only the beginning of the employee lifecycle.
Do You Have a Diversity Screening Problem?
Even if you source from diverse pools, you could still have a diversity screening problem. The screening problem could be caused by AI looking for keywords or phrases in a resume or identifying facial movements, tone, or voice inflection in video interviews.
AI and machine learning models are only as effective as the data sets they are trained on. They can improve over time, but they are also built by humans who may carry their own implicit, unconscious biases.
Ensuring your recruiting team and hiring managers are properly trained to recognize these biases will be key to overcoming these challenges.
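One concrete, tool-agnostic way to detect a screening problem is to compare pass rates across groups at each screening step. A widely used benchmark, the EEOC's four-fifths rule, flags a group whose selection rate falls below 80% of the highest group's rate. The sketch below uses made-up numbers and illustrative group names; it is a starting point for an audit, not a legal determination.

```python
def selection_rates(outcomes):
    """outcomes maps group -> (passed_screen, total_applicants); returns pass rates."""
    return {g: passed / total for g, (passed, total) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Compare each group's pass rate to the highest-rate group.

    A ratio below 0.8 is the conventional four-fifths-rule flag for
    possible adverse impact at this screening step.
    """
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Illustrative numbers only: 40 of 100 pass vs. 18 of 60 pass.
screen_outcomes = {"group_a": (40, 100), "group_b": (18, 60)}
ratios = adverse_impact_ratios(screen_outcomes)
flagged = [g for g, r in ratios.items() if r < 0.8]
print(flagged)  # group_b's 0.30 rate is 75% of group_a's 0.40, so it is flagged
```

Running this check on each stage of the funnel, including any AI-scored stage, helps locate exactly where diverse candidates are dropping out.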
Diversity Only Works If You Have Inclusion
From my Forbes article, “Diversity only works if you have inclusion. It’s one thing to hire for diversity, but it doesn’t really matter if those people don’t have a seat at the table to make or influence important business decisions. Inclusion starts at the leadership level and must be intentional at encouraging full participation amongst your entire team as well as inviting in external teams when needed. Different perspectives can’t hurt a company, but not listening to them can.”
Diversity sourcing and screening problems can be solved with technology, training, and intention. For example, when Google first started recruiting employees, it recruited only from 75 top-tier colleges and universities, but has since broadened that pool to more than 305 schools.
Well-intentioned teams can still fall short of their diversity hiring goals. To improve your practices, here are a few tips:
- Always be intentional about recruiting diverse candidates.
- Review your job descriptions to ensure they are inclusive.
- Review where you post your job openings and be sure to include diverse sources.
- Train hiring managers on how to eliminate unconscious bias.
- Question any AI tools you currently use, and when selecting one, inquire about how the tech works. Was it preloaded or trained on resumes from only one “typical” kind of candidate?