Can Machines Support Diversity and Eliminate Bias?

David Rice
10/01/2020

AI in Talent

It seems like every major company has had to look at itself in the last four months and ask genuine, meaningful questions about its efforts to address diversity. Some have found the answers unsettling and are taking immediate action to try to remedy a lack of diversity in the hiring process and in how they identify and develop employees.

What that looks like differs across industries and companies. For some, leaders are undergoing unconscious bias training in an attempt to understand how bias has crept into their processes. For others, particularly larger organizations and recruiters, it is more common to turn to technology to find the best talent.

While AI can take on traits of its human creators, bias is not inherent in AI the way it is in humans. That holds promise and could even help highlight and address areas where unconscious bias has crept into hiring processes. But at the end of the day, humans make the decisions and have to hold themselves accountable for creating diversity and equity in the workforce. It’s important to note that, like so many aspects of our lives, this problem can’t be solved with a “one or the other” mindset; it calls for a balanced approach.

“Artificial intelligence can be a lot of things, but it’s also important to note what AI isn’t,” says Jan Van der Hoop, President of Fit First Technologies. “AI isn’t a catchall solution to every problem an organization has and shouldn’t be treated as such.”

The Recruiting Drive

A diverse talent pool and employee base have been shown to deliver business benefits in numerous analyses in recent years. AI platforms boast a variety of benefits, whether that’s revamping hiring processes, identifying talent gaps, building training programs or development pipelines, executing performance management strategies, or monitoring internal mobility.

But it’s the talent acquisition space that garners the most attention, and perhaps with good reason. As companies ramp up hiring in the new normal, diverse candidates are becoming an increasingly hot commodity, and AI is becoming a more common way to help companies find those candidates.

Recruiting, therefore, is one of the areas driving the most significant AI platform growth at the moment.

When it comes to recruiting, AI has shown benefits in reducing time-to-hire, time-to-interview, and cost per hire, while improving candidate quality, according to an article from Louis Columbus for Forbes.

AI has the potential to help overcome another hurdle many in the workforce encounter: the experience barrier. In a crowded job seekers’ market, finding someone with an extensive resume isn’t as difficult as it was pre-pandemic, but that doesn’t necessarily mean they are a good fit behaviorally, culturally, or in terms of skill.

READ: Recruiting Automation is Evolving to Meet Talent Acquisition Needs

While AI can help match behaviors to identify candidates who best fit the role and who will feed off of and enhance the organization’s culture, it can also help in assessing skills. It can learn the same way from entry-level employees as it does from executives, or from someone who’s been in the workforce for three years as from someone who’s been in it for twenty.

“Obviously if you’re a higher level employee, there will be more information about you because you’ve been in the workforce longer and have had a longer tenure than others,” Eric Storm, VP of North America for AI company Starmind, said. “But if you’ve been there a year and had some internships beforehand, AI can learn from it the same way.”

Hurdles for AI to Overcome

There are some issues with AI being the key to unlocking a truly diverse workforce, however, none more pressing than the diversity of the talent that builds AI. For an industry expected to grow by nearly $200 billion over the next five years, this matters, as it is human beings who drive the growth and adoption of AI.

According to an article from the World Economic Forum, the technologists developing AI are not nearly diverse enough. Tech companies focused on AI development have to mitigate bias in the future by ensuring that the people developing the technology today don’t perpetuate bias in the algorithms that make AI work.

Another factor is what we feed AI. It’s vital that AI vendors help companies that have already adopted AI in recruiting and talent management processes be more conscious of the candidates they give AI to analyze. One best practice for recruiters using AI is to remain vigilant about diversifying sourcing pools. Numerous job boards and scientifically validated assessment tools can be deployed to improve the quality of candidates while reducing the risk of bias entering the process.

Additionally, employers may want to eliminate biased language from the job descriptions that applicants read. Language is often a predictor of what kind of candidate you’ll get. Just as the term “digital native” can deter an older candidate from applying, words like “fearless” or “driven” can attract more male candidates, whereas “transparent” and “team player” may draw in more female candidates.
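For teams that want to audit a posting before it goes live, a simple keyword scan can make this concrete. The Python sketch below is a minimal illustration under the assumption that you maintain your own term lists; the words shown, the flag_terms function, and the categories are hypothetical examples for this article, not a validated lexicon or any vendor’s product.

```python
# Illustrative sketch: flag potentially coded wording in a job description.
# The term lists below are examples only, not a researched or validated lexicon.

MASCULINE_CODED = {"fearless", "driven", "dominant", "competitive"}
FEMININE_CODED = {"transparent", "team player", "supportive", "collaborative"}
AGE_CODED = {"digital native", "recent graduate"}


def flag_terms(job_description: str) -> dict[str, list[str]]:
    """Return which flagged terms appear in the job description text."""
    text = job_description.lower()
    return {
        "masculine_coded": [t for t in MASCULINE_CODED if t in text],
        "feminine_coded": [t for t in FEMININE_CODED if t in text],
        "age_coded": [t for t in AGE_CODED if t in text],
    }


if __name__ == "__main__":
    posting = "We want a fearless, driven digital native who is also a team player."
    for category, hits in flag_terms(posting).items():
        print(f"{category}: {hits}")
```

A production version of this idea would draw its terms from published research on gendered job-ad language, suggest neutral alternatives rather than just flagging words, and be reviewed regularly so the lists themselves don’t encode new bias.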

And finally, if a diverse workforce is what you seek, don’t forget the other piece of the D&I puzzle: inclusion. In job postings and interviews, the term “cultural fit” implies the person fitting into the existing culture or identity of the business. If the company is looking to become more diverse, this doesn’t make much sense. Instead, change the language to reflect the need for cultural additions, ensuring you’re getting people who add something new to the mix, be it in terms of background or skillset.

