Can new AI recruitment tools help companies increase diversity?
Hiring and recruitment startups show promise, but could face implicit bias issues of their own.
Artificial intelligence is making hiring easier for companies by automating tasks that once took lots of time, like sifting through résumés. According to a recent study by Korn Ferry, recruiters say AI helps them find quality candidates and frees them up to focus on the more human aspects of their jobs. Increasingly, the technology is also being adopted to help companies overcome implicit bias in their recruitment.
Diversity has become a priority for many U.S. companies, with studies like McKinsey & Company’s recent research finding that gender and ethnic diversity in the workplace leads to higher profits. One of the reasons often listed for the lack of diversity in major corporations is unconscious bias, an ingrained human trait that can affect hiring at any stage of the recruitment process.
Many AI startups are stepping up to take on the challenge of reducing the many factors that may unconsciously cloud a recruiter’s judgment, starting with the pool of candidates who apply or are sought out for a position.
One such company is Textio, which analyzes company job descriptions, flagging issues such as excessive jargon or language that reads as too masculine or feminine, and suggests alternative wording. Another company, Mya, uses a chatbot to interview candidates objectively, asking performance-based questions. TalVista applies machine learning across the hiring process, from optimizing job descriptions to redacting demographic information from résumés.
Another company, Blendoor, takes a multi-pronged approach to helping companies mitigate unconscious bias: it maintains a database of qualified candidates, anonymizes a company's existing applicant pool, and provides insights that help companies pinpoint where bias may be entering the process. Blendoor's founder and CEO, Stephanie Lampkin, is an African-American LGBTQ woman with an engineering degree from Stanford and an MBA from MIT who learned to code at age 13. She founded Blendoor in 2016 to take on the implicit bias issue in hiring from all angles.
“Our team is building a platform to help companies elucidate precisely where bias happens, whether due to an individual, a software tool, system, or process,” says Lampkin.
There's a lot of excitement about what AI can do to help companies diversify their teams, but we've already seen ways that technology can mimic society's worst biases.
“There are already many studies showing that AI can discriminate,” says Safiya Noble, author of the 2018 book, Algorithms of Oppression: How Search Engines Reinforce Racism. “I don't think many software engineers are sufficiently prepared to build AI that mitigates discrimination. If an engineer is building an ideal profile of a job candidate using historical or demographic information about the type of person who has typically held that job, we can see how this data would only exacerbate or extend discrimination,” Noble says.
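To make Noble's point concrete, here is a deliberately simplified toy sketch (not any vendor's actual system; the data and function names are invented for illustration). A model that "learns the profile" of past hires from a biased historical record ends up recommending equally qualified candidates differently, purely because of the group signal in its training data.

```python
# Toy illustration of bias inherited from historical hiring data.
# All data is synthetic and the "model" is a simple rate lookup.
from collections import defaultdict

# Synthetic history: (qualification_score, group, was_hired).
# Groups A and B have identical qualifications, but past recruiters
# hired group A far more often -- a biased historical record.
history = [
    (9, "A", True), (9, "B", False),
    (8, "A", True), (8, "B", False),
    (7, "A", True), (7, "B", True),
    (6, "A", False), (6, "B", False),
]

def fit_hire_rates(records):
    """'Train' by memorizing the historical hire rate for each group."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for _score, group, was_hired in records:
        total[group] += 1
        hired[group] += int(was_hired)
    return {g: hired[g] / total[g] for g in total}

def recommend(rates, group, threshold=0.5):
    """Recommend a candidate if their group's historical rate clears the bar."""
    return rates[group] >= threshold

rates = fit_hire_rates(history)
print(rates)                    # {'A': 0.75, 'B': 0.25}

# Two equally qualified candidates, two different outcomes:
print(recommend(rates, "A"))    # True
print(recommend(rates, "B"))    # False
```

The disparity in the output comes entirely from the training data, not from the candidates' qualifications, which is exactly the failure mode Noble describes.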
Lampkin acknowledges AI’s issues but remains convinced that it can make a positive difference. “Where most people see problems and focus their energies on delineating the problem, I focus on solutions,” she explains. Blendoor is working with historically black colleges and universities, boot camps, and other organizations to source more diverse talent, collecting data beyond the résumé, such as working style, historical performance, and employer feedback.
For the industry at large, Noble has a radical idea: “At a minimum, the way we teach and think about software engineering has to be completely overhauled, where students deeply study the social sciences and humanities as they learn how to code, and they iterate their projects and code against social realities. This is not the way engineering is currently taught, so we have much work to do.”