Can a computer be prejudiced?
Algorithms can help uncover bias, but aren’t perfect either.
We all like to think of ourselves as objective, rational thinkers, including when we’re tasked with hiring for a company or making a product that treats all of its users equally. We carefully scan through 50 or so resumes, interview the best candidates and pick the perfect person for the role. We help build a new website with fine-tuned personalization that easily adapts to each user’s gender, age and culture... or does it?
Well, maybe not. Humans are wired to be biased -- and all of the diversity training and well-meaning statements about non-discriminatory practices don’t erase basic human preferences. AI is being used to help weed out potential prejudices at companies and in code, but even algorithms aren’t perfect.
A study from the University of Pennsylvania’s Wharton School about hiring for jobs in STEM fields showed that women and minority candidates with 4.0 GPAs were treated the same as white male candidates with 3.75 GPAs. That may seem like a small numerical gap, but it can easily be the difference between getting hired and not.
“Human bias is what made us who we are, and what made us evolve to the top of the food chain,” explains Julien Barbieri, co-founder and CEO of Holberton School, a project-based academy for software engineers.
While technology is supposed to be more objective and rational than humans, code can also be skewed by the thinking of its programmers. “With every company becoming a tech company,” said Barbieri, “every product is now at scale, serving -- or supposed to serve -- everyone on the planet.” With more products aiming to serve large communities, it’s even more important to fix human bias in hiring -- and in the tech workforce -- today than it was 10 years ago.
The Holberton School is working to help eliminate bias. Applicants to the alternative college don’t have to be high school graduates or experienced programmers. Holberton also uses an automated admissions system that doesn’t consider race, age, zip code, gender or many of the other factors that can cause an admissions bias.
A study about race and hiring found “minority applicants who ‘whitened’ their resumes were more than twice as likely to receive calls for interviews, and it did not matter whether the organization claimed to value diversity or not,” explained Paul Rubenstein, Chief People Officer at Visier, a workplace planning platform.
A report by Visier, “The Truth About Ageism in the Tech Industry,” identified systemic bias in the tech industry against hiring workers older than 45. “What is missed -- and what our study uncovered -- is that older workers are more productive and more loyal, which makes their work output higher quality,” he said.
Bias-detecting technology can help identify things that may not be obvious, such as gendered language in job ads. “There are plenty of seemingly random [word] examples like ‘exhaustive, enforcement, and fearless’ that are statistically proven to skew your talent pool toward men,” explained Charna Parkey, an applied scientist at the augmented writing platform Textio. “Using these words doesn’t show ill intention, but with data you can find the language patterns that attract different people to your open roles.”
Every job posting or recruiting email that Textio analyzes is assigned a bias score according to the presence or absence of gendered language patterns in the post. “Because the patterns are statistical and quantitative, they [can] predict the gender of the person eventually hired,” said Parkey.
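The idea of scoring a posting by the gendered language it contains can be sketched in a few lines. This is a toy illustration only -- the word lists and the scoring formula below are invented for the example, not Textio’s actual statistical model, which is derived from hiring-outcome data.

```python
# Toy gendered-language scorer. The word lists and formula are
# illustrative assumptions, NOT Textio's real model.
MASCULINE_CODED = {"exhaustive", "enforcement", "fearless", "dominant"}
FEMININE_CODED = {"collaborative", "supportive", "interpersonal"}

def bias_score(ad_text: str) -> float:
    """Return a score in [-1, 1]: negative skews masculine, positive feminine."""
    words = [w.strip(".,;:") for w in ad_text.lower().split()]
    m = sum(w in MASCULINE_CODED for w in words)
    f = sum(w in FEMININE_CODED for w in words)
    if m + f == 0:
        return 0.0  # no coded language detected
    return (f - m) / (m + f)

print(bias_score("We need a fearless engineer for exhaustive testing"))  # -1.0
```

A real system would learn which words correlate with who ultimately gets hired, rather than relying on a fixed list.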
Simply adjusting the language of a job ad can, in turn, affect a company’s makeup. “You’re much more likely to hire a woman into a tech role if your pipeline has several women to consider,” she said.
The startup Tilr uses an AI algorithm to reduce hiring discrimination by blindly matching qualified workers with companies’ immediate placement needs -- no names, gender or race, just skills. “Our technology considers 1) the person’s skill set, 2) the person’s availability, 3) the ideal locations and travel distance and 4) their preferences,” said Carisa Miklusak, Tilr’s CEO. “When jobs are posted by employers, the technology considers the requirements and sends them to the job seekers that match.”
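The blind-matching idea Miklusak describes can be sketched roughly as below. The field names and matching rule are assumptions for illustration, not Tilr’s actual system; the point is structural -- identity attributes simply aren’t part of the data the matcher sees.

```python
# Toy "blind" matcher in the spirit of Tilr's description.
# Fields and logic are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Worker:
    skills: set          # e.g. {"python", "sql"}
    available: bool
    max_distance_km: float

@dataclass
class Job:
    required_skills: set
    distance_km: float

def matches(worker: Worker, job: Job) -> bool:
    # No name, gender or race fields exist, so they cannot be considered.
    return (worker.available
            and job.required_skills <= worker.skills
            and job.distance_km <= worker.max_distance_km)

w = Worker(skills={"python", "sql"}, available=True, max_distance_km=30)
j = Job(required_skills={"sql"}, distance_km=12)
print(matches(w, j))  # True
```

Excluding the biased signals at the data-model level, rather than asking reviewers to ignore them, is what makes the matching “blind.”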
Rubenstein sees the promise in AI, despite the challenges. He says AI should not be taken out of the hiring process. “The melding of neuroscience and AI has produced some positive results in terms of helping organizations level the playing field,” he concludes. “When it comes to fairly assessing job candidates, humans could use quite a bit of help.”