New York City businesses that use artificial intelligence to help find hires now have to show the process was free from sexism and racism.
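As I understand it, the audits the law requires boil down to comparing selection rates across demographic groups. Here's a rough sketch of the basic impact-ratio math (in Python, with completely made-up numbers-- this is just the shape of the calculation, not anyone's actual audit):

```python
# Sketch of an "impact ratio" calculation, the kind of metric
# disparate-impact audits are built around. All numbers are invented.
def impact_ratios(selected, total):
    """selected/total applicants per group -> each group's selection rate
    divided by the highest group's rate (1.0 = parity)."""
    rates = {g: selected[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: rates[g] / best for g in rates}

# Hypothetical screening results by sex:
print(impact_ratios(selected={"men": 120, "women": 80},
                    total={"men": 400, "women": 400}))
# -> {'men': 1.0, 'women': 0.666...}
# women selected at ~2/3 the men's rate, well under the
# traditional "four-fifths" rule of thumb for adverse impact
```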
I remember hearing about a high-profile case (Amazon's experimental resume-screening tool, IIRC) where the AI would dock points if someone's resume listed women's sports as an extracurricular, while rating resumes that mentioned men's sports more highly.
Bias also doesn't have to arise by happenstance. Unfortunately, humans tend to have unconscious (or, sometimes, not-so-unconscious) biases against women and people of color. In one well-known study, researchers sent identical resumes to a group of recruiters-- except that half of the resumes carried a male name and half a female name.
Both male and female recruiters rated the resumes with the male name higher and were more likely to recommend advancing those candidates to the next round of interviews. IIRC, similar studies have found the same effect when you give otherwise-identical resumes a "Black-sounding" name versus a "white-sounding" name.
So if you train an AI on your own company's hiring data-- which is likely to be tainted by the unconscious biases of your own recruiters and hiring managers-- the AI can pick up on those patterns and replicate them in its results.
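To make that concrete, here's a minimal sketch (Python with scikit-learn, on synthetic data I made up-- nothing here is from a real company) of how a model trained on biased historical decisions ends up reproducing the bias:

```python
# Hypothetical sketch: candidates are equally skilled across groups,
# but past human decisions docked one group. A model fit to those
# decisions learns the docking as if it were a real signal.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Candidate quality is identical across groups by construction.
skill = rng.normal(size=n)
is_female = rng.integers(0, 2, size=n)  # 1 = female, 0 = male

# Biased historical labels: same skill bar, but female candidates were
# effectively docked half a point by past human reviewers.
hired = (skill - 0.5 * is_female + rng.normal(scale=0.5, size=n)) > 0

# Train on the biased outcomes. Group membership is included directly
# here for clarity; in practice it often leaks in via proxies anyway.
X = np.column_stack([skill, is_female])
model = LogisticRegression().fit(X, hired)

print("coef on skill:     %+.2f" % model.coef_[0][0])
print("coef on is_female: %+.2f" % model.coef_[0][1])  # negative: learned bias

# Two equally skilled candidates get different predicted hire probabilities:
print(model.predict_proba([[1.0, 0], [1.0, 1]])[:, 1])
```

The negative weight on the group feature isn't something anyone programmed in; it's just the cheapest way to fit the skewed labels. And dropping the column doesn't fully fix it, because proxies-- women's sports teams, certain colleges, even word choice-- can leak the same information.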