General Discussion
Amazon scraps secret AI recruiting tool that showed bias against women
Amazon.com Inc's (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The team had been building computer programs since 2014 to review job applicants' resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.
Automation has been key to Amazon's e-commerce dominance, be it inside warehouses or driving pricing decisions. The company's experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars - much like shoppers rate products on Amazon, some of the people said.
"Everyone wanted this holy grail," one of the people said. "They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those." But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.
That is because Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. In effect, Amazon's system taught itself that male candidates were preferable. It penalized resumes that included the word "women's," as in "women's chess club captain." And it downgraded graduates of two all-women's colleges, according to people familiar with the matter. They did not specify the names of the schools.
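The mechanism described here is easy to reproduce in miniature. The toy scorer below is purely illustrative (it is not Amazon's system, whose details were never published, and the resume data is invented): a model trained on historical hire/no-hire outcomes from a male-dominated applicant pool ends up assigning a low weight to a token like "womens" simply because that token rarely co-occurred with a past hire.

```python
# Toy illustration of training-data bias (hypothetical data, not Amazon's model).
from collections import Counter

# Invented historical records: (resume tokens, was the applicant hired?).
# The pool mirrors the article's premise: most past hires were men, so
# the token "womens" appears only on resumes that did not lead to a hire.
history = [
    (["python", "chess", "club"], True),
    (["java", "robotics", "team"], True),
    (["python", "debate", "captain"], True),
    (["java", "chess", "captain"], True),
    (["python", "womens", "chess", "club"], False),   # qualified, but not hired
    (["java", "womens", "robotics", "team"], False),  # qualified, but not hired
]

hired = Counter()
seen = Counter()
for tokens, was_hired in history:
    for t in set(tokens):
        seen[t] += 1
        hired[t] += was_hired

def token_weight(t):
    # Fraction of past resumes containing this token that led to a hire.
    return hired[t] / seen[t] if seen[t] else 0.5

def score(tokens):
    # Average the learned token weights over the resume's distinct tokens.
    toks = set(tokens)
    return sum(token_weight(t) for t in toks) / len(toks)

# "womens" never co-occurred with a hire, so its learned weight is zero...
print(token_weight("womens"))  # → 0.0
# ...and an otherwise identical resume scores lower for containing it.
print(score(["python", "chess", "club"])
      > score(["python", "womens", "chess", "club"]))  # → True
```

The point is that nothing in the code mentions gender: the penalty emerges entirely from the skewed outcomes in the training data, which is why auditing the data matters as much as auditing the algorithm.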
https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
MineralMan
(146,329 posts)
Apparently, there wasn't enough of that involved here. The algorithms they used were biased towards men. I'm betting they were also written by men, mainly. Most men are not good at recognizing their own biases. Most humans aren't.
So, the garbage went in and the garbage came out, as those things typically go.
marylandblue
(12,344 posts)
Smarter because they can see our sexism, or not smarter because they just reproduce and automate the problem?