
Amazon scraps internal Artificial Intelligence recruiting tool that was biased against women

Amazon.com Inc’s machine-learning specialists uncovered a serious problem: the new recruiting engine did not like women. The Amazon team had been working on software programmes to review job applicants’ resumes and pick out the top five candidates, according to people familiar with the effort.

By Newsd

Amazon.com Inc’s machine-learning specialists uncovered a serious problem: the new recruiting engine did not like women. The Amazon team had been working on software programmes to review job applicants’ resumes and pick out the top five candidates, according to people familiar with the effort. The company had been building the programme since 2014, as automation has been key to Amazon’s e-commerce dominance, both inside its warehouses and in driving pricing decisions.

This experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars, much as shoppers rate products on Amazon.

 “Everyone wanted this holy grail. They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,” said a source.

However, by 2015, the company realized that the new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.
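To make this failure mode concrete, here is a minimal, hypothetical sketch in Python with scikit-learn. It is not Amazon’s system or data: the toy resumes, the outcome labels, the placeholder token “examplecollege” and the helper `fit_and_report` are all invented for illustration. It shows how a plain bag-of-words classifier trained on historically male-skewed hiring outcomes learns a negative weight for the token “women’s”, and how dropping that token from the vocabulary simply shifts the penalty onto a correlated proxy feature.

```python
# Hypothetical illustration only -- not Amazon's system or data.
# A bag-of-words logistic regression trained on skewed historical hiring
# outcomes picks up a negative weight for the token "women's"; removing the
# token just moves the penalty onto whatever co-occurs with it.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy resumes and past outcomes; the skew mirrors who was hired historically,
# not candidate quality. "examplecollege" stands in for an unnamed school.
resumes = [
    "software engineer java aws chess club captain",
    "backend developer python distributed systems",
    "machine learning engineer c++ aws",
    "software engineer python women's chess club captain examplecollege",
    "developer java women's coding society aws examplecollege",
    "data engineer python spark",
]
hired = [1, 1, 1, 0, 0, 1]


def fit_and_report(tokens_to_drop=()):
    """Train the toy scorer, optionally removing tokens, and print weights."""
    vectorizer = CountVectorizer(
        token_pattern=r"[a-z+#']+",               # keep "women's" and "c++" intact
        stop_words=list(tokens_to_drop) or None,  # "neutralize" listed terms
    )
    X = vectorizer.fit_transform(resumes)
    model = LogisticRegression().fit(X, hired)
    for token in ("women's", "examplecollege"):
        if token in vectorizer.vocabulary_:
            weight = model.coef_[0][vectorizer.vocabulary_[token]]
            print(f"  weight for {token!r}: {weight:+.3f}")


print("Trained on raw historical data:")
fit_and_report()                             # "women's" gets a negative weight

print("After neutralizing the explicit term:")
fit_and_report(tokens_to_drop=["women's"])   # the proxy token inherits the bias
```

In this toy setup, editing out the explicit term does not make the scorer fair; the bias in the historical labels resurfaces through whatever features correlate with the removed one.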

The Seattle company ultimately disbanded the team by the start of last year because executives lost hope for the project, according to the people, who spoke on condition of anonymity. Amazon’s recruiters looked at the recommendations generated by the tool when searching for new hires, but never relied solely on those rankings, they said.

Amazon declined to comment on the technology’s challenges, but said the tool “was never used by Amazon recruiters to evaluate candidates.” The company did not elaborate further. It did not dispute that recruiters looked at the recommendations generated by the recruiting engine.

The company’s experiment, which Reuters is first to report, offers a case study in the limitations of machine learning. It also serves as a lesson to the growing list of large companies including Hilton Worldwide Holdings Inc (HLT.N) and Goldman Sachs Group Inc (GS.N) that are looking to automate portions of the hiring process.

Around 55 percent of US human resources managers said artificial intelligence, or AI, would be a regular part of their work within the next five years, according to a 2017 survey by talent software firm CareerBuilder.
