
Unconscious bias: Amazon forced to scrap machine learning recruitment tool because it didn’t like women


E-commerce giant Amazon reportedly had to scrap a machine-learning recruitment tool because it was biased against female candidates.

According to a report from Reuters, Amazon started building the platform in 2014 to speed up recruitment by automating it. The tool scored resumes out of five, automatically ranking candidates in terms of suitability.

However, by 2015, the company realised the tool was consistently ranking resumes that featured the word ‘women’ poorly, compared to those that didn’t.

Any resume that mentioned involvement in women’s clubs or sports teams, for example, was downgraded. According to Reuters, the tool also downgraded graduates of two women’s colleges.

Any artificial intelligence or machine learning technology is only as smart as the information it’s learning from. Amazon’s tool was designed to analyse hiring patterns over a ten-year period — a period when the vast majority of hires, especially engineers and data scientists, were male.

This trend led to the tool ‘learning’ that Amazon did not want to hire female applicants.
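To see how this can happen, here is a minimal sketch using synthetic data and a simple bag-of-words logistic regression — an assumption for illustration only, not Amazon’s actual system, which has not been publicly described. If historical hiring decisions penalised resumes containing a token like “women’s” regardless of skills, a model trained on those decisions will learn a negative weight for that token.

```python
# Illustrative sketch: a model trained on skewed historical hiring data
# learns to penalise the token "women". Data and model are synthetic
# assumptions, not Amazon's system.
import random

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

random.seed(0)

resumes, hired = [], []
for _ in range(1000):
    skills = random.sample(["python", "java", "sql", "aws", "leadership"], k=3)
    text = " ".join(skills)
    # Historical pattern: resumes mentioning "women's" were hired far less
    # often, regardless of skills -- the skew we want the model to expose.
    if random.random() < 0.5:
        text += " women's chess club captain"
        hired.append(1 if random.random() < 0.2 else 0)
    else:
        hired.append(1 if random.random() < 0.6 else 0)
    resumes.append(text)

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(resumes)
model = LogisticRegression(max_iter=1000).fit(X, hired)

# The weight on "women" comes out strongly negative: the model has simply
# memorised the historical imbalance, not any real signal about ability.
weights = dict(zip(vectoriser.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
```

Stripping out a single word does not fix the underlying problem, because the model can latch onto other features that correlate with gender in the historical data.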

According to Reuters, Amazon recruiters never relied on the tool entirely and stopped using it once the issue was raised. Now, the tech giant uses only a heavily watered-down version of the tool, and only for administrative tasks.

According to Reuters data, as of 2017 Amazon’s workforce was 40% female — a more equal gender split than Facebook (36%), Apple (32%), Google (31%) and Microsoft (26%).

However, of these tech giants, Amazon is the only one that does not disclose the gender breakdown of its technical workforce.

NOW READ: Women in STEM has been given a $4.5 million boost, and the promise of an ambassador, but is it enough?

NOW READ: When Alexa went rogue: The importance of context design