Amazon Ditches Its AI Recruiting Engine for ‘Disliking Women for Technical Jobs’


Amazon’s machine-learning experts recently discovered that their new recruiting tool favored men.

Five people familiar with the project told Reuters that the team behind the engine had been building computer programs since 2014 to review job applicants’ resumes, with the aim of automating the search for top talent.

Automation has been vital to Amazon’s dominance of the e-commerce sector, whether inside warehouses or in driving pricing decisions. People familiar with the project said the company’s experimental recruiting engine used artificial intelligence (AI) to give candidates scores from one to five stars, much the way shoppers rate products sold on Amazon.

“Everyone wanted this holy grail. They literally wanted it to be an engine where I’m going to give you 100 résumés, it will spit out the top five, and we’ll hire those,” one of the people explained.

However, by 2015, Amazon discovered that its new system was not rating candidates for software developer jobs and other technical positions in a gender-neutral way.

The reason is that Amazon’s computer models were trained to vet applicants by observing patterns in the resumes submitted to the e-commerce giant over a 10-year period. Most of those applications came from men, a reflection of male dominance across the technology industry.

According to sources familiar with the matter, Amazon’s system in effect taught itself that male candidates were preferable.

In fact, it penalized resumes that included the word “women’s,” as in “women’s chess club captain.” It also downgraded graduates of two all-women’s colleges.
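The dynamic described above can be sketched with a toy scoring model. Everything here — the data, the function name, the weighting scheme — is invented for illustration and is not Amazon’s actual system; the point is only to show how a model trained on historically male-skewed hiring outcomes can assign a negative weight to a term like “women’s” purely from co-occurrence statistics:

```python
# Hypothetical sketch: a naive term-weighting model trained on skewed
# historical hiring data. All resumes and labels below are fabricated.
from collections import Counter
import math

# Toy history: (resume phrases, was_hired). Because most past hires were
# men, terms from women's resumes co-occur mostly with rejections.
history = [
    ({"python", "chess club"}, 1),
    ({"java", "robotics"}, 1),
    ({"python", "women's chess club"}, 0),
    ({"java", "women's college"}, 0),
    ({"python", "robotics"}, 1),
    ({"java", "women's robotics"}, 0),
]

def term_weights(data, smoothing=1.0):
    """Log-odds of being hired given each term, with additive smoothing."""
    hired, rejected = Counter(), Counter()
    for phrases, label in data:
        for phrase in phrases:
            # Split multiword phrases so "women's" becomes its own term.
            for word in phrase.split():
                (hired if label else rejected)[word] += 1
    vocab = set(hired) | set(rejected)
    return {w: math.log((hired[w] + smoothing) / (rejected[w] + smoothing))
            for w in vocab}

weights = term_weights(history)
# The proxy term picks up a negative weight from the skewed history alone.
print(weights["women's"] < 0)  # True
```

Note that nothing in the code refers to gender directly; the penalty emerges because the training labels encode the past, which is why simply deleting explicit gendered terms (as Amazon reportedly tried) does not guarantee the model won’t find other proxies.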

Amazon made some edits to the programs in a bid to make them neutral to such terms.

However, that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory.

By the beginning of 2018, the Seattle-based company had disbanded the team after executives lost hope in the project.

Amazon recruiters looked at the recommendations generated by the tool when searching for new talent, but they never relied solely on those rankings. Although Amazon declined to comment on the recruiting tool or its challenges, the company said it is committed to workplace diversity and equality.

Amazon’s experiment, which Reuters was first to report, offers a case study in the limitations of machine learning. It also serves as a lesson for the growing list of large companies, including Goldman Sachs and Hilton Worldwide Holdings, that are looking to automate parts of the hiring process.

According to a 2017 survey by talent software firm CareerBuilder, about 55% of United States human resources managers said artificial intelligence would become a regular part of their work within the next five years.

Amazon’s project began at a pivotal moment for the world’s largest online retailer.

Machine learning was gaining traction in the technology sector, fueled by a surge in low-cost computing power. At the time, Amazon’s human resources unit was about to embark on a major hiring push.

As such, Amazon set up a team at its engineering hub in Edinburgh that grew to about 12 people. Their objective was to develop AI that could rapidly crawl the web and identify candidates worth recruiting.

The group developed 500 computer models focused on specific job functions and locations. They taught each model to recognize some 500,000 terms that appeared on past candidates’ resumes.

The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various kinds of computer code.

Gender bias was not the only problem. People familiar with the project said issues with the data underpinning the models’ judgments meant that unqualified candidates were often recommended for all manner of jobs.

The cure or the problem?

Apart from Amazon, other companies are pushing ahead, underscoring the eagerness of employers to leverage AI in their hiring efforts.

Kevin Parker, Chief Executive Officer of HireVue, said automation is helping companies look beyond the same recruiting networks they have long relied on.

His company, based near Salt Lake City, analyzes candidates’ speech and facial expressions in video interviews in a bid to reduce reliance on resumes.

“You weren’t going back to the same old places; you weren’t going back to just Ivy League schools,” said Parker. His company’s customers consist of Unilever PLC and Hilton.

Furthermore, Goldman Sachs has developed its own resume-analysis tool, which tries to match candidates with the division where they would be the best fit.

LinkedIn, the world’s largest professional network, has gone further. It offers employers algorithmic rankings of candidates based on their fit for job postings on its site. Even so, John Jersin, Vice President of LinkedIn Talent Solutions, said the service is not a replacement for traditional recruiters.

“I certainly would not trust any AI system today to make a hiring decision on its own. The technology is just not ready yet,” he said.

Some activists are concerned about the transparency of AI. The American Civil Liberties Union is currently challenging a law that allows criminal prosecution of journalists and researchers who test recruiting sites’ algorithms for bias.

“We are increasingly focusing on algorithmic fairness as an issue,” explained Rachel Goodman, a staff attorney with the Racial Justice Program at the American Civil Liberties Union (ACLU). Still, Goodman and other critics of AI acknowledged that it could be exceedingly difficult to sue an employer over automated hiring: job candidates might never know such software was being used.
