Two years ago, Microsoft built an AI named Tay and set her loose on social media.
Exposed to humanity in all its glory, Tay quickly descended into white supremacy and Nazism, announcing to the world that “Hitler was right” and “I fucking hate feminists and they should all die and burn in hell.” She was promptly taken offline.
That was, of course, an extreme example, but as women around the world know, sexism is often a much more banal experience. And while AI might be revolutionizing how we tackle things like climate change or education, it turns out there are some ways in which it is oddly stuck in the past.
Since 2014, online retail giant Amazon has been using an experimental machine-learning program to recruit new employees.
“Everyone wanted this holy grail,” a source familiar with the project told Reuters. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”
The program, developed by a team of about a dozen engineers, was designed to spot the best candidates and give them a rating from one to five stars – just like Amazon’s product reviews. To do this, the team created 500 computer models and taught each of them to recognize 50,000 terms from previous applicants’ resumes.
The project was a success in some ways – for instance, it learned to deprioritize skills that were common among most applicants. But quite quickly, the team realized a big problem: the program had taught itself some seriously questionable hiring practices, prioritizing male candidates and masculine language over women.
Just like Tay, it seems Amazon’s AI project was a victim of its upbringing. It was programmed to find patterns in resumes from the previous 10 years, and most of these were from men. As a result, it started to favor resumes that included words more commonly used by male applicants, such as “executed” and “captured”. More damningly, it began to downgrade graduates of all-women colleges, and penalize resumes containing the word “women’s” – so membership of a college’s Women’s Software Development Society, for example, could actually hurt your chances of winning a software development job.
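The dynamic described above can be sketched in a few lines: a model trained to imitate historical hiring decisions that skew male will assign a negative weight to words, like “women’s”, that appear mostly on rejected resumes. Everything below is hypothetical for illustration – the tiny vocabulary, the synthetic “historical” records, and a simple perceptron standing in for Amazon’s far larger system of 500 models.

```python
# Illustrative sketch (not Amazon's actual system): a bag-of-words
# perceptron trained on synthetic past hiring decisions in which most
# hires were men. The historical skew, not any real qualification signal,
# is what the model learns.

VOCAB = ["executed", "captured", "women's", "python", "led"]

# (resume tokens, hired?) - invented records where "women's" happens to
# appear only on resumes that were historically rejected.
history = [
    (["executed", "python"], 1),
    (["captured", "led"], 1),
    (["executed", "led"], 1),
    (["women's", "python"], 0),
    (["women's", "led"], 0),
    (["python"], 0),
]

def featurize(tokens):
    # One binary feature per vocabulary word: present or absent.
    return [1.0 if w in tokens else 0.0 for w in VOCAB]

weights = [0.0] * len(VOCAB)
bias = 0.0
lr = 0.1

for _ in range(200):  # perceptron updates until the data is fit
    for tokens, hired in history:
        x = featurize(tokens)
        score = bias + sum(w * xi for w, xi in zip(weights, x))
        pred = 1 if score > 0 else 0
        err = hired - pred
        weights = [w + lr * err * xi for w, xi in zip(weights, x)]
        bias += lr * err

# The learned weight for "women's" comes out negative: the model has
# taught itself to penalize the word, exactly as described above.
print(dict(zip(VOCAB, weights)))
```

The point of the sketch is that no one programmed the penalty in; it falls out of fitting biased historical outcomes, which is why auditing training data matters as much as auditing code.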
The AI’s gender bias, along with its habit of recommending unqualified candidates, meant that the project was eventually shut down.
Amazon told Reuters that the AI was “never used by Amazon recruiters to evaluate candidates,” and the company now uses a much weaker version to help with mundane chores like deleting duplicate applications. According to one source, a new recruiting AI has been commissioned – this time aimed at increasing diversity.
Although machine learning is already transforming our professional lives, technology specialists, as well as civil liberties groups such as the ACLU, say more work needs to be done to avoid issues like Amazon’s.
“I certainly would not trust any AI system today to make a hiring decision on its own,” John Jersin, vice president of LinkedIn Talent Solutions, told Reuters. “The technology is just not ready yet.”