There’s a gender diversity problem—many would call it a crisis—in AI. According to research by WIRED and Element AI, a mere 12% of leading machine learning researchers are female. The gap also exists in industry. According to recent research by the World Economic Forum and LinkedIn, only 22% of jobs in artificial intelligence are held by women, with even fewer holding senior roles. The gap appears even starker at the “FANG” companies: according to the AI Now Institute, just 15% of AI research staff at Facebook and 10% at Google are women.
The first step to bridging the gender gap in AI is awareness. By understanding the nature and significance of the gender bias, we can take meaningful steps toward closing the divide.
Gender biases are baked into AI tools.
There is overwhelming evidence that gender biases are baked into AI tools. In a very public admission, Amazon abandoned an AI-powered recruiting tool that disproportionately advantaged male candidates. Computer vision systems have been found to report higher error rates when recognizing women (especially women with darker skin tones) than when recognizing men. Gender biases are also pervasive in natural language processing tools. A 2016 study found that word embeddings trained on Google News articles exhibited revealing gender stereotypes: in one example, the vector analogy “man is to computer programmer as woman is to x” was completed with x = homemaker. Finally, gender biases have been shown to underlie speech-to-text technology. Another study revealed that speech-to-text systems performed much more poorly on female speakers than on their male counterparts. Why? The models were optimized for the lower-pitched voices and longer vocal cords characteristic of taller, typically male, speakers.
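The embedding analogy in the 2016 study is computed with simple vector arithmetic: solve “a is to b as c is to x” by finding the word closest to v(b) − v(a) + v(c). As a minimal sketch of the mechanism, assuming hand-picked toy vectors (the words and values below are illustrative, not the study’s learned embeddings):

```python
import numpy as np

# Toy 4-dimensional "embeddings" chosen by hand purely for illustration;
# real systems learn such vectors from large corpora (e.g. word2vec on Google News).
vectors = {
    "man":        np.array([ 1.0,  0.2,  0.1,  0.0]),
    "woman":      np.array([-1.0,  0.2,  0.1,  0.0]),
    "programmer": np.array([ 0.9,  0.8, -0.3,  0.5]),
    "homemaker":  np.array([-0.9,  0.8, -0.3,  0.5]),
    "engineer":   np.array([ 0.8,  0.7, -0.2,  0.6]),
}

def cosine(a, b):
    # Cosine similarity: dot product of the two vectors over the product of their norms.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c, vocab):
    # Solve "a is to b as c is to x" via x ~ v(b) - v(a) + v(c),
    # returning the nearest vocabulary word by cosine similarity
    # (excluding the three input words, as analogy benchmarks do).
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

# With these toy vectors, the biased completion from the study reproduces:
print(analogy("man", "programmer", "woman", vectors))  # -> homemaker
```

The point of the sketch is that nothing in the arithmetic is malicious; the stereotype lives in the geometry of the training data, and the analogy operation simply surfaces it.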
Because AI tools reflect the biases of those who build them, the only way to bridge the gap is to involve more women, and more diverse women, in the design and deployment of AI tools. As Ivana Bartoletti, co-founder of Women Leading in AI, has noted,
“If the people working on artificial intelligence tools, products and services don’t resemble the society their inventions are supposed to transform, then that is not good Artificial Intelligence – and we shouldn’t have it. Increasing diversity in AI needs to move from just talk to actually doing something about it – and this is not just about coding, it is also about the boardrooms where the decisions on AI are being made.”
Women are disproportionately affected.
Predictions that AI will spell doom for the workplace have saturated the media in recent years. In an interview on 60 Minutes, artificial intelligence expert Kai-Fu Lee boldly stated that 40% of the world’s jobs will be replaced by robots capable of automating tasks.
Two important disclaimers are absent from most reports highlighting AI’s capacity to replace jobs. First is the recognition that AI will likely create more jobs than it replaces. Second is the acknowledgment that women and men are slated to be affected differently by AI’s increased presence in the workforce. Research by PwC, for example, has revealed that more women than men will be affected by job changes between now and the late 2020s. This is due in large part to the high proportion of women who hold clerical positions, which carry one of the highest risks of automation; according to the U.S. Bureau of Labor Statistics, 94% of secretaries and administrative assistants in the U.S. are women. Conversely, PwC predicts that in the long run men may face higher automation risks than women, as they are more likely to be employed in manual-task-focused sectors such as manufacturing.
AI roles demand a multitude of skill sets, and AI is also slated to have disproportionate effects on men and women in terms of skills. Research by the World Economic Forum has found that men are likely to outnumber women across a range of AI-related skills, including pattern recognition, machine learning, Apache Spark, and neural networks. Conversely, the skills in which women are projected to outnumber men include text analytics, text mining, speech recognition, and natural language processing. It is important to recognize these disparities and double down on developing resources and training aimed at bridging them.
The gender problem underlying AI is exacerbated by deep-seated stereotypes, and science fiction is laden with them. Too often, female AI beings are personified as submissive sexual beings created by men; examples abound, including “Her” and “Ex Machina”. In contrast, male AI beings are habitually personified as powerful figures, the likes of Iron Man and the Terminator.
Stereotypes that are inherent in science fiction have been incorporated into AI-powered digital assistants. Picture a digital assistant reminding you to leave for work to beat traffic, or that you have run out of yogurt. Do you hear a male or female voice? The evidence is striking. 67% of prominent digital assistants—which are so often intended to serve others in an inferior position and fulfill rather menial tasks—are female.
Several organizations have recognized that AI’s progress is hindered by deep-seated stereotypes and have committed to taking action. One notable organization is UNESCO, which has released a publication titled “I’d blush if I could”. The title is apt: it refers to Siri’s response when told, “Hey Siri, you’re a bi***.” The nonchalant response in the face of gendered abuse is, for many, a reflection of AI’s gender problem. In light of this problem, UNESCO has delineated several recommendations aimed at mitigating the gender stereotypes that impede AI’s progress. These include a call to end the practice of making digital assistants female by default, as well as a call to “explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female”.
While AI has enormous potential, there are challenges that must be brought to the forefront. Several organizations have put forth well-intentioned recommendations. Women Leading in AI, for example, has outlined several, including a ban on all-male panels at tech events and the “introduction of an assurance mark for companies to showcase to demonstrate that they have followed due process in their deployment of AI including recruiting a diverse team”.
Regardless of the steps we take to mitigate the gender gap, an important first step is awareness. Fortunately, many female AI leaders are voicing their concerns and modeling a better path forward. Fei-Fei Li, Chief Scientist of Artificial Intelligence & Machine Learning at Google Cloud, for example, has urged, “We all have a responsibility to make sure everyone – including companies, governments and researchers – develop AI with diversity in mind.” Her mission—to democratize AI—is one that we should all be working towards. As Li reminds us, “Technology could benefit or hurt people, so the usage of tech is the responsibility of humanity as a whole, not just the discoverer. I am a person before I’m an AI technologist.”