Some experts have raised concerns that artificial intelligence (AI) could have its own gender gap if more women aren’t involved in its development and dataset analysis.
“It’s not just AI, but I would say engineering as a whole,” Dr. Georgianna Shea, chief technologist at the Foundation for Defense of Democracies’ Center on Cyber and Technology Innovation (CCTI), told Fox News Digital. “Whenever there’s any type of engineering process for anything, you don’t want to end up with bias-based engineers.”
Adding to the debate, Melinda French Gates, co-chair of the Bill & Melinda Gates Foundation, recently said in an interview that she was concerned there was a lack of women working in the field of artificial intelligence, which she said made her nervous about potential biases in platforms.
Shea said the problem is two-fold: Not only does the field need more women to help guide the development of AI platforms, but the datasets used to inform and train AI are already biased.
"There's another aspect: the data [that's] included in those AIs … making sure you're getting the data about women, [and] women understanding how much data represents them," Shea said.
Shea cited the example of a female-dominated field such as nursing, in which women make up roughly 86% of workers. An AI trained on data from that field could favor women when drawing conclusions about the nursing industry, putting male workers at a disadvantage when using an AI platform to find relevant information.
"Men and women are physiologically different, so if you have a set of men who are testing a drug, then maybe the body mass index is higher or lower than it would be if it was a woman … there's just fundamental differences, so the data itself is going to show that this [is] how it came out based on that test set of people," she argued.
Women have raised concerns about how gender bias might influence AI for years, with the Stanford Social Innovation Review discussing possible issues in 2019: The authors argued that institutions making decisions based on AI and machine learning suffer from a "pervasive" gender bias that has profound effects on women's short- and long-term well-being and security.
Part of that, the authors argued, is due to filtering all data into a single processor without disaggregation by sex and gender, which ends up "concealing important differences … and hides potential overrepresentation and underrepresentation."
Women comprised around 28% of the tech industry workforce as of 2022, according to data compiled by Zippia.com, and around 34.4% of the workforce at the largest tech companies in the U.S.
Only 15% of engineering jobs are held by women, and women leave the tech industry at a 45% higher rate than men do, according to DataProt.
Shea compared biased datasets to military equipment, such as tanks and similar vehicles, which were designed for men because only men could serve in combat roles until the military changed its policy around 10 years ago. After full integration in 2015, the military and its engineers had to begin refitting vehicles to safely accommodate women's minimum height and weight ranges.
The key to ensuring an AI platform does not stumble over this issue is to consider its context, according to Shea.
“You have to understand, why are we building this system? What is the purpose? Who’s it going to affect? What kind of guidance do we need that’s not going to incorporate those societal biases, those data biases that might be in there?” she said.
“So you have to identify that and include it in the process and exclude gender as a component for selection,” she added.