Mind the difference: How to avoid gender bias in AI
Elisabeth Staudinger
Managing Board Member @ Siemens Healthineers | We pioneer breakthroughs in healthcare. For everyone. Everywhere. Sustainably.
Fundamental medical advances can rarely be dated as precisely as March 9, 1994. Since that milestone date, medical studies in the U.S. have been required to include women and minorities. Mind you: we’re talking about 1994, not 1944 or 1894. This date underscores how biomedical research has historically been a primarily male domain: it was defined and largely controlled by men and focused almost exclusively on their health, while women rarely had the opportunity to study medicine or were not taken seriously when they did. One consequence: 94 percent of the winners of the Nobel Prize for Medicine are men.
And this gender bias still has consequences today. Those who don’t consider the biological and medical differences between the sexes run the risk of incorrectly diagnosing or treating women.
Diseases and medical conditions that primarily affect women are often underfunded relative to the burden they cause, and there are many gaps in the data on women’s health. This inequality affects women’s health worldwide by creating medical blind spots, skewing research priorities, and shaping investment decisions.
Take heart attacks, for example: a particularly striking illustration of medical differences between the sexes, made worse by the stereotype of the so-called ‘Hollywood heart attack’. If a man feels a sharp pain in his chest that radiates into his left arm, he is very likely to call for emergency medical help immediately. In women, however, the symptoms of a heart attack are often less obvious: they can range from shortness of breath, nausea, and vomiting to pain in the back or jaw. This is one reason why women are more than twice as likely as men to die from a heart attack. Women who suffer a heart attack also face a particularly high risk of poor outcomes.
There are also differences between the sexes in laboratory diagnostics, where the troponin test plays a role. Found in heart muscle, this protein is released into the blood in increased amounts after damage to the heart and can be measured. The key point: the critical troponin threshold for men is higher than for women. If the male threshold is used in diagnosing a woman, her condition is likely to be assessed as less critical than it actually is. The solution? The more we call attention to such medical differences, the more aware people will be of the consequences and take appropriate action. It’s a huge lever for making a real difference in cardiovascular care.
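To make the stakes concrete, here is a minimal sketch in Python of what a sex-specific decision rule changes. The cutoff numbers are illustrative placeholders only, not validated clinical values, which depend on the specific assay used.

```python
# Illustrative only: the cutoff values below are hypothetical placeholders,
# not validated clinical thresholds (real cutoffs depend on the assay used).
SEX_SPECIFIC_CUTOFF_NG_L = {"male": 34.0, "female": 16.0}
MALE_ONLY_CUTOFF_NG_L = 34.0


def troponin_elevated(value_ng_l: float, sex: str, sex_specific: bool = True) -> bool:
    """Return True if a troponin measurement exceeds the chosen cutoff."""
    cutoff = SEX_SPECIFIC_CUTOFF_NG_L[sex] if sex_specific else MALE_ONLY_CUTOFF_NG_L
    return value_ng_l > cutoff


# A female patient measuring 20 ng/L:
print(troponin_elevated(20.0, "female", sex_specific=True))   # True  -> flagged
print(troponin_elevated(20.0, "female", sex_specific=False))  # False -> missed
```

The same measurement is flagged or missed depending solely on which threshold is applied, which is exactly the kind of difference awareness campaigns aim to address.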
Men and women also react differently to psychotropic drugs: women often need lower doses, their metabolism breaks substances down differently, and they develop different side effects than men. Since dosage recommendations are often tailored to men, women can easily be overdosed. Complicating the situation further, many active ingredients in medications show gender-specific differences that are often not taken into account.
A study of health data from 6.9 million Danes found that, across hundreds of diseases, women were on average around four years older than men at the time of diagnosis. National healthcare strategies must take such differences into account, and the authors of the study reached a clear conclusion: “We can no longer use the ‘one size fits all’ model”. I fully agree. We should treat the lack of gender equality in healthcare as a structural deficiency and resolve it. And especially when it comes to using artificial intelligence, we cannot, and must not, repeat old mistakes.
AI is a crucial innovation in medical technology. To develop its full potential here, it must take gender differences into account. The background: training data for AI naturally draws on past cases and records. If this medical data is biased (for example, containing more cases of men than women, drawing mostly on wealthy countries, or being limited to specific ethnic groups), there is a risk of carrying that bias into the future. If, for example, I train on images from 100 male patients and only 20 female patients, the AI is likely to weigh male cases more heavily, and the resulting algorithm will be biased. In short: we must ensure that training data is not distorted by human bias, since the value of AI applications ultimately depends on the quality of the data used to train them.
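To put the 100-to-20 example into code, here is a minimal Python sketch of one common, generic mitigation, inverse-frequency sample weighting, so that the under-represented group is not simply outvoted during training. The numbers and variable names are hypothetical, and this is not a description of any specific product pipeline.

```python
import numpy as np

# Hypothetical training set: 100 male cases and 20 female cases (labels omitted).
sex = np.array(["male"] * 100 + ["female"] * 20)

# Inverse-frequency weighting: each group contributes equally to the training
# loss, so the majority group does not dominate just by being larger.
groups, counts = np.unique(sex, return_counts=True)
group_weight = {g: len(sex) / (len(groups) * c) for g, c in zip(groups, counts)}
sample_weight = np.array([group_weight[s] for s in sex])

print(group_weight)  # {'female': 3.0, 'male': 0.6}
# Many training APIs accept such per-sample weights (e.g. a sample_weight
# argument to fit); balanced sampling or collecting more cases from the
# under-represented group are alternative remedies.
```

Reweighting is only a partial fix; the more fundamental remedy remains collecting data that reflects the population the model will serve.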
A medical study revealed that AI models created to predict liver disease from blood tests are twice as likely to miss the disease in women as in men. The European Parliament considers “bias in AI and the perpetuation of existing inequities” to be one of the greatest risks posed by AI in medicine. Reality confirms this assessment. A recent comprehensive study found that the generative AI GPT-4 did not appropriately model the demographic diversity of medical conditions but tended to stereotype. GPT-4 showed biases related to race and gender when performing medical tasks such as creating medical case studies, helping with diagnosis, suggesting treatment plans, and assessing patients. It exaggerated known differences in disease prevalence between groups, over-represented stereotypes, and amplified harmful societal biases. It’s one thing to ask a computer for directions or a restaurant recommendation; it’s quite another to have it answer questions about someone’s health. The latter remains a doctor’s domain, and for good reason.
The question here is how Siemens Healthineers is positioning itself with regard to AI. In our work, we meticulously gather data sets from all five continents, encompassing a diverse range of sources, including public clinical registries, medical associations, and trusted research partners. Our experts thoroughly examine each data point and enrich it with additional information such as anatomical landmarks, diagnostic indicators, and tumor characterizations. We check the data quality in several steps and train our algorithms with representative data sets specially curated for the task. This way, we ensure that the cases for which we have the most images and data do not automatically have the greatest influence on the results. We can meet this high standard because we have an immense collection of medical data: a pool of more than two billion data points consisting of clinical images and medical reports, plus additional billions of genomic and laboratory diagnostic data points.
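As a generic illustration of what checking a “representative, specially curated data set” can mean in practice (explicitly not a depiction of Siemens Healthineers’ actual tooling), here is a small Python sketch that compares subgroup shares in a curated set against target shares; the records, groups, and target values are hypothetical.

```python
from collections import Counter


def subgroup_shares(records, key):
    """Share of each subgroup (e.g. sex or region) in a list of case records."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}


# Hypothetical curated training set; each record carries its metadata.
curated = [
    {"sex": "female", "region": "Asia"},
    {"sex": "male", "region": "Europe"},
    {"sex": "female", "region": "Africa"},
    {"sex": "male", "region": "Americas"},
]

# Compare against target shares (e.g. expected prevalence by sex) and flag gaps.
target = {"female": 0.5, "male": 0.5}
actual = subgroup_shares(curated, "sex")
for group, share in target.items():
    if abs(actual.get(group, 0.0) - share) > 0.05:
        print(f"Warning: {group} is under- or over-represented "
              f"({actual.get(group, 0.0):.0%} vs. target {share:.0%})")
```

The same check can be run for region, age band, or ethnicity, so that gaps are visible before training rather than after deployment.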
We know, of course, that opportunities for a healthy life are not evenly distributed across populations. Differences exist not only between rich and poor but also between the sexes. We must be aware of these differences in our everyday work, particularly when using artificial intelligence. The good news: we have already amassed a wealth of knowledge about the causes of possible gender bias, and we have the tools at our disposal to avoid amplifying it.
Siemens Healthineers is committed to solving system-wide challenges in healthcare and enabling equal access to healthcare for everyone, everywhere. In doing so, we focus on leveraging our strategic strengths: patient twinning, precision therapy, and, of course, digitalization, data, and AI. This is how we pursue our corporate purpose: We pioneer breakthroughs in healthcare. For everyone. Everywhere. Sustainably.
In the next article I will focus on concrete solutions that are already helping to narrow the health gap between women and men.
Read also the first article in this series: “Mind the Gap: Assessing inequalities and opportunities in women’s health”.
Co-Founder | Women in Tech | Lawyer | GRC
Addressing gender bias in AI demands diverse perspectives and ethical frameworks throughout development. Ensuring transparency and accountability is essential for creating fair and equitable technologies.
Global Head of Sales, Diagnostic Imaging, Siemens Healthineers | Pre-teen girl mom | Travel and walking enthusiast
Thank you for bringing awareness to this important issue! It's incredible how deep this bias goes. #womenshealth
+20 years of Business Development and Business Management in Healthcare Industry | Strategic Account Manager at siemens-healthineers.com | DE&I Manager | Board Member
Thanks, Elisabeth, for raising the AI topic in the gender health gap: I truly believe it is crucial to avoid propagating gender bias in AI, both in the data used to train AI systems and in AI model development. For this we need to guarantee data quality and to have more women participating in software development.
Business Architect | AI Explorer
Thank you for the article. It is really important that we don't repeat the same mistakes regarding biases with AI. It also shocked me that it took until 1994 to require the inclusion of women in studies.