How Will AI Affect Gender Gaps in Health Care?
Data in medicine skews male. Over 25 years after a law that mandated the inclusion of women in biomedical research, clinical practice continues to rely on evidence collected mostly from men and assumed to represent the other half of humanity. Artificial intelligence presents opportunities to address the resulting disparities in health care, but it also carries the risk of perpetuating them.
Male bias pervades every step of the process that shapes health care, from discovery and testing to clinical practice and outcomes.
- Females are routinely excluded or under-represented in the cells, animals and humans studied in biomedical, clinical and public health research: Of 11.5 million papers published during 1980-2016, only 30% of biomedical, 43% of clinical and 54% of public health studies reported both male and female populations.
- Even when females are included in studies, researchers rarely analyze or report results by sex (biological characteristics) or gender (societal norms and behavior).
- Clinical guidelines and practice often ignore sex and gender differences, using men as proxies for women.
- When measuring treatment effectiveness, disaggregating outcomes by sex is again not the norm.
Behind the Bias
There are many reasons for the exclusion of women, including a male-default mindset, methodological complexity and cost. Science and medicine, historically male-dominated professions, tend to view the male as the baseline: anatomy textbooks typically use images of men to represent universal body parts, and specimens in natural history museums are mostly male. Researchers who study only males can avoid sex-specific confounding variables and conclude studies more quickly, more cheaply and with fewer participants. For decades, well-meaning regulations excluded women of childbearing age from clinical trials to avoid fertility risks and fetal harm. Newer, more inclusive laws and policies don't apply to privately funded research, don't set benchmarks such as real-world proportions of disease burden and don't require disaggregated results.
There is, however, growing evidence that women and men experience disease and respond to treatments differently. Research has found sex and gender differences in many widely prevalent diseases, including cancers as well as cardiovascular, metabolic, autoimmune and neurological conditions. Because prevention, diagnosis and treatment practices rely on predominantly male data, we don’t know enough about what works for women and what doesn’t, when, or why. We know too little about most diseases (other than breast cancer) that affect only or mostly women — for example, endometriosis and depression. Researchers are also likely to have missed out on treatments that might have worked better — or only — for women.
Gender data gaps in health care are already putting women at risk in many ways.
Care delays and errors: Doctors tend to misdiagnose, underdiagnose or under-treat diseases that predominantly affect women, or situations where women present with different symptoms for common diseases. For example, heart attacks are more likely to be misdiagnosed in women, because they don't always experience the typical (male) symptom of chest pain. It takes four to 10 years to diagnose endometriosis, a gynecological disease that affects one in 10 women, partly because many physicians dismiss or doubt reports of severe or chronic pain by female patients. Compared to men, women who report chronic pain are prescribed less, and less-effective, pain medication.
Adverse drug reactions: Typically, women access health care more than men and are prescribed more drugs; they also experience more adverse effects and, hence, comply less with treatment plans. Women’s risk of adverse effects is 1.5-1.7 times higher than that of men, and eight out of 10 prescription drugs taken off the market during 1997-2001 were more dangerous for women. One risk is that the standard (male) dose may turn out to be too high for women — for example, the recommended dose of sleeping pill Ambien was halved because women metabolize it more slowly.
Lower survival rates: Women outlive men on average but experience more disease and disability. They also have lower survival rates for certain diseases, for example heart disease, which is the leading cause of death in women, as well as men, globally. More women than men die within a year of a heart attack in the U.S. (26% versus 19%, respectively), and the same pattern persists within five years (47% versus 36%), perhaps because women are less likely to be diagnosed correctly or prescribed medications that lower the risk of recurrence. After a heart attack, women, who typically shoulder family caregiving duties, are more likely than men to suffer another heart attack or die within a year.
Health care providers, payers and life sciences companies are already starting to apply artificial intelligence — such as machine learning algorithms — to guide patient care and administrative tasks. Algorithms are detecting disease from radiology images, recommending personalized treatment protocols for patients, predicting medical emergencies and guiding clinical trial design, among other applications. Compared to humans, algorithms can learn faster from huge volumes of data, and their impacts unfold at scale.
Machine learning algorithms work by learning from the past: deducing statistical patterns between inputs and outputs in large training data sets, then applying those patterns to make predictions about future inputs. As algorithms are designed and trained, they absorb the biases and assumptions ingrained in historical data, and those of their (mostly male) designers, and project these flaws into the future. Biases in data, design or outcomes can be difficult to detect, especially for sophisticated models: with millions of parameters, it can be difficult or impossible for humans to understand or explain how such models arrive at decisions, or to investigate potential mistakes.
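The dynamic can be made concrete with a toy sketch. The data, symptom names and rule-learner below are entirely hypothetical and deliberately simple, not any real diagnostic system: a classifier trained on male-skewed heart-attack records learns chest pain as the telltale signal and then misses a female-typical presentation.

```python
from collections import Counter

def train(records):
    """Learn the set of symptoms appearing in at least half of the
    positive (heart-attack) training cases."""
    positives = [symptoms for symptoms, label in records if label == 1]
    counts = Counter(sym for symptoms in positives for sym in symptoms)
    return {sym for sym, c in counts.items() if c / len(positives) >= 0.5}

def predict(rule, symptoms):
    """Flag a heart attack if any learned symptom is present."""
    return int(bool(rule & symptoms))

# Hypothetical, male-skewed training data: 9 of 10 positive cases show
# the male-typical presentation.
training = [({"chest pain", "arm pain"}, 1)] * 9 + [({"fatigue", "nausea"}, 1)]

rule = train(training)  # learns {"chest pain", "arm pain"}, seen in 9/10 cases
predict(rule, {"chest pain", "sweating"})   # 1: male-typical case flagged
predict(rule, {"fatigue", "nausea"})        # 0: female-typical case missed
```

The model is not malicious; it faithfully reproduces the statistics of its training set, which is exactly how a skewed evidence base becomes a skewed prediction at scale.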
This presents three risks relating to gender gaps in health care:
- First, algorithms have the potential to amplify historical male bias, along with other, often intersecting biases such as race, putting female patients' safety and outcomes at risk. For example, an algorithm that learns "typical" symptoms of a heart attack from male-biased data could misdiagnose vast numbers of women.
- Second, such failures can widen a trust gap among women, who already struggle to get accurate diagnoses and effective treatments; and because women often make health care decisions for their families, the erosion of trust and adherence could spread more widely.
- Third, physicians and patients may view algorithms as unbiased or infallible, follow their lead and ignore contradictory information. Such automation bias will make it harder to prevent or spot mistakes.
On the other hand, AI could help address existing disparities. Algorithms capable of handling large data sets could support efforts to fill the data gap, such as new research that combines different types of data (genetic and epidemiological, for example) to understand sex differences and develop effective interventions for women. AI can also help reduce care gaps: Algorithms can absorb fast-changing medical knowledge, adopt new clinical guidelines and apply or recommend them consistently, without the delays and variations inevitable among humans alone. To ensure health equity for women, we need a sex and gender lens incorporated into data, algorithms and health care, spanning from AI-specific solutions such as proactive bias testing to systemic changes that promote gender diversity in research, innovation, funding, policy and practice.
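One form proactive bias testing could take is disaggregated evaluation: computing a model's sensitivity separately by sex and flagging gaps before deployment. The function names, tolerance and numbers below are illustrative assumptions for a minimal sketch, not any standard tool.

```python
def sensitivity_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.
    Returns group -> true-positive rate among truly positive cases."""
    stats = {}
    for group, truth, pred in records:
        if truth == 1:
            hits, total = stats.get(group, (0, 0))
            stats[group] = (hits + (pred == 1), total + 1)
    return {g: hits / total for g, (hits, total) in stats.items()}

def flag_gap(rates, tolerance=0.1):
    """Flag when the best- and worst-served groups differ by more than
    the chosen (illustrative) tolerance."""
    return max(rates.values()) - min(rates.values()) > tolerance

# Illustrative predictions: the model catches 9 of 10 male positives
# but only 5 of 10 female positives.
records = ([("M", 1, 1)] * 9 + [("M", 1, 0)] +
           [("F", 1, 1)] * 5 + [("F", 1, 0)] * 5)

rates = sensitivity_by_group(records)   # {"M": 0.9, "F": 0.5}
flag_gap(rates)                          # True: a gap worth investigating
```

The same disaggregation applies to any metric (false-positive rate, dosing error, time to diagnosis); the point is that an aggregate score of, say, 70% sensitivity can hide a model that works well for men and poorly for women.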