
Racial Bias in Medical Care Decision-Making Tools

How many Black, Hispanic, and poor people lose out on healthcare

Part of the Series
Race and Income Inequality

Racial bias in medical care can show up in some unexpected places. One example: the clinical decision tools that play an important role in how today's patients are tested, diagnosed, and treated.

These tools contain algorithms, or step-by-step procedures, usually computerized, for calculating factors such as risk of heart disease, the need for a chest X-ray, and prescription medicine dosages. Artificial intelligence can be used to scour health records and billing systems to create the needed data sets.

On the surface, this may sound objective. But recent studies have shown that the data analysis used in these algorithms can be biased in crucial ways against certain racial and socioeconomic groups. This can have myriad consequences for the amount and quality of healthcare that people in these groups receive.

Key Takeaways

  • Medical decision tools play a large role in how today's patients are tested, diagnosed, and treated.
  • Unfortunately, the algorithms that these tools rely on can sometimes be biased.
  • Using medical spending data to rate a person's medical condition can misjudge the severity of poor and minority patients' illnesses when lower medical spending reflects a lack of access to medical care rather than a lack of need.
  • The body mass index (BMI) algorithm used to diagnose patients as overweight or obese has fostered weight-shaming and distrust between patients and doctors; a higher percentage of Black women than Hispanic or White women is now categorized as obese.
  • Data input and outcomes are now starting to be checked for racial, ethnic, income, gender, and age bias so that disparities can be recognized and algorithms corrected.

Racial Bias Affects the Sickest Patients

In 2019, a study showed that an algorithm widely used by U.S. hospitals and insurers to allocate extra health management assistance systematically discriminated against Black people. The decision tool was less likely to refer Black people than White people to care-management programs for complex medical needs when both groups were equally sick.

The underlying reason for the bias was linked to the algorithm's assignment of risk scores to patients based on their previous year's medical costs. The assumption was that identifying patients with higher costs would identify those with the greatest medical needs. However, many Black patients have less access to, less ability to pay for, and less trust in medical care than White people who are equally sick. In this instance, their lower medical costs did not accurately predict their health status.
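
To make that mechanism concrete, here is a minimal, hypothetical sketch (not the actual commercial algorithm, whose details are proprietary) of how a cost-based risk score can rank two equally sick patients differently when one has had less access to care:

```python
# Hypothetical illustration of cost-as-proxy bias, not the actual commercial algorithm.
# Two patients with the same illness burden; one has had less access to care,
# so their prior-year spending (the proxy the score sees) is lower and they
# fall below the referral cutoff.

from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    chronic_conditions: int      # crude stand-in for true medical need
    prior_year_spending: float   # the proxy the algorithm actually uses

def risk_score(patient: Patient) -> float:
    """Score risk from spending alone, as a cost-based algorithm would."""
    return patient.prior_year_spending / 10_000  # arbitrary scaling for the example

REFERRAL_CUTOFF = 2.0  # hypothetical threshold for care-management referral

patients = [
    Patient("A (good access to care)", chronic_conditions=4, prior_year_spending=30_000),
    Patient("B (poor access to care)", chronic_conditions=4, prior_year_spending=12_000),
]

for p in patients:
    referred = risk_score(p) >= REFERRAL_CUTOFF
    print(f"{p.name}: {p.chronic_conditions} conditions, "
          f"score={risk_score(p):.1f}, referred={referred}")

# Both patients are equally sick, but only patient A clears the cutoff,
# because spending reflects access to care as well as need for it.
```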

Care-management programs use a high-touch approach, such as phone calls, home visits by nurses, and prioritizing doctor appointments to address the complex needs of the sickest patients. The programs have been shown to improve outcomes, decrease emergency room visits and hospitalizations, and lower medical costs. Because the programs themselves are expensive, they are assigned to people with the highest risk scores. Scoring techniques that discriminate against the sickest Black patients for this care may be a significant factor in their increased risk of death from many diseases.

Race As a Variable in Kidney Disease

Algorithms can contain bias without including race as a variable, but some tools deliberately use race as a criterion. Take the eGFR score, which rates kidney health and is used to determine who needs a kidney transplant.

In a 1999 study that set the eGFR score criteria, researchers noticed that Black people had, on average, higher levels of creatinine (a byproduct of muscle breakdown) than White people did. The scientists assumed that the higher levels were due to higher muscle mass in Blacks. They therefore adjusted the scoring, which essentially meant that Black people must have a lower eGFR score than Whites to be diagnosed with end-stage kidney disease. As a consequence, Blacks had to wait until their kidney disease reached a more severe stage in order to qualify for treatment.

More recently, a medical and public health student at the University of Washington School of Medicine in Seattle observed that eGFR scores were not accurate for diagnosing the severity of kidney disease in Black patients. She fought to have race removed from the algorithm, and won. In 2020, UW Medicine agreed that race was an ineffective variable that did not meet the standard of scientific rigor for medical diagnostic tools.

Important

In 2021, a joint task force of the National Kidney Foundation and the American Society of Nephrology recommended adopting the 2021 CKD-EPI creatinine equation, which estimates kidney function without using race as a variable.
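
For reference, the 2021 CKD-EPI creatinine equation is commonly written with the coefficients shown below. This is a sketch for illustration only; clinical use should rely on a validated calculator rather than this code.

```python
# Sketch of the 2021 CKD-EPI creatinine equation (race-free), as commonly published.
# kappa = 0.7 (female) / 0.9 (male); alpha = -0.241 (female) / -0.302 (male).
# For illustration only, not for clinical decision-making.

def egfr_2021_ckd_epi(serum_creatinine_mg_dl: float, age_years: float, is_female: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine, age, and sex."""
    kappa = 0.7 if is_female else 0.9
    alpha = -0.241 if is_female else -0.302
    ratio = serum_creatinine_mg_dl / kappa
    egfr = (142
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.200
            * 0.9938 ** age_years)
    if is_female:
        egfr *= 1.012
    return egfr

# Example: a 55-year-old woman with a serum creatinine of 1.1 mg/dL
print(round(egfr_2021_ckd_epi(1.1, 55, is_female=True), 1))
```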

Body Mass Index and Racial Bias

Even the simplest medical decision tool that does not include race can reflect social bias. The body mass index (BMI), for example, is calculated by dividing a person's weight by the square of their height. It is used to identify underweight, overweight, and obese patients.
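
In concrete terms, the calculation and the standard adult cutoffs look like this (a simple sketch using the widely published CDC/WHO thresholds):

```python
# BMI = weight (kg) / height (m)^2; standard adult category cutoffs shown.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal weight"
    if value < 30.0:
        return "overweight"
    return "obese"

example = bmi(weight_kg=82, height_m=1.70)            # roughly 28.4
print(f"BMI {example:.1f}: {bmi_category(example)}")  # overweight
```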

In 1985, the National Institutes of Health tied the definition of obesity to an individual's BMI, and in 1998 an expert panel put in place guidelines based on BMI that moved 29 million Americans who had previously been classified as normal weight or just overweight into the overweight and obese categories.

Today, by BMI standards, the majority of Blacks, Hispanics, and White people are overweight or obese. But a 2021 report from the Centers for Disease Control and Prevention (CDC) found that the percentage of Americans who could be classified as obese varies by race or ethnic group.

According to the CDC, the breakdown among adults overall was:

  • Non-Hispanic Black: 49.9%
  • Hispanic: 45.6%
  • Non-Hispanic White: 41.4%
  • Non-Hispanic Asian: 16.1%

When female adults are broken out, the differences appear even more significant:

  • Non-Hispanic Black: 57.9%
  • Hispanic: 45.7%
  • Non-Hispanic White: 39.6%
  • Non-Hispanic Asian: 14.5%

Branding such large percentages of populations as overweight or obese has created an atmosphere of weight-shaming and distrust between patients and doctors. Higher-weight people complain that doctors don't address the health problems or concerns that brought them in for a checkup. Instead, doctors blame the patient's weight for their health issues and push weight loss as the solution. This contributes to many Black and Hispanic patients avoiding healthcare practitioners and thus perhaps missing opportunities to prevent problems or catch them early.

Furthermore, it is becoming increasingly clear that being overweight or obese is not always a health problem. Rates for some serious conditions, such as heart disease, stroke, type 2 diabetes, and certain types of cancer, are higher among those who are obese. But in certain situations, such as recovery after heart surgery, being overweight or moderately obese (but not morbidly obese) is associated with better survival rates.

New obesity guidelines for Canadian clinicians, published in August 2020, emphasize that doctors should stop relying on BMI alone in diagnosing patients. People should be diagnosed as obese only if their body weight affects their physical health or mental well-being, according to the new guidelines. Treatment should be holistic and not solely target weight loss. The guidelines also note that, "People living with obesity face substantial bias and stigma, which contribute to increased morbidity and mortality independent of weight or body mass index."

Consideration of an individual's BMI may be replaced by other measures, such as waist circumference. And obesity itself may be redefined. In January 2025 a group of 58 researchers proposed a new definition that would shift the focus from BMI to excess body fat and its effect on health. The group proposed two categories of obesity: preclinical, when an individual has excess fat but their organs are functioning normally, and clinical, when too much fat is harming tissue and organs.

Reducing Bias in Decision Tools

Medical algorithms are not the only type of algorithm that can be biased. As a 2020 article in The New England Journal of Medicine noted, "This problem is not unique to medicine. The criminal justice system, for instance, uses recidivism-prediction tools to guide decisions about bond amounts and prison sentences." The authors said that one widely used tool, "while not using race per se, uses many factors that correlate with race and returns higher risk scores for Black defendants."

The increasing use of artificial intelligence (AI), and machine learning in particular, has also raised questions about bias based on race, socioeconomic status, and other factors. In healthcare, machine learning often relies on electronic health records. Poor and minority patients may receive fragmented care and be seen at multiple institutions. They are more likely to be seen in teaching clinics, where data input or clinical reasoning may be less accurate. And they may not be able to access online patient portals to document outcomes. As a result, the records of these patients may have missing or erroneous data. The algorithms that drive machine learning may thus end up excluding poor and minority patients from the data sets and from needed care.
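
As a simplified, hypothetical illustration of that last point, a pipeline that silently drops records with missing fields will underrepresent patients whose care is spread across institutions:

```python
# Hypothetical illustration: "complete-case" filtering of electronic health records.
# Patients seen at multiple institutions are more likely to have missing fields,
# so dropping incomplete rows underrepresents them in the resulting data set.

records = [
    {"patient": "A", "fragmented_care": False, "a1c": 7.2,  "bp": 138},
    {"patient": "B", "fragmented_care": True,  "a1c": None, "bp": 145},  # lab done elsewhere
    {"patient": "C", "fragmented_care": True,  "a1c": 8.1,  "bp": None}, # visit not in this system
    {"patient": "D", "fragmented_care": False, "a1c": 6.9,  "bp": 129},
]

complete = [r for r in records if None not in (r["a1c"], r["bp"])]

kept_fragmented = sum(r["fragmented_care"] for r in complete)
total_fragmented = sum(r["fragmented_care"] for r in records)
print(f"Kept {len(complete)} of {len(records)} records; "
      f"{kept_fragmented} of {total_fragmented} fragmented-care patients survive the filter.")
# Output: Kept 2 of 4 records; 0 of 2 fragmented-care patients survive the filter.
```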

The good news is that awareness of biases in healthcare algorithms has grown in the past few years. Data input and outcomes are being checked for racial, ethnic, income, gender, and age bias. Medical specialty societies in the U.S. are recognizing the harms caused by race-based medicine and moving to end the consideration of race in clinical algorithms. When disparities are recognized, the algorithms and data sets can be revised toward better objectivity.

What Is an Algorithm?

There is no standard legal or scientific definition for algorithm, but the National Institute for Standards and Technology refers to it as "A clearly specified mathematical process for computation; a set of rules that, if followed, will give a prescribed result."

What Is an Example of an Algorithm?

In the broadest sense, an algorithm is simply a step-by-step process for answering a question or achieving a desired result. So, for example, a cake recipe is a form of algorithm. In the world of finance, an automated trading system would be an example.

What Is Machine Learning?

IBM, a pioneer in the field, defines machine learning as "a branch of artificial intelligence (AI) and computer science which focuses on the use of data and algorithms to imitate the way that humans learn, gradually improving its accuracy."

The Bottom Line

Despite their appearance of dispassionate objectivity, the algorithms that medical professionals use to make certain decisions can be prone to bias based on race, class, and other factors. For that reason, algorithms can't simply be taken on faith but must be subject to rigorous analysis. As a 2021 article in the MIT Technology Review noted, "The term 'algorithm,' however defined, shouldn't be a shield to absolve the humans who designed and deployed any system of responsibility for the consequences of its use."

Article Sources
Investopedia requires writers to use primary sources to support their work. These include white papers, government data, original reporting, and interviews with industry experts. We also reference original research from other reputable publishers where appropriate. You can learn more about the standards we follow in producing accurate, unbiased content in our editorial policy.
  1. Science. "."

  2. HealthAffairs.org. "."

  3. Scientific American. "."

  4. National Kidney Foundation. "."

  5. Medium. "."

  6. Centers for Disease Control and Prevention, National Health Statistics Reports. "," Page 14.

  7. National Institute of Diabetes and Digestive and Kidney Diseases. "."

  8. Journal of the American Heart Association. "."

  9. CMAJ Group, Canadian Medical Association Journal. "."

  10. Powe, Neil R., et al. "." American Journal of Kidney Disease, vol. 79, no. 2, February 2022, pp. 268-288.

  11. New England Journal of Medicine. "."

  12. Gianfrancesco, Milena A., et al. "." JAMA Internal Medicine, vol. 178, no. 11, November 2018, pp. 1544–1547.

  13. Hernandez-Boussard, Tina, et al. "." Health Affairs Journal, vol. 42, no. 10, October 2023, pp. 1369–1373.

  14. National Institute for Standards and Technology. "."

  15. IBM. ""

  16. MIT Technology Review. "."
