
Doctors Catch Cancer-Diagnosing AI Extracting Patients’ Race Data and Being Racist With It

December 20, 2025
in News

Just when you thought you’d heard it all, AI systems designed to spot cancer have startled researchers with a baked-in penchant for racism.

The alarming findings were published in the journal Cell Reports Medicine, showing that four leading AI-enhanced pathology diagnostic systems differ in accuracy depending on patients’ age, gender, and race — demographic data, disturbingly, that the AI is extracting directly from pathology slides, a feat that’s impossible for human doctors.

To conduct the study, researchers at Harvard University combed through nearly 29,000 cancer pathology images from some 14,400 cancer patients. Their analysis found that the deep learning models exhibited alarming biases 29.3 percent of the time — on nearly a third of all the diagnostic tasks they were assigned, in other words.
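
To make that kind of audit concrete, here is a minimal sketch of how a per-group accuracy check might look in Python. It is purely illustrative: the study’s actual evaluation pipeline is not reproduced in this article, and the data, the group labels, and the subgroup_accuracy_gap helper below are all hypothetical.

import numpy as np

def subgroup_accuracy_gap(y_true, y_pred, groups):
    # Compute accuracy separately for each demographic group and report the
    # largest gap between any two groups (a simple disparity signal).
    y_true, y_pred, groups = map(np.asarray, (y_true, y_pred, groups))
    per_group = {}
    for g in np.unique(groups):
        mask = groups == g
        per_group[str(g)] = float((y_true[mask] == y_pred[mask]).mean())
    gap = max(per_group.values()) - min(per_group.values())
    return per_group, gap

# Hypothetical toy data: 1 = malignant, 0 = benign, two groups "A" and "B".
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 0, 0, 1]
per_group, gap = subgroup_accuracy_gap(y_true, y_pred, groups=["A"] * 4 + ["B"] * 4)
print(per_group, gap)  # {'A': 1.0, 'B': 0.5} 0.5

A real audit would repeat a check like this across many diagnostic tasks and demographic attributes, which is roughly how a figure like “biased 29.3 percent of the time” can be tallied.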

“We found that because AI is so powerful, it can differentiate many obscure biological signals that cannot be detected by standard human evaluation,” Harvard researcher Kun-Hsing Yu, a senior author of the study, said in a press release. “Reading demographics from a pathology slide is thought of as a ‘mission impossible’ for a human pathologist, so the bias in pathology AI was a surprise to us.”

Yu said these bias-based errors are the result of AI models relying on patterns linked to various demographics when analyzing cancer tissue. In other words, once the four AI tools locked onto a person’s age, race, or gender, those factors would form the backbone of the tissue analysis. In effect, the models went on to replicate biases stemming from gaps in their training data.

The AI tools were able to identify samples taken specifically from Black patients, to give a concrete example. These cancer slides, the authors wrote, contained higher counts of abnormal, neoplastic cells and lower counts of supportive elements than those from white patients, allowing the AI to sniff them out even though the samples were anonymized.

Then came the trouble. Once an AI pathology tool had identified a person’s race, it leaned heavily on patterns associated with that particular group. And when the model was trained mostly on data from white patients, it struggled with people who weren’t as well represented. The AI models had trouble distinguishing subclasses of lung cancer cells in Black patients, for example, not because lung cancer data was scarce overall, but because lung cancer samples from Black patients were underrepresented in the training data.

That was unexpected, Yu said in the press release, “because we would expect pathology evaluation to be objective. When evaluating images, we don’t necessarily need to know a patient’s demographics to make a diagnosis.”

Back in June, medical researchers discovered a similar racial bias in large language model (LLM) psychiatric diagnostic tools. In that case, results showed AI tools often proposed “inferior treatment” plans for Black patients whenever their race was explicitly known.

In the case of AI cancer-screening tools, the Harvard research team also developed a new AI-training approach called FAIR-Path. When this training framework was applied to the AI tools before analysis, the researchers found that it eliminated 88.5 percent of the performance disparities.
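
The article doesn’t describe how FAIR-Path actually works, so the sketch below shows only a generic version of one common debiasing idea, weighting samples from underrepresented demographic groups more heavily during training. It is not the published method, and every function and value in it is hypothetical.

import torch
import torch.nn.functional as F

def group_weighted_loss(logits, labels, groups):
    # Cross-entropy in which each sample is weighted by the inverse frequency
    # of its demographic group, so underrepresented groups count proportionally more.
    uniq, counts = torch.unique(groups, return_counts=True)
    inv_freq = groups.numel() / counts.float()          # one weight per group
    weight_lookup = {int(g): float(w) for g, w in zip(uniq, inv_freq)}
    sample_w = torch.tensor([weight_lookup[int(g)] for g in groups])
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    return (sample_w * per_sample).mean()

# Hypothetical toy batch: 6 samples, 3 cancer subclasses, group 0 is the majority.
logits = torch.randn(6, 3)
labels = torch.randint(0, 3, (6,))
groups = torch.tensor([0, 0, 0, 0, 1, 1])
print(group_weighted_loss(logits, labels, groups))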

That there’s a solution out there is good news, though the remaining 11.5 percent is nothing to sneeze at either. And until training frameworks like this are made mandatory across all AI tools in the pathology field, questions about these systems’ inherent biases will remain.

More on cancer: Amazon Data Center Linked to Cluster of Rare Cancers

The post Doctors Catch Cancer-Diagnosing AI Extracting Patients’ Race Data and Being Racist With It appeared first on Futurism.
