DNYUZ
Doctors Catch Cancer-Diagnosing AI Extracting Patients’ Race Data and Being Racist With It

December 20, 2025
in News
Just when you thought you'd heard it all, AI systems designed to spot cancer have startled researchers with a baked-in penchant for racism.

The alarming findings were published in the journal Cell Reports Medicine, showing that four leading AI-enhanced pathology diagnostic systems differ in accuracy depending on patients' age, gender, and race. Disturbingly, the models extract that demographic data directly from the pathology slides themselves, a feat that's impossible for human doctors.

To conduct the study, researchers at Harvard University combed through nearly 29,000 cancer pathology images from some 14,400 patients. Their analysis found that the deep learning models exhibited alarming biases 29.3 percent of the time, or on nearly a third of all the diagnostic tasks they were assigned.

“We found that because AI is so powerful, it can differentiate many obscure biological signals that cannot be detected by standard human evaluation,” Harvard researcher Kun-Hsing Yu, a senior author of the study, said in a press release. “Reading demographics from a pathology slide is thought of as a ‘mission impossible’ for a human pathologist, so the bias in pathology AI was a surprise to us.”

Yu said that these bias-based errors result from the AI models relying on patterns linked to demographic groups when analyzing cancer tissue. In other words, once the four AI tools locked onto a person's age, race, or gender, those factors would shape the rest of the tissue analysis, and the models would go on to replicate biases rooted in gaps in their training data.

To give a concrete example, the AI tools were able to identify samples taken from Black patients. These cancer slides, the authors wrote, contained higher counts of abnormal, neoplastic cells and lower counts of supportive elements than those from white patients, allowing the AI to sniff them out even though the samples were anonymized.

Then came the trouble. Once an AI pathology tool had identified a person's race, it became fixated on finding patterns that fit that particular identifier. But because the models were trained mostly on data from white patients, the tools struggled with groups that are less well represented. The AI models had trouble distinguishing subclasses of lung cancer cells in Black patients, for example, not because lung cancer data was scarce overall, but because lung cancer data from Black patients specifically was lacking.

That was unexpected, Yu said in the press release, “because we would expect pathology evaluation to be objective. When evaluating images, we don’t necessarily need to know a patient’s demographics to make a diagnosis.”

Back in June, medical researchers discovered a similar racial bias in large language model (LLM) psychiatric diagnostic tools. In that case, results showed AI tools often proposed “inferior treatment” plans for Black patients whenever their race was explicitly known.

In the case of the AI cancer-screening tools, the Harvard research team also developed a new training approach called FAIR-Path. When this framework was applied to the AI tools prior to analysis, the researchers found that it mitigated 88.5 percent of the disparities in performance.

That there's a solution out there is good news, though the remaining 11.5 percent is nothing to sneeze at. And until training frameworks like this are made mandatory across the pathology field's AI tools, questions about these systems' inherent biases will remain.

More on cancer: Amazon Data Center Linked to Cluster of Rare Cancers

The post Doctors Catch Cancer-Diagnosing AI Extracting Patients’ Race Data and Being Racist With It appeared first on Futurism.

DNYUZ © 2025
