DNYUZ

New England Journal of Medicine Retracts Paper Because Photo of Patient’s Insides Was Garbled by AI

May 1, 2026

Medical journals are being flooded with shoddy AI-generated work, a growing threat to the scientific community that could undermine the value and trustworthiness of potentially life-saving health research. Papers citing hallucinated journals and studies have quickly become a common fixture, raising major concerns among the editors and reviewers tasked with weeding through the deluge of new submissions.

In a high-profile new gaffe, the reputable New England Journal of Medicine (NEJM) was forced to retract a paper by two Beijing-based researchers about a man in China who developed “bronchial casts” in his lungs following a wildfire, after it was discovered that the authors had used an AI tool to manipulate a photograph in the piece.

The offending photo shows almost pitch-black, particle-filled bronchial tissues that were cryogenically removed from the patient’s lungs. As MedPage Today reported, an 87-year-old man had been brought to the emergency department at the Beijing Tsinghua Changgung Hospital after extensive fire smoke inhalation, requiring the removal of bronchial tissues that were entirely plugged with smoke particulate matter, an extremely dangerous obstruction of the airway. (MedPage later pointed out the retraction in an editor’s note.)

However, what appears to be a metric measuring tape above the tissues in the photo raises immediate red flags, with the numbers along the scale following a nonsensical sequence — a classic hallmark of the use of an unsophisticated AI image generator.

The authors said the slip-up was a careless accident.

In a retraction note, they wrote that “we were unaware of Journal policies on image manipulation and had altered our submission by using an artificial intelligence (AI) tool to move the ruler to the top of the image.”

“We therefore wish to retract our image and case report,” the note reads.

The blunder should give researchers pause. If simply moving a ruler results in this kind of AI-generated carnage, what other manipulations, whether intentional or unintentional, are falling through the cracks?

Some users on social media also questioned the validity of the rest of the offending image, pointing out that there were too many segments of the senior patient’s lungs in the photo, raising the possibility that the image had been manipulated by AI in other ways.

Reached for comment, the authors provided a more detailed explanation of the snafu, but declined to send the original image for comparison:

The patient was in critical condition and receiving emergency rescue treatment. The ruler was placed at an inclined angle during the urgent clinical photography. We only adjusted the position of the ruler to make the image more aesthetic and visually readable, with no tampering with any other clinical image content.

All original clinical materials have already been provided to the NEJM editorial office, so it is not appropriate to send them to you again separately. You may consult the editorial department for specific details. Our official statement regarding the retraction was drafted following the journal’s suggestions.

We only adjusted the position of the ruler and did not modify the messy scale numbers at all. Our original intention was to ensure all information is completely authentic and traceable. It would have been very easy to use AI to make the ruler scale look perfectly standardized, but we did not do that. Throughout the entire process, we have always acted in good faith, kept all clinical information genuine, and ensured every detail is fully traceable.

We indeed did not have a full understanding of the journal’s relevant policies, which was our mistake. We sincerely apologize for this oversight.

The authors then added additional context:

We merely adjusted the angle and placement of the ruler. The scales of the ruler itself are accurate, and the disorder of the numbers was caused only by our position adjustment. It would have been very easy to fix those disordered numbers with AI, but we did not do so, in order to keep all data authentic and traceable. The ruler itself was not AI-generated.

In an appended editor’s note, the NEJM issued a stark reminder: “Authors are required to disclose any use of AI tools and any changes made to images.”

The journal’s editorial policies state that any use of “large language models, chatbots, or image creators” must be disclosed “at submission.”

“Authors should carefully review and edit all materials produced through the use of AI, to prevent the submission of authoritative-sounding output that is incorrect, incomplete, or biased,” the policy warns.

Meanwhile, editors across the scientific world are bracing themselves for an onslaught of slop.

“Science’s increased vigilance against corruption of the literature has become one more component in science and scientific publishing’s relentless pursuit of the truth,” the journal Science wrote in a January editorial. “Publishing carefully edited papers subjected to the judgment of multiple humans — and the retraction and correction of papers when the humans involved make mistakes — has never been more important.”

More on AI slop and academics: Top Medical Journal Publishes Searing Article Warning Against Medical AI

The post New England Journal of Medicine Retracts Paper Because Photo of Patient’s Insides Was Garbled by AI appeared first on Futurism.

DNYUZ © 2026

No Result
View All Result

DNYUZ © 2026