DNYUZ
New England Journal of Medicine Retracts Paper Because Photo of Patient’s Insides Was Garbled by AI

May 1, 2026

Medical journals are being flooded with shoddy AI-generated work, a growing threat that could undermine the value and trustworthiness of potentially life-saving health research. Papers citing hallucinated journals and studies have quickly become a common fixture, raising major concerns among the editors and reviewers tasked with weeding through the deluge of new submissions.

In a high-profile new gaffe, the reputable New England Journal of Medicine (NEJM) was forced to retract a paper by two Beijing-based researchers about a man in China who developed “bronchial casts” in his lungs following a wildfire, after it was discovered that the authors had used an AI tool to manipulate a photograph in the piece.

The offending photo shows almost pitch-black, particle-filled bronchial tissues that were cryogenically removed from the patient’s lungs. As MedPage Today reported, an 87-year-old man had been brought to the emergency department at the Beijing Tsinghua Changgung Hospital after extensive fire smoke inhalation, requiring the removal of bronchial tissues that were entirely plugged with smoke particulate matter, an extremely dangerous obstruction of the airway. (MedPage later pointed out the retraction in an editor’s note.)

However, what appears to be a metric measuring tape above the tissues in the photo raises immediate red flags, with the numbers along the scale following a nonsensical sequence — a classic hallmark of the use of an unsophisticated AI image generator.

The authors said the slip-up was a careless accident.

In a retraction note, they wrote that “we were unaware of Journal policies on image manipulation and had altered our submission by using an artificial intelligence (AI) tool to move the ruler to the top of the image.”

“We therefore wish to retract our image and case report,” the note reads.

The blunder should give researchers pause. If simply moving a ruler results in this kind of AI-generated carnage, what other manipulations, whether intentional or unintentional, are falling through the cracks?

Some users on social media also questioned the validity of the rest of the offending image, pointing out that it appeared to show too many segments of the elderly patient’s lungs, raising the possibility that the image had been manipulated by AI in other ways as well.

Reached for comment, the authors provided a more detailed explanation of the snafu, but declined to send the original image for comparison:

The patient was in critical condition and receiving emergency rescue treatment. The ruler was placed at an inclined angle during the urgent clinical photography. We only adjusted the position of the ruler to make the image more aesthetic and visually readable, with no tampering with any other clinical image content.

All original clinical materials have already been provided to the NEJM editorial office, so it is not appropriate to send them to you again separately. You may consult the editorial department for specific details. Our official statement regarding the retraction was drafted following the journal’s suggestions.

We only adjusted the position of the ruler and did not modify the messy scale numbers at all. Our original intention was to ensure all information is completely authentic and traceable. It would have been very easy to use AI to make the ruler scale look perfectly standardized, but we did not do that. Throughout the entire process, we have always acted in good faith, kept all clinical information genuine, and ensured every detail is fully traceable.

We indeed did not have a full understanding of the journal’s relevant policies, which was our mistake. We sincerely apologize for this oversight.

The authors then added additional context:

We merely adjusted the angle and placement of the ruler. The scales of the ruler itself are accurate, and the disorder of the numbers was caused only by our position adjustment. It would have been very easy to fix those disordered numbers with AI, but we did not do so, in order to keep all data authentic and traceable. The ruler itself was not AI-generated.

In an appended editor’s note, the NEJM issued a stark reminder: “Authors are required to disclose any use of AI tools and any changes made to images.”

The journal’s editorial policies state that any use of “large language models, chatbots, or image creators” must be disclosed “at submission.”

“Authors should carefully review and edit all materials produced through the use of AI, to prevent the submission of authoritative-sounding output that is incorrect, incomplete, or biased,” the policy warns.

Meanwhile, editors across the scientific world are bracing themselves for an onslaught of slop.

“Science’s increased vigilance against corruption of the literature has become one more component in science and scientific publishing’s relentless pursuit of the truth,” the journal Science wrote in a January editorial. “Publishing carefully edited papers subjected to the judgment of multiple humans — and the retraction and correction of papers when the humans involved make mistakes — has never been more important.”

More on AI slop and academics: Top Medical Journal Publishes Searing Article Warning Against Medical AI

The post New England Journal of Medicine Retracts Paper Because Photo of Patient’s Insides Was Garbled by AI appeared first on Futurism.

DNYUZ © 2026
