Radiology Study Warns of "Deepfake" Medical Scans as AI-Generated X-rays Deceive Both Experts and Multimodal Chatbots

A study in Radiology finds that radiologists and AI chatbots struggle to distinguish GPT-4o-generated X-rays from real scans, raising alarms over the potential for medical fraud.

By: AXL Media

Published: Mar 26, 2026, 9:10 AM EDT

Source: Radiology, via Tarun Sai Lomte



SUMMARY

A multi-center international study published in Radiology has revealed that modern generative AI can produce anatomically plausible X-rays that are increasingly difficult to distinguish from real clinical scans. In testing, experienced radiologists and advanced multimodal models like GPT-4o and Gemini 2.5 Pro struggled to identify synthetic images, raising urgent concerns regarding the potential for medical fraud, insurance scams, and the erosion of trust in digital health records.

CONTENT

The Evolution from GANs to Diffusion-Based Medical Fakes
