Radiologists Save Time with AI, Without Compromising Accuracy

Generative artificial intelligence (AI) can transform diagnostic imaging by converting visual data into text, enabling radiologists to document findings more efficiently. Amid growing demand and a shortage of radiologists, AI technologies offer promising solutions. While current models have shown success with chest radiographs, broader applications and clinical evaluations remain largely unexplored. A recent study published in JAMA Network Open addresses this evaluation gap.

The study evaluated a generative AI model integrated into a clinical workflow for drafting radiology reports from plain radiographs and clinical data. The model is designed to reduce documentation time and enhance diagnostic accuracy. It supports radiologists by generating preliminary findings, including chronicity and severity, and by prioritizing urgent cases such as pneumothorax. It pairs a ViT-B/16 image encoder with an OPT-125M text decoder.

This prospective cohort study evaluated radiographs collected across a 12-hospital tertiary care academic health system between November 15, 2023, and April 24, 2024. The model was trained on electronic health record (EHR) data and generates full-text report drafts from patient and image information. These drafts were embedded into standard radiology workflows through Epic and PowerScribe. Radiologists could edit the drafts directly, with all outputs logged for analysis. The study was approved by the Western-Copernicus Group institutional review board (IRB).

The study evaluated the association between use of the AI model and radiologist documentation efficiency. Radiographs documented with model assistance were compared with a baseline set documented without it, matched by study type (chest vs. nonchest). Peer review assessed clinical accuracy and text quality on a Likert scale. A shadow deployment also evaluated the model's ability to flag unexpected, clinically significant pneumothoraxes. Statistical analysis used a linear mixed-effects model and a cumulative-link mixed model.

A total of 299,164 radiographs were obtained from the health system. Of these, 23,960 radiographs from 14,460 patients (mean age = 59.6 ± 17.5 years; female = 48.8%, male = 51.2%) were used for the analysis of documentation efficiency. The peer review included 800 studies (mean age = 57.5 ± 19.6 years; female = 57.1%, male = 42.9%), and the pneumothorax-flagging analysis included 97,651 studies (mean age = 60.5 ± 18.1 years; female = 55.4%, male = 44.6%). Among 11,980 studies analyzed using the AI model, 2,189 (18.3%) were nonchest radiographs and 9,791 (81.7%) were chest radiographs.

There was a significant association between documentation time and AI-model use (χ2 = 5.36; P = 0.02). AI-assisted interpretations (mean [SE], 159.8 [27.0] seconds) were faster than those without model assistance (mean [SE], 189.2 [36.2] seconds; z = 2.29; P = 0.02), a 15.5% gain in documentation efficiency. A significant association was also observed between documentation time and procedure type (χ2 = 20.98; P < 0.001), with nonchest studies documented significantly faster than chest studies (mean [SE] difference, 33.3 [14.1] seconds; z = 4.63; P < 0.001).
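As a quick sanity check, the reported 15.5% efficiency gain follows directly from the two mean documentation times. A minimal sketch (variable names are illustrative, not from the study):

```python
# Mean documentation times reported in the study (seconds)
baseline_mean = 189.2      # interpretations without AI assistance
ai_assisted_mean = 159.8   # interpretations with AI-drafted reports

# Relative reduction in documentation time, i.e., the efficiency gain
efficiency_gain = (baseline_mean - ai_assisted_mean) / baseline_mean
print(f"{efficiency_gain:.1%}")  # -> 15.5%
```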

In the 800 peer-reviewed studies, no statistically significant difference was observed between AI-model and non-AI-model interpretations in text quality (χ2 = 3.62; P = 0.06) or clinical accuracy (χ2 = 0.68; P = 0.41). In the shadow deployment of 97,651 studies, the model flagged clinically significant pneumothoraxes with a specificity of 99.9% and a sensitivity of 72.7%.
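For readers less familiar with these flagging metrics, sensitivity and specificity are simple ratios over confusion-matrix counts. The counts below are hypothetical, chosen only so the ratios match the reported 72.7% and 99.9%; they are not the study's actual case numbers.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true pneumothorax cases the model flags (true-positive rate)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of studies without pneumothorax correctly left unflagged (true-negative rate)."""
    return tn / (tn + fp)

# Hypothetical counts for illustration only (not from the study)
print(f"sensitivity = {sensitivity(tp=8, fn=3):.1%}")    # -> 72.7%
print(f"specificity = {specificity(tn=999, fp=1):.1%}")  # -> 99.9%
```

A high specificity with moderate sensitivity, as reported here, means flags are rarely false alarms, at the cost of missing some true cases.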

This study’s limitations include non-randomized and single-institution settings, which may limit generalizability; using radiologists as controls; and the lack of double readings.

In conclusion, this study found that using a generative AI model to draft radiology reports improved documentation efficiency without compromising clinical quality. It also showed potential for flagging clinically significant pneumothoraxes that require immediate attention. These findings support AI-assisted reporting as a promising tool for enhancing workflow efficiency and facilitating collaboration between clinicians and AI in clinical practice.

Reference: Huang J, Wittbrodt MT, Teague CN, et al. Efficiency and Quality of Generative AI–Assisted Radiograph Reporting. JAMA Netw Open. 2025;8(6):e2513921. doi:10.1001/jamanetworkopen.2025.13921
