Artificial intelligence (AI) software can offer much value in radiology. But those benefits don't have to come at the cost of added burden for radiologists, if new research published online November 17 in Academic Radiology is any indication.
Researchers from Herlev and Gentofte Hospital in Herlev, Denmark, led by first author Felix Müller and senior author Michael Brun Andersen, prospectively assessed a commercial AI software application for analyzing noncontrast-enhanced low-dose chest CT scans. In a preliminary study involving a radiology resident and a thoracic radiology consultant, the researchers found that the software did not lead to a significant increase in reading time.
What's more, the software identified additional actionable findings and gave the resident a better overview of cases.
"Overall, results from this feasibility study put forward the theory that PACS based integration of AI tools for the assisted reading of chest CT images are feasible without high reading time impact, while providing diagnostic benefits to the radiologist," the authors wrote.
After a commercially available software application (AI Chest Companion, Siemens Healthineers) was installed at their institution between April and May 2020, the researchers sought to determine if radiologist reading times with concurrent use of multiple AI tools for noncontrast-enhanced low-dose chest CT exams were noninferior to reading times without AI. They also wanted to gauge the effect on reader diagnostic confidence and to see whether the software yielded additional findings.
The software performs eight image analysis and quantification tasks; the results are then used to generate a single "overview" image for the radiologist, with organs color-coded according to the findings.
An external observer recorded reading times for 25 cases read without AI and then for 20 different studies read with AI. The resident and the radiology consultant were also surveyed after each exam.
Impact of AI on radiology reading times

|                      | Without AI  | With AI     |
|----------------------|-------------|-------------|
| Radiology resident   | 370 seconds | 437 seconds |
| Radiology consultant | 360 seconds | 380 seconds |
The increase in time was not statistically significant for either the radiology resident (p = 0.16) or the radiology consultant (p = 0.70).
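The article doesn't say which statistical test the authors applied. As a rough illustration only, a comparison like this could be run as a two-sample Welch's t-test; the reading times below are made-up placeholder data loosely centered on the resident's reported values, not the study's actual measurements.

```python
# Hypothetical sketch: testing whether reading times differ with vs. without AI.
# The per-case data and the authors' actual test choice are assumptions here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder reading times in seconds (25 cases without AI, 20 with AI),
# loosely centered on the resident's reported averages of 370 and 437.
without_ai = rng.normal(loc=370, scale=120, size=25)
with_ai = rng.normal(loc=437, scale=120, size=20)

# Welch's t-test (unequal variances), a reasonable choice when the two
# groups consist of different cases, as in this study design.
t_stat, p_value = stats.ttest_ind(with_ai, without_ai, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above 0.05 (as reported: 0.16 and 0.70) would mean the observed
# increase is not statistically significant at that level.
```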
As for reader perceptions, the radiology resident felt that AI did not affect reading time in only 10 of the 20 cases, compared with 19 of 20 cases for the radiology consultant.
"The perceived increase in reading time was attributed to problems with false-positive findings in two cases (resident), too many series in one case (consultant), PACS problems not attributable to the AI tool in one case (resident), and other reasons in two cases (resident)," the authors wrote.
In 18 of the 20 exams, the resident deemed that the AI tool provided a better overview of the study. The software also contributed actionable findings in five (12.5%) of the 40 studies, all read by the radiology consultant. These new findings included two cases of aortic ectasia, two vertebral compression fractures, and one case of coronary calcification.
In other results, the AI software increased diagnostic confidence in 30% of the cases for the resident and in 10% of the cases for the radiology consultant.
"While results are not necessarily generalizable to other AI tools, they do emphasize that a PACS based integration of AI tools is feasible potentially without negative impact on reading time, while still maintaining positive results such as additional findings, and increased diagnostic confidence," the authors wrote.
The researchers recommend that these outcomes now be verified in a full-scale study.