Artificial intelligence (AI) software could be a valuable tool for radiologists reading chest CT exams, particularly in identifying often-underreported cardiovascular findings, according to research published online March 10 in Academic Radiology.
Researchers led by Dr. Basel Yacoub of the Medical University of South Carolina retrospectively applied a commercial AI software application to analyze chest CT exams from 100 patients and then compared the results with the original radiology reports. They found that AI had superior diagnostic performance for identifying aortic dilatation and coronary artery calcium (CAC).
"We expect that adding AI to radiology reporting will translate into improved interpretation and increased confidence with shorter reading time, ultimately reducing burnout among radiologists," the authors wrote.
The researchers applied the AI-Rad Companion (Siemens Healthineers) software to 100 consecutive patients who received noncontrast chest CT exams at their institution. They then compared the algorithm's performance for identifying five findings -- pulmonary lesions, emphysema, aortic dilatation, CAC, and vertebral compression fractures -- with the original radiology reports signed by attending cardiothoracic radiologists.
The AI findings and the radiology report were then compared with a consensus reading by two board-certified radiologists with cardiothoracic radiology experience of eight and 10 years, respectively.
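For readers curious how such figures are derived, the sketch below shows one way to compute per-finding sensitivity, specificity, and area under the curve (AUC) against a consensus reference standard. It is an illustrative Python example with hypothetical data, not the study's actual analysis code.

```python
# Illustrative sketch (not the study's analysis code): computing per-finding
# sensitivity, specificity, and AUC for binary detections (AI output or
# radiology report) against a consensus reference standard.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

def diagnostic_performance(reference, detections):
    """reference, detections: arrays of 0/1 labels, one per patient."""
    tn, fp, fn, tp = confusion_matrix(reference, detections, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # With binary detections, AUC reduces to (sensitivity + specificity) / 2.
    auc = roc_auc_score(reference, detections)
    return sensitivity, specificity, auc

# Hypothetical example: consensus says 6 of 10 patients have the finding.
consensus = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])
ai_calls  = np.array([1, 1, 1, 1, 1, 0, 1, 0, 0, 0])
print(diagnostic_performance(consensus, ai_calls))  # (0.833, 0.75, 0.792)
```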
Performance of AI vs. radiology reports on noncontrast chest CT exams

| Finding | AI sensitivity | Report sensitivity | AI specificity | Report specificity | AI AUC | Report AUC |
|---|---|---|---|---|---|---|
| Pulmonary lesions | 92.8% | 97.6% | 82.4% | 100% | 0.88 | 0.99 |
| Emphysema | 80.6% | 74.2% | 66.7% | 97.1% | 0.74 | 0.86 |
| Aortic dilatation | 96.3% | 25.9% | 81.4% | 100% | 0.89 | 0.63 |
| CAC | 89.8% | 75.4% | 100% | 94.9% | 0.95 | 0.85 |
| Vertebral compression fractures | 100% | 100% | 63.7% | 100% | 0.82 | 1.00 |
AI's better results for identifying aortic dilatation and CAC were statistically significant (p < 0.001 and p = 0.005, respectively). Conversely, the original radiology reports outperformed the AI in identifying pulmonary lesions and vertebral compression fractures (p = 0.024 and p < 0.001, respectively). Performance was statistically comparable for pulmonary emphysema (p = 0.064).
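The article does not state which paired statistical test produced these p-values; one common approach when two readers are evaluated on the same patients is McNemar's test on the discordant cases, sketched below with hypothetical counts.

```python
# Hypothetical sketch of a paired comparison (AI vs. original report) on the
# same 100 patients using McNemar's exact test. The article does not specify
# the authors' actual test; the counts below are illustrative only.
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 table of agreement with the consensus reference standard:
# rows = AI (correct, incorrect), columns = report (correct, incorrect)
table = [[68, 22],   # both correct | AI correct, report incorrect
         [4, 6]]     # AI incorrect, report correct | both incorrect
result = mcnemar(table, exact=True)
print(f"McNemar exact p-value: {result.pvalue:.4f}")
```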
The retrospective evaluation confirmed the significant added value of using AI to support diagnostic reading of chest CT exams, according to the researchers. These platforms could be beneficial for radiologists facing an increasing workload from the ever-growing number of CT exams.
"A meaningful integration of AI into the clinical environment would ensure that thoracic pathologies continue to be correctly detected and classified despite existing challenges, such as the combination of exponentially growing utilization of radiology services and the lack of growth in the number of trained radiologists," the authors wrote. "This imbalance has resulted in increasing workload and consequent higher rates of burnout."
Further improvements in diagnostic performance, achieved by training on larger datasets, may enable these algorithms to be implemented in the clinical radiology workflow in the near future, according to the researchers.
"This role will be focused on supporting radiologists in their practice by providing automated second-reader results made available when reporting the imaging scans," they wrote.