Using computer-aided detection (CAD) software, 3D laboratory radiologic technologists can determine if patients with multiple sclerosis (MS) need a contrast agent for their brain MRI exam while they're still in the scanner, according to research posted online May 16 in the Journal of the American College of Radiology.
A team of researchers led by Dr. Jeffrey Rudie of the Hospital of the University of Pennsylvania in Philadelphia tested a new protocol for follow-up MS MRI exams. After noncontrast fluid-attenuated inversion-recovery (FLAIR) imaging is acquired, CAD software assesses the sequence for new disease activity. A technologist in the institution's 3D laboratory then reviews the CAD results and informs the scanning technologist within minutes whether a gadolinium-based contrast agent (GBCA) is needed for the exam. Over a two-month pilot period, the initiative avoided the use of GBCAs and additional imaging sequences in 87% of patients.
"Furthermore, this approach allowed for significant savings in terms of the costs to the healthcare system related to the contrast agent itself, time to administer contrast, and time for image generation and interpretation," the authors wrote.
Although MS patients routinely undergo serial contrast-enhanced MRIs for assessment of treatment response, those who don't have evidence of new disease activity on noncontrast imaging aren't likely to benefit from receiving contrast, according to the researchers. In an effort to address safety concerns over the use of GBCAs, they sought to utilize their internally developed CAD software and 3D laboratory technologists to limit the use of these agents in MS follow-up imaging to only those showing new or enlarging lesions on noncontrast FLAIR imaging.
Prior to the pilot project, each of the institution's 3D laboratory technologists was trained in how to use the CAD software. The software, which takes about 10 minutes to run, automatically compares two time points of 3D FLAIR MRIs for MS follow-up evaluations. The preliminary CAD-based assessments from the laboratory technologists were then compared with the final report by the interpreting neuroradiologist for 153 patients.
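The longitudinal comparison step can be illustrated with a simplified sketch. This is not the institution's internally developed software; it is a hypothetical example assuming the two FLAIR volumes are already co-registered and intensity-normalized, in which new lesion candidates are flagged by thresholding the voxelwise signal increase:

```python
import numpy as np

def flag_new_lesions(baseline, followup, diff_threshold=0.3, min_voxels=10):
    """Flag candidate new/enlarging lesions by comparing two co-registered,
    intensity-normalized 3D FLAIR volumes (a hypothetical simplification of
    the CAD comparison step described in the article)."""
    diff = followup - baseline            # signal increase on follow-up
    candidate = diff > diff_threshold     # voxels markedly brighter than baseline
    n_candidate = int(candidate.sum())
    # Require a minimum number of suprathreshold voxels to suppress
    # noise and small registration artifacts.
    return n_candidate >= min_voxels, n_candidate

# Toy example: follow-up volume with a small synthetic "new lesion" added
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 0.05, size=(32, 32, 32))
followup = baseline.copy()
followup[10:13, 10:13, 10:13] += 1.0  # 27-voxel synthetic new lesion
needs_contrast, n = flag_new_lesions(baseline, followup)
print(needs_contrast, n)  # → True 27
```

A positive flag here would correspond to the 3D laboratory technologist telling the scanning technologist to proceed with contrast; real CAD pipelines add registration, brain extraction, and cluster-level analysis that this sketch omits.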
Performance for detecting new, enlarging lesions

| Metric | 3D laboratory technologists using CAD software |
|---|---|
| Sensitivity | 80% |
| Specificity | 97% |
| Accuracy | 95% |
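These metrics come from comparing the technologist-plus-CAD assessments against the neuroradiologist's final report. For reference, a short sketch of the standard definitions; the counts below are illustrative only, not the study's exact confusion matrix:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard definitions of the metrics reported in the study."""
    sensitivity = tp / (tp + fn)                  # true positives among actual positives
    specificity = tn / (tn + fp)                  # true negatives among actual negatives
    accuracy = (tp + tn) / (tp + fp + tn + fn)    # correct calls among all patients
    return sensitivity, specificity, accuracy

# Illustrative counts only (not the study's tallies): 28 true positives,
# 3 false positives, 115 true negatives, 7 false negatives
sens, spec, acc = diagnostic_metrics(tp=28, fp=3, tn=115, fn=7)
print(round(sens, 2), round(spec, 2), round(acc, 2))
```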
Of the seven patients with false-negative assessments, four had missed lesions in the brain and three had overlooked lesions in the spine, according to the team. After reviewing the patient charts, the group found that none of these patients needed to be brought back for additional imaging with contrast, and all received appropriate modifications in clinical management.
Half of the missed new lesions were due to technologist assessment errors that could be improved with further training, while the software itself was responsible for the other half of the errors, according to the researchers.
"Thus, it is also likely that more advanced software, such as deep learning-based methods, could further improve the performance of technologists in implementing this sort of targeted diagnostic strategy because it could reduce [false positives] and false negatives," they wrote.