A deep-learning algorithm automatically detected and segmented brain metastases on multisequence MRI, showing potential to help radiologists and radiation oncologists with these tedious and time-consuming tasks, according to research published online May 2 in the Journal of Magnetic Resonance Imaging.
A team of researchers from Stanford University led by Endre Grøvik, PhD, trained a convolutional neural network (CNN) that yielded high detection and segmentation accuracy for brain metastases.
"Accurate segmentation of the metastases is imperative in radiation therapy planning to minimize damage to adjacent normal tissue," the authors wrote. "Our neural network essentially combines visualization, quantification, and segmentation into one fluid step, producing results that can be directly applied to radiotherapy planning with minimal user interaction."
To train their CNN, the Stanford researchers retrospectively gathered imaging data from 156 consecutive patients scanned between June 2016 and June 2018 who had known or possible metastatic disease, had undergone no prior surgery or radiation therapy, and had received all required MRI sequences, including pre- and postgadolinium T1-weighted 3D fast spin echo, postgadolinium T1-weighted 3D axial inversion recovery-prepped fast spoiled gradient echo, and 3D fluid-attenuated inversion recovery (FLAIR).
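In practice, a multisequence setup like this means feeding several co-registered volumes to the network as separate input channels. The following is a minimal sketch of that idea only; the file names, normalization, and channel ordering are illustrative assumptions, not details taken from the study:

```python
# Hypothetical sketch: stack four co-registered MRI sequences into a
# multichannel volume for CNN input. File names, resampling, and
# normalization choices are assumptions, not taken from the study.
import nibabel as nib
import numpy as np

SEQUENCE_FILES = [
    "t1_pre.nii.gz",           # pregadolinium T1-weighted 3D fast spin echo
    "t1_post.nii.gz",          # postgadolinium T1-weighted 3D fast spin echo
    "t1_irfspgr_post.nii.gz",  # postgadolinium T1-weighted IR-prepped FSPGR
    "flair.nii.gz",            # 3D FLAIR
]

def load_multisequence_volume(paths):
    """Load co-registered sequences and stack them along a channel axis."""
    channels = []
    for path in paths:
        volume = nib.load(path).get_fdata().astype(np.float32)
        # Simple per-volume z-score normalization (an assumption).
        volume = (volume - volume.mean()) / (volume.std() + 1e-8)
        channels.append(volume)
    # Resulting shape: (channels, x, y, z)
    return np.stack(channels, axis=0)

# Example: x = load_multisequence_volume(SEQUENCE_FILES)
```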
The patients were imaged on 1.5-tesla (Signa Explorer and TwinSpeed, GE Healthcare) or 3-tesla (Discovery 750, Discovery 750w, and Signa Architect, GE; and Skyra, Siemens Healthineers) MRI scanners. Contrast enhancement was performed with a dose of 0.1 mmol/kg of body weight of gadobenate dimeglumine (MultiHance, Bracco Imaging).
The primary malignancies for the patients included lung cancer (99 cases), breast cancer (33 cases), melanoma (seven cases), genitourinary cancer (seven cases), gastrointestinal cancer (five cases), and miscellaneous cancer (five cases). Two neuroradiologists with eight and two years of experience, respectively, provided the ground truth segmentations for the study by manually delineating and cross-checking regions of interest around each enhancing metastatic lesion, according to the researchers.
Next, the researchers trained the algorithm on data from 100 patients and used an additional five patients for model development. They then tested the CNN on a group of 51 subjects, which included equal numbers of patients with 1 to 3, 4 to 10, and more than 10 lesions.
"By testing on a large number of patients, thus facilitating subgroup analysis, this work demonstrates the network's clinical performance and potential, in addition to better understanding of its generalizability," the authors wrote.
Mean performance of the deep-learning model for detecting metastases on a voxel-by-voxel basis

| | Deep-learning model |
| --- | --- |
| Area under the curve (AUC) for all patients | 0.98 ± 0.04 |
| AUC for patients with 1-3 metastases | 0.99 ± 0.01 |
| AUC for patients with 4-10 metastases | 0.97 ± 0.05 |
| AUC for patients with more than 10 metastases | 0.97 ± 0.03 |
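A voxel-by-voxel AUC like the figures above treats every voxel as one sample scored against the ground-truth mask. The sketch below illustrates that style of evaluation with scikit-learn on flattened probability maps; the synthetic arrays and thresholds are illustrative, not the study's code:

```python
# Illustrative sketch: voxel-wise ROC AUC between a predicted probability
# map and a binary ground-truth segmentation mask. Synthetic data only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Binary ground-truth mask and predicted probabilities for one patient
# (shapes are arbitrary here; real volumes would be full 3D MRI grids).
ground_truth = rng.random((64, 64, 64)) < 0.01  # sparse "lesion" voxels
probability_map = np.clip(
    ground_truth * 0.8 + rng.random((64, 64, 64)) * 0.3, 0, 1
)

# Flatten both volumes so every voxel is one sample in the ROC analysis.
auc = roc_auc_score(ground_truth.ravel().astype(int), probability_map.ravel())
print(f"Voxel-wise AUC: {auc:.3f}")
```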
In other findings, the researchers reported that the network was more likely to detect larger metastases, achieving 100% sensitivity for lesions larger than 20 mm but only 50% sensitivity for lesions smaller than 7 mm. In addition, the algorithm's segmentation performance was slightly better for patients with 4 to 10 metastases than for the other subgroups.
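A size-stratified, lesion-level sensitivity analysis of this kind can be sketched by labeling connected components in the ground truth, estimating each lesion's size, and checking whether the prediction overlaps it. The overlap rule, diameter estimate, and size cutoffs below are assumptions for illustration, not the study's definitions:

```python
# Illustrative sketch: per-lesion detection sensitivity stratified by size.
# The overlap criterion and equivalent-diameter estimate are assumptions.
import numpy as np
from scipy import ndimage

def lesion_sensitivity_by_size(gt_mask, pred_mask, voxel_mm=1.0):
    """Label ground-truth lesions, estimate their equivalent diameters,
    and count a lesion as detected if any predicted voxel overlaps it."""
    labels, num_lesions = ndimage.label(gt_mask)
    results = []
    for lesion_id in range(1, num_lesions + 1):
        lesion = labels == lesion_id
        # Equivalent spherical diameter from the lesion's voxel volume.
        volume_mm3 = lesion.sum() * voxel_mm ** 3
        diameter_mm = 2 * (3 * volume_mm3 / (4 * np.pi)) ** (1 / 3)
        detected = bool(np.any(pred_mask & lesion))
        results.append((diameter_mm, detected))
    return results

# Example usage with boolean masks gt and pred:
# results = lesion_sensitivity_by_size(gt, pred, voxel_mm=1.0)
# small = [hit for d, hit in results if d < 7]
# sensitivity_small = sum(small) / max(len(small), 1)
```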