Consistent terminology is critical for MRI of spinal cord injuries

CHICAGO - An effective way to accurately describe and share patient information from MRI scans of spinal cord injuries is to use a common set of data elements that everyone can interpret the same way, according to findings presented on Monday at RSNA 2019.

The list of relevant elements comes from the National Institutes of Health's National Institute of Neurological Disorders and Stroke (NINDS) project, which has created a uniform method for capturing reliable quantitative and categorical data for spinal cord injury from both investigational work and clinical trials.

"The use of consistent terminology in this particular venue is extremely important," said Dr. Adam Flanders, co-director of neuroradiology/ear, nose, and throat radiology at Thomas Jefferson University in Philadelphia. "The agreement that we got overall for our spinal cord features can work, as they were rated good to very good."

NINDS' goals

NINDS is looking for clinicians and radiologists to use a set of common data elements to create structured radiology reports that describe a patient's condition, in this case, relative to a spinal cord injury. The process begins with clinicians choosing a "controlled response" from several options to describe what they see in an MR image as it relates to the spinal cord injury.
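To make the controlled-response idea concrete, here is a minimal sketch of how one such data element might be represented in software. The element name and answer options below are illustrative assumptions, not the actual NINDS definitions.

    # A sketch of a controlled-response data element: each element accepts
    # only a fixed set of answers, so every report is machine-comparable.
    # Names and options are illustrative, not the NINDS definitions.
    from dataclasses import dataclass
    from enum import Enum

    class HemorrhagePresence(Enum):
        ABSENT = "absent"
        PRESENT = "present"
        INDETERMINATE = "indeterminate"

    @dataclass
    class SpinalCordElement:
        name: str                     # e.g., "intramedullary hemorrhage"
        response: HemorrhagePresence  # one controlled response, no free text

    # The reader picks one allowed response rather than writing free text:
    item = SpinalCordElement("intramedullary hemorrhage", HemorrhagePresence.PRESENT)
    print(item.name, "->", item.response.value)

Because every report draws on the same fixed vocabulary, results from different readers and different centers can be compared directly.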

"The rationale is: Because there is no consistent method in use to describe the severity of the spinal cord injury on MRI, even though we know MRI has great value as a surrogate for prognosis, for the neurological level of injury, and so forth," Flanders told RSNA attendees. "There is value in having ... a reproducible measure that everyone uses. It raises the bar for what we can use MRI for in this setting."

In addition, the NINDS wants to increase efficiency and accuracy in the evaluation of clinical research studies, perform more effective multicenter research studies, facilitate data sharing among researchers, and help educate investigators.

"Yet, interestingly, when the NINDS published these elements, they did not actually validate whether they worked or not," Flanders added.

Does it work?

Aware of that deficiency, Flanders and his colleagues were soon thereafter awarded a grant from a private institution to perform an independent validation of each of these common data elements for MRI.

The purpose of the study was to determine interreader and intrareader reliability in using the common data elements to evaluate MRI results for spinal cord injury.

The researchers preselected MR images from 35 spinal cord injury cases from 12 imaging centers. The reports included standard anatomical features, which were pared down to information on the length and location of hemorrhaging and edema in the spinal cord.

The researchers also added the Brain and Spinal Injury Center (BASIC) score, which was developed at the University of California, San Francisco, as part of their assessment.

The cases were presented in randomized order and scored, then rescored in a second round; in all, 25 factors were compared. The extent of each patient's injury was rated on a scale of 0 to 1, with 0 as normal and 1 as severe.

"We created a training set, trained our readers in one session, and handed out the cases in groups of five [cases] to each of our reviewers," Flanders added. "They received the cases independently from across the country. [The cases] were randomized, and a month later were represented to them for rating again."

Rating the reads

How well did the readers agree on patient conditions based on the NINDS common data elements?

Collectively, agreement for all spinal cord features ranged from poor (0.22) to excellent (0.99), but a few factor ratings performed particularly well. The highest interrater agreement was in the categories of edema, hemorrhage length, and location relative to anatomic reference, which ranged from 0.69 to 0.99.

Good agreement levels (range, 0.73 to 0.83) also were observed for the level of spinal injury between the readers, with only minor differences in agreement between the two reading sessions.
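The qualitative labels above ("poor," "good," "excellent") are consistent with commonly cited reliability bands such as Cicchetti's guidelines; the exact cutoffs the study applied are an assumption in this sketch.

    # Map a reliability coefficient to a qualitative band, using
    # Cicchetti's commonly cited cutoffs (an assumption here).
    def agreement_band(coefficient: float) -> str:
        if coefficient < 0.40:
            return "poor"
        if coefficient < 0.60:
            return "fair"
        if coefficient < 0.75:
            return "good"
        return "excellent"

    for value in (0.22, 0.69, 0.83, 0.99):
        print(value, agreement_band(value))  # poor, good, excellent, excellent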

As for the readers themselves, the four neuroradiologists and the lone spine surgeon had no significant difference in performance.

"There was lower agreement for the basic score, which is not well understood right now," Flanders said, "but [the results] show that with the agreement we have, we can move forward, as we have validated this [common data elements] scheme for spinal cord injuries in the future."
