Radiology residents score lower using structured reports

Radiology residents using structured reporting software to interpret cranial MRI exams produced reports that were judged to be less accurate and complete than those of residents using traditional radiology reporting, according to a study in Radiology.

Structured reporting technology has the potential to standardize radiology reports, making report content more uniform and easier to mine for research applications. Use of structured reporting might also improve the quality of reports prepared by some radiologists. However, radiologists have been slow to adopt the technology.

With the idea that radiology residents might benefit from using structured reporting, researchers from Indiana University School of Medicine in Indianapolis and Wake Forest University Baptist Medical Center in Winston-Salem, NC, compared the quality and completeness of conventionally prepared free-text reports with that of structured reports for cranial MRI exams of patients suspected of having a stroke. The results surprised the research team (Radiology, August 25, 2009).

Thirty-four Indiana University School of Medicine residents in their second, third, and fourth years of training voluntarily participated in the study. All attended a meeting at which they were briefed on the content of a high-quality report. Recommended content included reporting hemorrhage, ischemia, and herniation; reporting mass effect, size, edema, vascular distribution, and age; describing vascular stenosis, other diagnoses, enhancement, and reader confidence; and making recommendations for other tests. Image quality, which MRI sequences were performed, and cause of stroke were also reported.

The study

The residents were evenly and randomly assigned to one of two groups, each of which had approximately the same number of second, third, and fourth-year residents: a control group that performed conventional dictation, and a group that first performed conventional dictation and subsequently used a structured reporting system. The latter group received individual training on a commercial structured reporting system (eDictation, eDictation, Marlton, NJ) within two months of actual use, receiving the same amount and type of training as radiologists would receive. This group also had additional onsite administrative support and access to the vendor's telephone help desk.

The study consisted of two phases, spaced four months apart. In the first phase, all 34 residents reviewed and reported the findings of 25 cranial MRI cases with a clinical suspicion of possible stroke. All cases were acquired using consistent imaging protocols.

Case pathology included ischemia (13 cases), vasculitis (two cases), microvascular disease (two cases), and one case each of multiple sclerosis, marked atrophy, progressive multifocal leukoencephalopathy, herpes simplex encephalitis, and tumor. Three cases were normal.

For each case, principal investigator Dr. Annette Johnson, associate professor of radiology at Wake Forest University Baptist Medical Center, prepared a video showing how she read the exam, discussed findings, and made her interpretations. In addition to serving as a training tool, the purpose of the videos was to eliminate variations in interpretation and diagnosis by the residents.

In the first phase of the study, all participants reviewed the MRI images of a case, watched the video, and dictated their report of the findings. After the dictations were transcribed, the residents had the opportunity to edit the reports.

In the study's second phase, 16 residents reviewed the same 25 MRI cases, watched the faculty videos, and reported their findings using traditional dictation followed by optional editing. The 18 residents who had received training on the structured reporting system used that system instead to report their findings.

Residents first selected standardized menu topics, which, when clicked, offered multiple standardized phrases that could be chosen. The lexicon included all commonly used terms for detailed cranial anatomy and disease related to stroke. Every imaging feature on which reports were to be graded was available in drop-down menus organized by topic.

Once terms were selected, a complete standard formatted sentence was automatically created, enabling the residents to assemble a comprehensive report. Residents with technical questions about using the structured reporting system received advice from either an onsite administrator or the vendor's telephone help desk.
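The menu-to-sentence flow described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not the vendor's actual implementation: the topic names, phrase keys, and sentence wording below are invented for the example.

```python
# Hypothetical sketch of a menu-driven structured reporting flow:
# the user picks a topic, then one of its standardized phrases, and
# the system expands each choice into a complete formatted sentence.

MENUS = {
    "hemorrhage": {
        "none": "No intracranial hemorrhage is identified.",
        "acute": "There is an acute intraparenchymal hemorrhage.",
    },
    "ischemia": {
        "none": "No evidence of acute ischemia.",
        "acute_mca": "Acute ischemia in the left MCA territory.",
    },
}

def build_report(selections):
    """Assemble a report from an ordered list of (topic, choice) picks."""
    sentences = []
    for topic, choice in selections:
        try:
            sentences.append(MENUS[topic][choice])
        except KeyError:
            raise ValueError(f"unknown selection: {topic}/{choice}")
    return " ".join(sentences)

report = build_report([("hemorrhage", "none"), ("ischemia", "acute_mca")])
print(report)
```

The design point the study turns on is visible even in this toy version: the reader can only say what the menus anticipate, which standardizes wording but constrains content.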

The analysis

Johnson, a neuroradiologist, reviewed and graded all 1,700 reports based on accuracy and completeness according to criteria determined by a survey of 85 neuroscience clinical faculty. Reports were graded on a scale of 1-100, with 100 representing a perfect report. Fifteen reports were incomplete and were eliminated from the study.

Accuracy and completeness of free-text versus structured reporting

Criteria     | Phase 1, control: free-text | Phase 2, control: free-text | Phase 1, experimental: free-text | Phase 2, experimental: structured
Accuracy     | 91.4                        | 92.4                        | 91.5                             | 88.7
Completeness | 67.8                        | 71.7                        | 68.7                             | 54.3

Within each group, the accuracy and completeness of reports did not increase with the level of resident training. However, the residents who used structured reporting software showed a 2.8-point reduction in accuracy (p < 0.001) and a 14.4-point drop in completeness (p < 0.001) compared with their own reports prepared using traditional free-text dictation four months earlier.

The authors were also concerned by the responses to a survey completed by the group that used structured reporting. Although these residents had volunteered and were motivated to use the technology, only eight of the 18 responded. Seven of the eight made negative comments about the software, including that it was time-consuming and therefore inefficient, and that it did not let them include desired content in their reports. Even so, six of the eight thought the idea of structured reporting was good.

The authors cautioned that the quality of a structured report should not be assumed to be comparable to or better than that of a traditionally dictated report, even though every item required for inclusion was available in drop-down menus. They recommended that any structured reporting software be evaluated for intrinsic report quality before it is deployed in a radiology department.

In closing, the authors stated that the concept of standardized reporting shows promise and that, in principle, standardization of terms should decrease report variability and increase report quality. But further study is needed to definitively demonstrate the technology's benefits.

"Standardization through structuring of reports is a laudable goal at many levels, but our study results suggest that the effect of such systems on the intrinsic quality of reports cannot be presumed to be positive," the researchers concluded.

By Cynthia E. Keen
AuntMinnie.com staff writer
September 14, 2009

Related Reading

Do unstructured reports fail to convey accurate brain MRI info? January 14, 2009

Speech recognition and structured reporting bring advantages, drawbacks, May 12, 2008

eDictation lands NIH grant, October 7, 2002

eDictation aims to make dictation faster, cheaper, January 23, 2002

Copyright © 2009 AuntMinnie.com
