While various radiology report formats may not produce differences in efficiency or reading comprehension, efforts to standardize reporting formats may be complicated by disparate reader habits and preferences, according to research from the University of Arizona and the University of Maryland.
In a study comparing three report formats as read by attendings and residents in radiology and internal medicine, the researchers found that format type did not significantly affect reading time or comprehension. Significant differences were found, however, in which areas of the report readers focus on and in how they use previous radiology reports.
As a result, a one-size-fits-all report is unlikely to work, according to Elizabeth Krupinski, PhD, of the University of Arizona. She presented the findings during the Society for Imaging Informatics in Medicine (SIIM) annual meeting earlier this year.
Point of contact
The radiology report is the key point of contact between radiologists and clinicians in other specialties, and these physicians are demanding faster turnaround time and reporting standards to help ensure appropriate and accurate communication of findings, Krupinski said.
Seeking to determine if the format of the report affected reading efficiency and comprehension, the researchers reformatted three radiology reports into three different formats: conventional free text, structured text organized by organ system, and hierarchical structured text organized by clinical significance.
The exams included a CT abdomen/pelvis study with contrast and a key finding of cholelithiasis; a CT abdomen/pelvis study without contrast and a key finding of ureteral calculus; and a CT abdomen/pelvis study without contrast and a key finding of ureteropelvic junction (UPJ) stenosis.
The reports, which were produced by Dr. Bruce Reiner and checked by Dr. Eliot Siegel, had their patient identification removed. Each report format contained the same information and had a word count that was within about 10% of the other formats.
Reports were presented in Arial 12-point bold type and were fit to a single page. They were displayed with black text and a white background in an embedded PowerPoint presentation on a Dell flat-panel display, with 1,440 x 900 native resolution and a 1,000:1 contrast ratio.
Testing comprehension
Participants in the study included 10 radiologists (five attendings and five residents) and 10 internal medicine physicians (six attendings and four residents), who reviewed the reports and then answered 10 true-or-false questions on the report's content. They were also surveyed before the study to determine their preferences for what they focus on when they read radiology reports and what they do when previous reports are available.
Reports were anonymized for each subject, and reports for the same case were not read sequentially. In a single 20- to 30-minute session, participants were asked to read each report, grade it, and answer the questions. The reading phase was timed, and the percentage of correct true-or-false answers was analyzed.
The attending radiologists included four men and one woman. The men had an average age of 76.5, while the woman was 52. Resident radiologists included three men and two women. The three men had an average age of 32.7 and the two women had an average age of 29.5.
As for the internal medicine participants, there were three male attendings and three female attendings, with an average male age of 41.7 and average female age of 40.3. There were three male residents with an average age of 37.7 and one female resident who was 28.
Different areas of focus
Chi-square analysis showed significant differences (χ² = 27.15, p < 0.001) in what the participants primarily focused on when reading radiology reports.
[Table: Primary area of focus when reading radiology reports]
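The χ² values reported here come from a standard test of independence on a contingency table of survey responses. As a rough illustration only (the study's raw response counts were not published, so the numbers below are hypothetical), Pearson's statistic can be computed like this:

```python
# Sketch of Pearson's chi-square statistic for a contingency table of
# reader group vs. primary focus area. The counts used here are
# hypothetical placeholders, not the study's actual data.

def chi_square(table):
    """Return (chi-square statistic, degrees of freedom) for a 2D count table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            chi2 += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof

# Hypothetical counts: rows = reader groups, columns = focus areas
counts = [
    [4, 1, 0],
    [1, 3, 1],
    [0, 2, 4],
]
stat, dof = chi_square(counts)
print(f"chi2 = {stat:.2f} with {dof} degrees of freedom")
```

A large statistic relative to the degrees of freedom (as with the reported χ² = 27.15) indicates that focus area varies with reader group more than chance alone would explain.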
With regard to what participants did when previous reports were available, the researchers found significant differences (χ² = 275.18, p < 0.0001) by specialty and physician level in the distribution of responses.
[Table: Actions with previous reports]
No participants reported that they did not view any previous reports when they were available.
No effect on time
Analysis of variance (ANOVA) found no significant effect of report format type on reading time (F = 1.772, p = 0.1732). However, residents spent an average of 20.9 seconds less per report than attendings, and this effect was significant (F = 33.382, p < 0.0001), Krupinski said.
"So either the residents are faster readers or they are skimming," Krupinski said.
The researchers also found a significant difference by specialty (F = 12.23, p = 0.0006) and an interaction effect for specialty by level (F = 18.82, p < 0.0001). The radiology attendings took the longest time to read the reports, while the radiology residents took the least.
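The reading-time comparisons above rest on the one-way ANOVA F statistic, which is the ratio of between-group to within-group variance. As a minimal sketch (the study's individual timing data were not published, so the sample values below are invented), the statistic can be computed as:

```python
# One-way ANOVA F statistic: ratio of between-group variance to
# within-group variance. Group data below are hypothetical reading
# times in seconds, not the study's measurements.

def anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of sample groups."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around group means
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical reading times (seconds) for attendings vs. residents
attendings = [95, 102, 110, 98, 105]
residents = [78, 82, 75, 88, 80]
print(f"F = {anova_f([attendings, residents]):.2f}")
```

A large F, as with the F = 33.382 reported for attendings versus residents, means the difference between group means is large compared with the variability inside each group.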
As for comprehension, measured as the percentage of correct answers, there was no significant effect for report format (F = 1.905, p = 0.1521) or for attending versus resident (F = 2.698, p = 0.1023). There was a significant effect for radiology versus internal medicine (F = 12.34, p = 0.0006), with the radiologists (attendings and residents) scoring better than the internal medicine participants.
Krupinski noted that the attending radiologists also complained the most about the two nontraditional report formats.
Future directions for the research could involve different specialties and settings and recording eye position to determine what the reader is actually doing when he or she is reading the report, she said.