Can radiologists learn to make their radiology reports simpler and easier to understand for patients? Yes, they can -- if they receive just a little training on how to write more effectively, according to research published online May 8 in the Journal of Digital Imaging.
After giving a one-hour writing workshop to 14 radiologists, researchers found that the participants created ultrasound reports that were objectively more readable for patients than those produced before the training session. The workshop yielded a statistically significant improvement in the average length and reading grade level of the report's impression section.
"Although the degree of improvement varied for different measures, it is evident that targeted training could provide potential benefits to improve readability due to our statistically significant results," wrote the group led by Wei Chen, PhD, of Nationwide Children's Hospital in Columbus, OH.
Increased access for patients
While patients increasingly have access to their radiology reports, those with little or no medical literacy may find it challenging to understand the content. Making medical reports comprehensible to patients is critical to achieving the aims of open access to medical information, according to the researchers.
Previous studies have found that many medical documents and educational materials are written at college level or above. However, the U.S. Census indicates that the average adult reads at the seventh- to ninth-grade level.
"Highly complex medical documents, including ultrasound reports, are greatly mismatched with patient literacy level," the authors wrote.
They initially hypothesized that writing styles could be objectively evaluated using a measure that takes into account the length of the report and the reading grade level required to understand it. They also wanted to determine if a writing workshop could yield quantifiable improvements in report readability (JDI, May 8, 2017).
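The article does not spell out which readability formula the researchers applied; the Flesch-Kincaid grade level is one widely used option that, like the measures described here, depends on sentence length and word complexity. A minimal Python sketch, assuming that formula and a crude syllable heuristic, shows how an impression section's word count and grade level can be scored:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: runs of consecutive vowels, with a minimum of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability_summary(text: str) -> dict:
    """Word count and Flesch-Kincaid grade level for an impression section."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    grade = (0.39 * len(words) / max(1, len(sentences))
             + 11.8 * syllables / max(1, len(words))
             - 15.59)
    return {"words": len(words), "grade_level": round(grade, 1)}

# A deliberately dense, report-style sentence (invented for illustration):
print(readability_summary(
    "Heterogeneous echotexture of the hepatic parenchyma without sonographic "
    "evidence of a focal mass lesion or intrahepatic biliary ductal dilatation."
))
```

Longer sentences and polysyllabic terms both push the score toward a higher grade level, which is why brevity and simpler word choice were the focus of the workshop described below.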
In the first phase of the study, 14 radiologists were asked to write the impression section for a sample of two randomly selected ultrasound cases: one relatively simple and one relatively complex. To ensure that the results were representative of their writing style, the radiologists were not told they would be given writing training or that they would be repeating the experiment after the workshop.
Writing more effectively
After measuring the readability of the reports, the team provided the radiologists with a one-hour writing workshop that emphasized concepts such as simple structure and brevity, as well as how to write more effectively without losing critical information.
"The workshop encouraged the use of simpler words and phrases and less complex sentence structures to enhance readability," the authors wrote.
The radiologists also received their own readability scores on the two ultrasound cases they had reported, as well as a summary of how one radiologist's writing style compared to another's, according to the group. Two weeks after the workshop, the radiologists were again asked to write the impression section on the same two cases, applying the principles they learned during the training. The researchers then quantified, visualized, and statistically tested the readability of the ultrasound reports again to assess the short-term effectiveness of the training.
Effect of writing workshop on report readability
Measure | Before training | After training | Percentage improvement
Average impression section length | 23 words | 15 words | 35%
Average reading grade level | 16 | 14 | 13%
At least a graduate level of education was needed to read and understand the initial ultrasound reports. After the radiologists completed the workshop, a college level of education was needed.
"This statistically significant but small change observed in reducing the grade level suggests either significant resistance and/or further challenge to continue lowering the grade level of radiology reading materials while maintaining the critical information in the report," the authors wrote. "To lower the reading grade level, we may need additional hours of professional training; or we may need to overcome scholarly reticence to 'dumbing down' the language for fear of compromising accuracy in these technical, medical reports; or both."
Section length proved somewhat easier to improve, according to Chen and colleagues.
"Shortening section length is a viable goal for improving readability of [the] ultrasound report in the patient population," the researchers wrote.
The improvements in average section length (p = 0.024) and average reading grade level (p = 0.022) were both statistically significant.
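The article does not state which statistical test produced these p-values; for before-and-after scores from the same 14 radiologists, a paired test such as a paired t-test (or a Wilcoxon signed-rank test if normality is doubtful) would be a standard choice. A sketch with made-up grade-level scores, purely to illustrate the procedure:

```python
from scipy import stats

# Hypothetical before/after reading grade levels for 14 radiologists;
# the study's raw data are not given in the article.
before = [17.1, 16.4, 15.9, 16.8, 15.2, 17.5, 16.0,
          15.7, 16.9, 15.5, 16.2, 17.0, 15.8, 16.3]
after = [14.8, 14.1, 13.9, 14.6, 13.5, 15.2, 14.0,
         13.8, 14.7, 13.6, 14.2, 14.9, 13.7, 14.1]

# Paired test, because each radiologist contributes both a before and an after score.
t_stat, p_value = stats.ttest_rel(before, after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```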
Radiologist attitudes
In survey responses after training, all 14 radiologists indicated they believe writing style can be changed. In addition, 71% believe writing style could be optimized for communication with both patients and physicians. Furthermore, 64% think that a change of writing style need not compromise accuracy.
The researchers concluded that despite the high level of reading complexity of ultrasound reports, the empirical evidence suggests a challenging but viable pathway for radiologists to improve their writing style.
"Additional time in training may be warranted to affect even greater improvement," they wrote. "However, our [one-hour] training session was effective, efficient, and statistically significant in making improvements possible without affecting the primary message communicated in the report according to the head of the radiology department."
The authors said their simplified report readability metric could be used in future clinical decision-support software, where it could be provided to radiologists as real-time feedback or in a dashboard for periodic review.
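The article does not describe how such feedback would be wired into reporting software. One simple form, reusing the readability_summary sketch above and with illustrative thresholds that are not taken from the study, would be a check that warns the radiologist when a draft impression exceeds a target length or grade level:

```python
TARGET_WORDS = 15         # illustrative threshold, not taken from the study
TARGET_GRADE_LEVEL = 9.0  # roughly the average adult reading level cited above

def readability_warnings(impression: str) -> list[str]:
    """Dashboard-style warnings for a draft impression section."""
    summary = readability_summary(impression)  # function from the earlier sketch
    warnings = []
    if summary["words"] > TARGET_WORDS:
        warnings.append(
            f"Impression is {summary['words']} words; target is {TARGET_WORDS}.")
    if summary["grade_level"] > TARGET_GRADE_LEVEL:
        warnings.append(
            f"Grade level {summary['grade_level']} exceeds target {TARGET_GRADE_LEVEL}.")
    return warnings
```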
"Future research is needed to determine how the linguistic variables identified in this study influence patients' understanding of their conditions, utilization of the healthcare system, and ultimately, clinical outcomes," they wrote. "We encourage researchers to apply the methods used in this study to a broader range of clinical writings and to experiment with additional measures that account for reader comprehension."