Radiologists know that speech recognition dictation systems can help produce reports more efficiently, but do they also produce overlooked errors? Australian radiologists conducted an audit to examine the trade-off between efficiency and accuracy more closely, and were surprised by what they found.
One way to make radiologists aware that the reports they deliver to referring physicians may be subpar is to conduct audits. But few radiology departments publicly state whether they conduct such audits, much less publicize the results. A group in Clayton, Australia, did just that, however, reporting its findings in an article published in the August issue of the Journal of Digital Imaging (2011, Vol. 24:4, pp. 724-728).
Radiologists at the Southern Health Department of Diagnostic Imaging started using a speech recognition system (PowerScribe 3.5, Nuance Communications) with its RIS in 2004. Southern Health is Victoria's largest health service, and its radiologists support imaging departments at two hospitals and two medical centers located in the southeastern suburbs of Melbourne.
As expected, the benefits of the technology included near-instantaneous availability of reports in the PACS and savings on transcriptionist wages. However, radiologists became concerned after noticing errors in previously dictated reports while reviewing patients' files prior to dictating new exams. Even though clinicians rarely gave much feedback on report quality, the department decided to conduct an audit to determine the extent of the problem.
The audit covered nearly 1,000 reports generated over a six-month period from June 2008 through November 2008. Fifty reports from each of the 19 radiologists were randomly selected from the 52,573 reports generated during that period, 75% of which were x-ray exam reports. The 990 reports selected for the audit comprised 38% x-ray exams and 62% CT, MRI, angiography, nuclear medicine, and ultrasound exams.
Radiologists had the ability to use macros with the system. However, fewer than 2% of reports in the random sample contained macros, which had been used only for selected, relatively simple studies.
Reports were reviewed to identify nonsense phrases, punctuation errors, wrong word substitutions, insertion of extra words, and deletion of dictated words. The difference in error rates between reports of x-ray exams and other types of exams was striking. More than one-third (36%) of nonradiography exam reports contained errors, and 5% included obvious nonsensical phrases. Reports of x-ray exams had an 11% error rate, with 2% containing nonsensical phrases.
The most common errors were wrong word substitutions (38%), followed by insertion of undesired words (28%) and deletion of dictated words (14%). Error rates also varied widely among individual radiologists.
The audit was a wake-up call for the radiologists, according to lead author and radiology registrar Dr. Chian Chang. "The radiologists and trainees in our institution were aware of errors but were quite surprised at the relatively high error rates," he wrote in an email to AuntMinnie.com.
The group found that failure to proofread reports effectively, as well as disruptions to radiologists, were major contributors to the error rates. Dictation-related issues, such as the dictating radiologist's pronunciation, clarity, and speed, were also identified as factors, as was fatigue caused by a heavy workload.
As a result of the audit, the department added a "gatekeeper" to serve as the first point of contact for case discussion with medical and surgical teams. Both radiologists and registrars serve in this capacity.
"This is a busy role with constant disruptions, but the gatekeeper protects the rest of the radiologists," according to Chang. "The system has been implemented for over a year. Having a gatekeeper has certainly helped reduce disruptions for other radiologists."
The department began another report error audit two months ago, Chang said. It had also hoped to hire more staff to reduce pressure on radiologists; although this has not yet happened, adding staff is still planned.