MONTREAL - Radiology reports generated by voice recognition software do not include many serious errors that could affect patient care, but they do have an aggravating number of minor inaccuracies, according to a study presented at a joint meeting of Canadian medical societies.
The study examined errors in a collection of radiology reports generated by staff radiologists and residents using voice recognition software, explained Dr. Jonathan Hickle, a fourth-year radiology resident at Dalhousie University in Halifax, Nova Scotia.
The research was presented at last week's Joint Congress on Medical Imaging and Radiation Sciences, which brought together the Canadian Association of Radiologists (CAR) and the Canadian Association of Medical Radiation Technologists (CAMRT), as well as French-language associations based in Quebec.
Higher productivity, but at a price?
Radiologists in Halifax currently use PowerScribe 5.11 from Nuance Communications. The software has shortened report turnaround time, but referring clinicians say that reports contain a large number of errors, according to Hickle.
Prior to adopting voice recognition software, radiologists used conventional dictation transcription services to produce reports. The back-and-forth between the radiologist and the transcription service lengthened the time it took for reports to reach referring clinicians, Hickle explained.
"It used to take a week or two to produce a report," he said.
The emergence of voice recognition software held the promise of boosting radiologist productivity by reducing the time between dictation and finalization of reports. But research on voice recognition software has shown several drawbacks, Hickle said, citing several studies.
A 2011 analysis by Basma et al found that reports generated with automated speech recognition technology were eight times more likely to contain major errors than those prepared with conventional dictation transcription. In addition, a 2008 investigation by Quint et al found that more than 20% of radiology reports created using speech recognition technology contained significant errors, even though most radiologists believed that error rates did not surpass 10%.
"Major errors would be those that would affect the interpretation of the report or could potentially harm a patient, and everything else would be a minor error," Hickle said. "Minor errors would be a word missing that would not affect any imaging interpretation. A major error would be a difference like ascending aorta instead of descending aorta."
The researchers aimed for a major error rate of 0% and a minor error rate of less than 10%, he said. In an initial audit of 880 reports, Hickle and co-investigators found a minor error rate of 20% and a major error rate of 3.5%.
Before performing a second audit, the researchers asked users to optimize their microphones and reduce background noise by closing doors to adjacent rooms, among other measures. They also asked the radiologists whose performance fell in the bottom half to retrain their voice profiles.
In the second audit, the number of reports with major errors fell from 31 to 19, yielding a major error rate of 2.2%. The number of reports with minor errors rose from 176 to 220, however, yielding a minor error rate of 25.7%.
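For readers who want to check the arithmetic, here is a minimal Python sketch of the audit figures. The presentation did not state how many reports the second audit covered, so that total is back-calculated below from the reported counts and rates; treat it as an inference for illustration, not a figure from the study.

```python
def error_rate(error_count: int, total_reports: int) -> float:
    """Return the error rate as a percentage of audited reports."""
    return 100.0 * error_count / total_reports

# First audit: 880 reports, as reported in the study.
print(f"Audit 1 major: {error_rate(31, 880):.1f}%")   # ~3.5%
print(f"Audit 1 minor: {error_rate(176, 880):.1f}%")  # 20.0%

# Second audit: the study gives counts (19 major, 220 minor) and rates
# (2.2%, 25.7%) but not the total; back-solving count / rate suggests
# roughly 856 reports -- an inference, not a figure from the study.
implied_total = round(220 / 0.257)  # ~856
print(f"Implied audit 2 size: {implied_total}")
print(f"Audit 2 major: {error_rate(19, implied_total):.1f}%")   # ~2.2%
print(f"Audit 2 minor: {error_rate(220, implied_total):.1f}%")  # ~25.7%
```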
"Retraining was something we thought would improve performance," Hickle said. "Our major errors decreased, but our overall percentage of errors -- and specifically the minor errors -- increased."
Moving forward, the Capital Health Authority will upgrade to a new version of the software, PowerScribe 360, and Hickle plans to conduct a third audit to evaluate the impact of the new software version on errors in reports.
The high rate of errors suggests that a proportion of radiologists are producing shoddy reports, said Dr. Sukhvinder Dhillon from the University of Alberta, who was one of three judges of the clinical audit project session.
"It's up to the radiologist as to how carefully they read the report," Dhillon said. "The buck stops with the radiologist. If you read your report well, there should be no errors, major or minor. Some people may only be scan reading it and miss changes that should be made to the report."
While minor errors in a report may not affect patient management, they reflect poorly on the radiologist producing the reports, Dhillon concluded.
"It may be clinically correct and may not affect patient management in any way, but having errors scattered in your report gives a general impression of sloppiness," he said. "Do you want to accept that level of inaccuracy or sloppiness in the reports that you produce? At the end of the day, it's a legal document that could be produced in a court of law [for others to read]."