Study: It's tough complying with Calif. dose reporting law


Compliance with California's radiation dose reporting law is difficult because dose reporting processes that rely on manual data entry are unreliable -- at least in the experience of one large institution -- according to a new study in the American Journal of Roentgenology.

Aiming to reduce excessive radiation dose, the California Legislature passed a law, which took effect in 2012, requiring the reporting of CT dose information. However, a yearlong study of pediatric CT exams found that almost 10% of reports failed to meet legal reporting criteria, and more than 12% failed to meet institutional reporting criteria. Combination scans covering multiple parts of patient anatomy were particularly troublesome, according to the authors from Stanford University's Lucile Packard Children's Hospital (AJR, April 2015, Vol. 204:4, pp. 810-816).

"We find that our institutional processes, which depend on error-free human performance rather than automation with electronic systems, resulted in accurate compliance with legal and institutional CT dose reporting standards 87.8% of the time," wrote radiologist Dr. Evan Zucker and colleagues from Stanford University. "In view of this study, we are currently exploring [automated] systems to improve compliance rates with dose reporting at our institution."

The study highlights a problem that arises whenever the human factor is involved in radiation dose reporting, as well as the need for automated systems to assist with dose-reporting requirements, said Zucker, a clinical instructor of radiology, in an interview with AuntMinnie.com.

"I think it sends an important message to practices that are either currently required to do dose reporting or will do so in the future to think about how they can be prepared for this, and what systems they can implement to do this as accurately and systematically as possible," he said. "If we're going to do this, we need to be consistent."

California's big problem

State legislators moved in 2010 to introduce new requirements for dose reporting following a series of serious and highly publicized radiation accidents that occurred during medical imaging procedures in California. California Senate Bill 1237, which took effect in July 2012, requires both the volume CT dose index (CTDIvol) and the dose-length product (DLP) to be indicated in every radiology report when the information is available from the CT scanner.

Alternatively, departments may report dose in another unit approved by the American Association of Physicists in Medicine (AAPM), and attaching the protocol page to the radiology report is also acceptable. Simply sending dose information to the PACS, however, is insufficient, the authors wrote.

The law also has separate reporting requirements for suspected radiation overdoses. CT studies performed for radiation therapy planning, as well as PET/CT or SPECT/CT exams in which CT is used only for attenuation correction, are exempt.

Testing the process

Zucker and colleagues sought to evaluate their institution's experience with dose reporting after SB 1237 went into effect.

"We sought to evaluate compliance with both legal and institutional standards, with attention to the accuracy of dose information reported, and to evaluate areas for improvement," the authors wrote.

They retrospectively reviewed all reports from chest CT exams performed from July 2012 through June 2013, looking for errors in documentation of CTDIvol, DLP, and phantom size, the authors wrote.

The study included all scans of the chest along with other contiguous regions, such as the abdomen, or noncontiguous body regions, such as the head. Studies that did not include the chest were excluded. A report was deemed legally compliant if both CTDIvol and DLP were documented accurately, and compliant with Stanford's institutional standards if phantom size was also documented accurately. In addition, the study team tracked all reports that did not document dose in the standard format of phantom size, CTDIvol for each series, and total DLP.
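To make the distinction concrete, here is a minimal, hypothetical Python sketch of how those two tiers nest: a report meets the legal bar if it states CTDIvol and DLP, and the institutional bar only if it also states phantom size. The regular expressions and the check_report() helper are illustrative assumptions, not the study's audit method, and a text check like this cannot verify that the dictated values actually match the scanner's dose page.

```python
# Hypothetical sketch of the two compliance tiers described above, applied to
# the dose statement dictated into a report. The study itself was a manual
# review; this only checks that the required elements are present.
import re

CTDI_RE = re.compile(r"CTDI\s*vol\D*?([\d.]+)\s*mGy", re.IGNORECASE)
DLP_RE = re.compile(r"DLP\D*?([\d.]+)\s*mGy[\s*-]*cm", re.IGNORECASE)
PHANTOM_RE = re.compile(r"(16|32)\s*-?\s*cm\s*phantom", re.IGNORECASE)

def check_report(dose_statement):
    """Return (legally_compliant, institutionally_compliant) for a dose statement."""
    has_ctdi = bool(CTDI_RE.search(dose_statement))
    has_dlp = bool(DLP_RE.search(dose_statement))
    has_phantom = bool(PHANTOM_RE.search(dose_statement))
    legal = has_ctdi and has_dlp            # SB 1237: CTDIvol and DLP
    institutional = legal and has_phantom   # Stanford standard: phantom size too
    return legal, institutional

# Example statement that meets both tiers.
print(check_report("CTDIvol 3.2 mGy, DLP 98 mGy-cm, 32-cm phantom"))  # (True, True)
```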

Three types of CT scanners were used: two 64-detector-row machines (LightSpeed VCT, GE Healthcare; Sensation 64, Siemens Healthcare) and a 128-detector-row scanner (Somatom Definition Flash, Siemens Healthcare).

The DICOM dose reporting page differed according to scanner type. For the GE 64-detector-row scanner, the reference phantom size (16 or 32 cm) used to compute each CTDIvol measurement was noted. The Siemens 64-detector-row machine did not specify phantom size; the dose for each body part was calculated based on a 32-cm phantom except for the head, which was calculated based on a 16-cm phantom.

Technologists were asked to transmit the DICOM dose information page automatically produced by the scanner to the PACS as a separate image with each scan, making the dose information viewable to the radiologists. The radiologists interpreting the CT scans were then responsible for reviewing the dose information pages and including a dose information statement in their reports.

Results

The final analysis included 664 chest CT scans, of which 112 (16.9%) were combined chest, abdomen, and pelvis exams for which the scanner produced a single dose report. The rest were chest-only scans, some of which were performed concurrently with another CT study; in those cases, the scanner output the dose data for the chest study separately, the authors explained.

In all, radiologists omitted CTDIvol, DLP, or both from their reports in nine of 664 examinations (1.4%) and inaccurately reported one or both values in 56 of the remaining 655 exams, the authors wrote. They also omitted the phantom size in 11 of 664 examinations (1.7%). A total of 599 exams (90.2%) met legal reporting requirements, while 583 (87.8%) met institutional reporting requirements.

Additionally, radiologists sometimes used less decimal precision than was available, summed CTDIvol measures, included only series-level DLP, or specified dose information from the scout topogram or from a nonchest exam in combination studies, the authors wrote. Overall, they found the institutional methods to be "relatively unreliable" for reporting dose.

"Nearly all chest CT reports in the year after California Senate Bill 1237 became effective attempted documentation of dose parameters," Zucker and colleagues wrote. "However, 9.8% of reports failed to accurately meet legal compliance criteria, and 12.2% of reports failed to accurately meet institutional compliance criteria. Furthermore, dose report statements were prone to variability in format. In particular, there was frequent ambiguity regarding documentation of dose parameters for combination CT studies."

Dose reporting requirements can be expected to continue expanding, the group wrote. Texas and Connecticut passed reporting requirements following California's lead, and the Joint Commission is expected to require dose reporting nationally in the future, meaning most radiologists will someday be responsible for dose reporting.

The new study outlines the challenges those radiologists can expect, the authors wrote. Accuracy errors were a central problem, as were variations in numerical reporting formats and a host of problems with combination exams.

"One of the major sources of variation was how to report dose information for combination studies," Zucker told AuntMinnie.com. "People had a lot of variation in terms of which dose information to report when there were multiple series or combination studies, and sometimes the same dose information would be reported in several different exams, which could lead to confusion about the actual dose delivered."

Other errors could have arisen from radiologists dictating values incorrectly or from transcriptionists misunderstanding the dictated values, which were then not rechecked before the report was signed, the authors wrote. Deficiencies in training were likely another contributing factor.

Automated dose reporting, with the ability to populate reports with whatever dose parameters are requested, might be a viable solution. Although cost and equipment compatibility remain obstacles, successful software products have been available for several years.
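As a rough illustration of what such automation might look like (not the system Stanford is evaluating), the sketch below uses the pydicom library to walk a DICOM Radiation Dose Structured Report (RDSR) and collect CTDIvol, DLP, and phantom-type entries that could then be merged into a report template. The concept-name strings and the file path are assumptions; RDSR content and nesting vary by scanner vendor and model.

```python
# Rough sketch, not the authors' system: extract dose values from a DICOM
# Radiation Dose Structured Report (RDSR) with pydicom. Concept names such as
# "Mean CTDIvol", "DLP", and "CTDIw Phantom Type" follow the DICOM CT dose
# templates, but exact content differs by vendor.
import pydicom

NUMERIC_TARGETS = {"Mean CTDIvol", "DLP", "CT Dose Length Product Total"}
CODED_TARGETS = {"CTDIw Phantom Type"}

def walk_content(items, found):
    """Recursively walk the RDSR content tree, collecting dose-related entries."""
    for item in items:
        meaning = ""
        if "ConceptNameCodeSequence" in item:
            meaning = item.ConceptNameCodeSequence[0].CodeMeaning
        if meaning in NUMERIC_TARGETS and "MeasuredValueSequence" in item:
            found.append((meaning, float(item.MeasuredValueSequence[0].NumericValue)))
        elif meaning in CODED_TARGETS and "ConceptCodeSequence" in item:
            found.append((meaning, item.ConceptCodeSequence[0].CodeMeaning))
        if "ContentSequence" in item:  # descend into nested containers
            walk_content(item.ContentSequence, found)
    return found

# "rdsr.dcm" is a placeholder path to a CT dose structured report.
rdsr = pydicom.dcmread("rdsr.dcm")
for name, value in walk_content(rdsr.ContentSequence, []):
    print(f"{name}: {value}")
```

Values pulled this way could, in principle, be inserted into the dictation template automatically, removing the dictation and transcription steps the authors identify as sources of error.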

But the error issue is not as easily solved for reporting variations in combination exams, such as for the head and neck.

"That's something an automated system could do, but you have to discuss it in depth on a state or national level," Zucker said. "You have to reach consensus and then develop guidelines for how you want the automated system to work."
