Data collection key to quality improvement in radiology

CHICAGO -- Robust data collection is key to effective quality improvement in radiology, according to a talk given December 2 at RSNA 2024.

In her keynote, Eva Rubio, MD, from Children's National Health System in Washington, DC, discussed how radiologists can take a page from other specialties to ensure the quality of their services.

“We have room to grow in radiology performance, benchmarking, and data collection,” Rubio said. “From other specialties, we can borrow the concept of accuracy standards.”

Eva Rubio, MD, discusses how radiology departments can take steps toward quality improvement through data collection and seeing what other specialties are doing in this area.

Quality improvement is not where it needs to be in radiology, Rubio said. In the early 2000s, the American College of Radiology (ACR) created a quality assurance program called RADPEER. Under RADPEER, radiologists who review prior images as part of a new interpretation rate the previous interpretations on a four-point scale.

In 2020, the ACR sponsored the Peer Learning Summit, held at Harvard Medical School. The summit proposed an approach called peer learning, in which expert professionals review one another's work, actively give and receive feedback in a constructive manner, teach and learn from one another, and mutually commit to improving performance.

Previous reports suggest that advantages of peer learning include better and broader participation among radiologists, emphasis on learning, better alignment with understanding system failure and impact on performance, and the ability of departments to design their own programs to suit their needs.

Rubio, however, said peer learning still has pros and cons. Pros include its broad appeal, its emphasis on learning, its anonymized nature, and its ability to spotlight important quality improvement activities. Cons include the risk of becoming "just another case conference," with case selection being ad hoc, and difficulty demonstrating improved outcomes to outside entities.

“In some institutions, I worry that it doesn’t really answer potential upper hospital leadership questions about diagnostic performance, which is ultimately why we engage in these activities,” Rubio said.

She added that ongoing efforts to demonstrate diagnostic accuracy could take precedence over peer learning and individual case review. This would require radiologists to reprioritize their time, effort, staffing, and software toward data collection as a foundational activity, rather than toward capturing important cases to present at conferences.

Three areas in medicine already have such a framework for quality assurance: mammography, surgery, and blood bank pathology. Facilities offering mammography must abide by the U.S. Food and Drug Administration's (FDA) Mammography Quality Standards Act. Group practices and radiologists in breast imaging also have a wealth of data on their own performance, and they work within a closed system with repeat patients, Rubio said.

For surgery, departments typically participate in the American College of Surgeons' National Surgical Quality Improvement Program. Here, departments submit details on their procedures, including data on the postoperative period and complications. They also receive feedback from peers on case details.

“They have good benchmark data,” Rubio said. “This is taking things to the next level, which is where I would like to see radiology move in the future.”

For pathology, laboratories are regulated under the Clinical Laboratory Improvement Amendments (CLIA), a federal program that sets quality standards for accurate and reliable clinical laboratory testing in the U.S. Every lab undergoes proficiency testing twice a year, with cease-and-desist letters sent to labs that score too low on two consecutive tests.

Rubio said radiology departments can take steps such as developing imaging data registries, choosing areas of ongoing accuracy focus projects, focusing on practice audits, assessing individual provider variability, seeking input from clinical services, and increasing data transparency.

She added that departments should also consider joining Solutions for Patient Safety, an organization focused on pediatric care and outcomes that "really has no significant radiology presence, if at all."

“What I think we would benefit from is continued strengthening between quality improvement endeavors and peer review, which I think we’re starting to see more of,” Rubio said.

To view full coverage of RSNA 2024, visit our RADCast.
