Generative AI is poised to benefit radiology, but oversight needed


Generative artificial intelligence (AI) technologies such as ChatGPT offer significant potential to improve radiology workflow.

Potential applications include collaborative research, patient education, and automatic production of draft reports. But much still needs to be done before large language models (LLMs) are ready for clinical use.

Caution and human oversight remain essential, however, as these technologies are a double-edged sword.

AuntMinnie.com has reported that the technology could have an important impact on clinical radiology by giving patients important information before radiologically guided procedures -- answering questions, assessing patient readiness, and providing support. The technology might also enable clinical knowledge systems that provide decision support and facilitate better patient care.

Additionally, while LLMs such as ChatGPT can help automate radiology reports thanks to their ability to learn the nuances of language and generate human-like responses, they can also be error-prone.

Early research applications

Dr. Prof. Rajesh Botchu, a musculoskeletal radiologist at the Royal Orthopaedic Hospital in Birmingham, U.K., says that an early application for ChatGPT in radiology is collaborative research: undertaking literature searches and exploring research databases.

"The technology will find all relevant material about a particular topic and synthesize the information very quickly -- in minutes," he explains. "It will tell you the pros and cons of each paper and show any knowledge gaps."

Dr. Prof. Rajesh Botchu.

"After you've collected the data, ChatGPT can quickly put together a draft of the research, which the researcher can edit. This will speed the process of, for example, doing grant proposals and other reports, and drafting articles intended for publication. It will give researchers more time to concentrate on doing the research itself," Botchu added.

But there are concerns: Plagiarism is an issue because technologies like ChatGPT create variations of existing text and articles. AuntMinnie.com previously reported that while ChatGPT can write articles that are convincing to the untrained eye, those articles can sometimes be factually inaccurate and contain fictitious references.

Botchu also expects ChatGPT and other generative AI technologies to impact radiologist training, creating remote learning tailored to the individual radiologist.

"For example, if a trainee is not good at chest radiology, ChatGPT can detect this, interact with and develop challenges for the trainee, and give them more training towards the chest," he said.

Impacting workflows

Botchu predicts that within a few years, AI technologies, including ChatGPT, will optimize radiology workflows. The technologies will be incorporated into image reading and reporting systems to identify abnormalities and produce draft reports, which the radiologist can edit quickly. As a result, radiologists will be able to report more cases per session in the same or even less working time, and turnaround times will shorten.

"ChatGPT will be incorporated into AI programs and automatically detect abnormalities and associated pathologies, and automatically develop the reports," he noted.

Additional applications that Botchu anticipates for ChatGPT include the following:

  • Gathering information from electronic health records (EHRs) and various studies to build patient databases
  • Developing patient satisfaction surveys automatically, with minimal human input

A job threat?

But the technologies could threaten diagnostic radiology jobs, as fewer radiologists may be needed. It typically takes five to 10 minutes for a radiologist to write a report about an imaging study. But with AI, a radiologist could read an exam and write a report in about a minute, Botchu said.

Dr. Andrew Smith.

"The stress and workload for radiologists will decrease, and their productivity will increase, which is good. But then, the number of radiologists required to do their jobs will decrease. So, the need for radiologists who just do diagnostic radiology, who only look at scans and report on their value, will come down. That could lead to job losses," he speculated, as organizations might find it cheaper to purchase software instead of paying for radiologists.

Dr. Andrew Smith of the department of radiology at the University of Alabama at Birmingham concurs that AI-assisted radiology reporting has the potential to help radiologists provide more accurate, standardized, precise, effective, and timely diagnoses to improve patient outcomes.

"We spend a lot of our time looking at images and generating reports. There are several tasks large language models can do well in radiology reporting," he said. These include the following:

  • Speeding the formation of new radiology reports
  • Summarizing prior reports and notes to generate a concise history for a current radiologic exam
  • Identifying laterality and other mismatch errors in a report
  • Generating structured reports from freeform dictation
  • Generating conclusions and recommendations
  • Writing patient summaries in layman's terms

"The most important reporting task for programs like ChatGPT would be to improve adherence to best practice standards and recommendations," he added.

Not ready yet

But the reality is that ChatGPT and other LLMs are not yet ready for "prime time" in radiology, Smith noted. Before being used extensively, they need to undergo additional iteration, training, adaptation, integration, and user testing to make them reliable for radiology reporting, manuscript writing, and research, he said.

"There are privacy and security issues that must be overcome. You don't want to put patient-protected health information into just routine (generic) ChatGPT that that would not be HIPAA-compliant, for instance. There are ways to get around this problem. There are compliant versions of ChatGPT that you can access online. You can also move the large language models to a local server to make them more secure," Smith explained.

Subtraining needed

He believes that to make them viable for radiology use, ChatGPT and similar technologies will have to undergo subtraining on specific tasks, such as summarizing reports, providing recommendations for the radiologist, and understanding EHRs and medical literature. This would enable them to avoid providing fictitious, misleading, or incorrect information, or misguided recommendations. The subtraining would be key to the programs' incorporation into image analysis and the radiologist workflow, Smith indicated.

"Generative AI will be integrated at all levels, including into EHRs, RIS (radiology information systems), worklists, and reports. It is fairly inexpensive to do this. What is unclear is whether a large company will do this best or a more nimble innovative startup," Smith said. "Any of the LLMs could impact radiology. ChatGPT is currently in the lead but others will catch up. Sometimes first to market is best. Other times, second to market is better to overcome errors, missteps, legal issues, or other factors," Smith said.

Undertaking more intuitive tasks

Dr. Laura Heacock of the department of radiology at New York University School of Medicine also believes that ChatGPT and other LLMs will speed radiologist workflow, eliminate some mundane tasks, and give radiologists more time to undertake more intuitive tasks.

Dr. Laura Heacock.

She also foresees a time when ChatGPT and other LLMs will be integrated into EHRs, patient charts, physician notes, and complex sources of patient information to rapidly gather and condense specific data that would be helpful for an imaging study, as well as prepare less-technical information for patient use.

Early applications for LLMs will be for research, manuscript, and grant writing, with their eventual incorporation into RIS, PACS, and enterprise imaging systems, according to Heacock.

She expects that the models will help automate many radiologist tasks and optimize clinical interactions with patients.

"It will make us more efficient by giving us more time to analyze images and put the information into the context of the patient history. It will check our reports for discrepancies, and serve as a second proofreader," she added.

Not medicine-specific

"But there currently are some drawbacks to using these models in medicine because they have been trained in generalities. They are not medicine-specific, and that is really apparent when you present them with medicine-related issues," she continued. "We will need to better train these general models when applying them to medicine. The real danger right now with using these models in the medical setting is that they have a tendency to hallucinate."

"In addition, when introducing AI to a system, it must be relevant to the population and application that you're placing it in. You have to be sure that that the model is working the way you want it to, with all of the necessary checks and balances," Heacock noted.

Despite the challenges, Smith expects AI-assisted reporting to eventually become the standard of care.

"The field of medical imaging and reporting is on the verge of a transformative era, where AI will be employed at various stages for each patient, leading to a revolution in the field and an improvement in patient care," he emphasized.
