The PACSman Pontificates: Is radiology ready for AI and vice versa?

It has been nearly a decade since AI showed up in radiology. Since then, more than 700 medical imaging AI algorithms have received U.S. Food and Drug Administration (FDA) 510(k) clearance to be marketed. Most AI companies are anxiously awaiting the deluge of sales they hoped would accompany the interest in AI, but that hasn’t happened yet.

Will the sales deluge show up? Yes, for some companies, but certainly not for the vast majority. There are several barriers that still need to be addressed from a radiologist, facility, and patient standpoint.

The PACSman, Mike Cannavo.

In recent years, more has been written about AI than about all other technologies combined. Articles seem to be either totally for AI or at best marginally against it; very rarely is a middle ground presented.

Lack of education

If I had to pick one thing that is holding back more rapid AI adoption, it would not be technical issues, costs, or any other single factor (although each of these certainly has an impact on the adoption rate). Instead, the lack of balanced education on the technology seems to be AI’s biggest impediment, with the word “balanced” being key.

How someone can extrapolate findings from a study conducted overseas with fewer than 100 participants in a publicly funded healthcare system and present them as a universal truth to more than 49,000 radiologists in the U.S., where reimbursement is both complex and convoluted, seems to defy logic.

Over five years ago, I predicted that few AI companies would be able to make it on their own. Although a few home runs have been hit relative to the investments made in AI companies, the majority of vendors remain in the RSNA bus line waiting their turn to disembark. Even with additional investment and money from various programs, many of the larger AI companies are still not profitable, and some are hemorrhaging money at a pace that seems to defy logic. Others are making “sales” that allow an end user to evaluate a product for X months at no charge. In most universes, this does not constitute an actual sale.

Make or save money

AI needs to do one of two things: make money or save money. According to an October 2023 article in Becker’s Hospital Review, “... only six out of the 300+ regulatory-approved AI applications in radiology are reimbursed worldwide” and the “… lack of public funding is a significant obstacle for advancing AI in radiology and could delay the realization of AI’s full potential in improving patient care.”

The article goes on, “For some radiology AI applications, the benefits of the application may sufficiently serve as the incentive. For others, payers may have to consider reimbursing the AI application separately from the cost of the underlying imaging studies. In such circumstances, it is important for payers to develop a clear set of criteria to decide which AI applications should be paid for separately.”

Can you charge separately for AI, based on reimbursement from either insurance or private payers, until the U.S. Centers for Medicare and Medicaid Services (CMS) makes AI a Class I reimbursement? Selective use of AI is a slippery slope that few have even considered. If a patient has either no insurance or Medicare/Medicaid, the only option is to bill them separately for using AI. Many will push back on this.

Is bundling the AI costs with the study cost an option? If the cost is low (less than $10), having the facility or radiologist absorb it, while not optimal, makes it pretty much a non-issue. Sadly, most AI algorithms cost considerably more than this.

In an effort to jump-start AI several years ago, CMS initiated a New Technology Add-On Payment (NTAP) for stroke AI software, which reimbursed up to -- and in some cases over -- $1,000 for using AI stroke protocols based on specific CPT, ICD-10-PCS, and DRG codes.

High costs

This high cost would probably never fly with insurers given the cost-to-benefit ratio of using AI. Even adding as much as $100 to a study cost would no doubt meet pushback. Now I realize that algorithm development costs are high -- many millions of dollars -- and the cost of running a company is even higher.

But no one expects a company to show a return on investment in a few years. Playing in AI is a long-term game. As harsh as it sounds, if companies don’t have the money to get into AI and stay in it, they simply shouldn’t get in the game.

Some radiology groups have ordered AI as part of the study and let the AI company bill for it separately. This has not been well received by patients, who end up getting a separate bill from the AI company for something they felt should have been covered in the radiologist's bill.

The only viable solution seems to be bundling AI with the study cost. This will require AI costs to be considerably lower than they are today, with the difference made up by across-the-board use that enables additional procedures and faster read times. Of course, this dictates that radiologists trust AI and don’t spend additional time scrutinizing studies that AI has interpreted differently from their own initial read.
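To make the bundling math concrete, here is a back-of-the-envelope sketch. Every number in it -- the per-study AI cost, the professional fee, the baseline volume, and the time savings -- is a made-up placeholder, not a market figure; the point is only to show how a bundled AI cost trades off against the extra reads that faster turnaround might enable.

```python
# Illustrative break-even arithmetic only; all figures below are hypothetical
# placeholders, not actual market numbers.

ai_cost_per_study = 5.00        # hypothetical bundled AI cost (USD per study)
professional_fee = 40.00        # hypothetical average professional fee per read (USD)
baseline_reads_per_day = 100    # hypothetical baseline daily read volume
read_time_savings = 0.05        # hypothetical 5% faster reads with trusted AI

# Extra reads the time savings makes possible, and the revenue they bring in
extra_reads = baseline_reads_per_day * read_time_savings
extra_revenue = extra_reads * professional_fee

# Total bundled AI cost across all studies, including the extra ones
total_ai_cost = (baseline_reads_per_day + extra_reads) * ai_cost_per_study

print(f"Extra revenue from faster reads: ${extra_revenue:.2f}/day")
print(f"Bundled AI cost:                 ${total_ai_cost:.2f}/day")
print("Bundling breaks even" if extra_revenue >= total_ai_cost else "Bundling loses money")
```

With these particular placeholder numbers, the bundle loses money -- which is exactly the problem: either the per-study AI cost comes down or the throughput gains have to be much larger.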

According to Signify Research, venture capital (VC) investment for companies developing medical imaging AI applications has totaled almost $5 billion since 2015. Since 2021, there has been a notable shift in VC funding, from many, smaller, early-stage funding deals to fewer, larger, later-stage deals.

The top 25 companies account for more than 73% of all VC funding raised since 2015, yet total funding has fallen even as the average deal size has grown to more than $20 million. This reaffirms that the big get bigger while the small continue to struggle. It is another reason why AI has yet to take off: no one wants to be stuck with an unsupported product or one that few use clinically.

A growing market?

Is AI growing? Absolutely, but nowhere near the pace the prognosticators had hoped for in years past. Confidence and trust in the technology by both radiologists and patients present yet another obstacle to AI adoption. The question all seem to have when looking at AI objectively is, “Who do you trust?”

Radiologists initially feared that AI would take their jobs. Now that most -- but not all -- of that fear has gone, the concerns turn to the impact AI has on the bottom line (especially if it comes out of the radiologist’s pocket). Any negative impact AI has on reading speed (having to look at a study more closely, for example) will pretty much be the kiss of death for AI.

In radiology, time is money. Conversely, if a radiologist trusts AI, it may allow them to read more studies and offset any cost considerations. One recent study found that patients are more forgiving of a radiologist missing a finding, even with AI assistance, than of a study interpreted by an AI algorithm alone. This is crucial to understand, as the first place AI will likely be used in a standalone scenario is screening mammography, even though a radiologist will still be required to sign off on the results.

Now, whether radiologists will accept a reduced fee for signing off on an AI-interpreted screening mammogram remains to be seen, as their responsibility remains the same as if they had read it without AI. Sadly, they may not be given a choice if the imaging center they read for adopts an “AI interpretation first” policy.

To the best of my knowledge, this has not been the case as of yet, but with the emphasis on improved margins, it’s just a question of time. A few mammography centers have also tested the waters by promoting the use of AI as a “second opinion” to get women to come to their center for routine mammographic studies.

Unfortunately, it is too soon to say whether the addition of AI to the study interpretation is the reason the patients came. That would actually make for a great study. In the big picture, the added cost of breast AI is marginal relative to the total cost of the procedure and interpretation, at least in higher-volume settings. Doing this will also help get Class III (optional payment) approval with certain insurers until Class I (mandatory payment) is initiated a few years down the road.

Patient education

More patient education needs to be done before patients will accept AI universally. A study done in mid-2023 relating to mammography and published in BMJ Open stated, “In general, women viewed AI as an excellent complementary tool to help radiologists in their decision-making, rather than a complete replacement of their expertise.

“To trust the AI, the women requested a thorough evaluation, transparency about AI usage in healthcare, and the involvement of a radiologist in the assessment. They would rather be more worried because of being called in more often for scans than risk having overlooked a sign of cancer. They expressed substantial trust in the healthcare system if the implementation of AI was to become a standard practice.

“The findings suggest that the interviewed women, in general, hold a positive attitude toward the implementation of AI in mammography; nonetheless, they also expect and demand more from an AI-complemented study than that from a radiologist alone. Effective communication regarding the role and limitations of AI is crucial to ensure that patients understand the purpose and potential outcomes of AI-assisted healthcare.”

Interestingly, though, the study did not address the cost of AI. One would have to assume that AI was provided at no cost to the patients, since the UK has a government-sponsored universal healthcare system. Studies published in the U.S. by the RSNA and others have found similar results, although once again cost was never discussed.

PACS integration

Having AI integrated into a PACS makes a huge difference from a performance standpoint. Most AI studies can be processed fairly quickly, but any delay impacts the radiologist’s ability to provide a rapid turnaround time. This dictates that the AI data be processed simultaneously as the new modality data is sent to the cloud to be matched with prior studies.

This way, when the study hits the worklist, the study data and the AI interpretation are both available without delay. You can set up the hanging protocol so that the AI interpretation is displayed along with the full study data, or so that it becomes available when the radiologist hits an AI key on their viewing software. The one thing you don’t want is a radiologist having to wait for the AI algorithm to finish processing and then download the results to the workstation.
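As a rough illustration of that ordering, here is a minimal sketch of the idea in Python. It is not tied to any particular PACS or AI vendor; every function and class name in it (fetch_priors, run_ai_inference, post_to_worklist, Study) is a hypothetical placeholder for whatever a site’s integration actually exposes. The point is simply that prior matching and AI inference start at the same time, and the study only lands on the worklist once both are done.

```python
# Minimal sketch of parallel prior matching and AI inference; all names are
# hypothetical placeholders, not a real PACS or AI vendor API.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass
class Study:
    accession_number: str
    pixel_data: bytes


def fetch_priors(study: Study) -> list[str]:
    """Placeholder: query the archive for prior studies to hang alongside the new one."""
    return []


def run_ai_inference(study: Study) -> dict:
    """Placeholder: send the study to the AI service and return its findings."""
    return {"accession_number": study.accession_number, "findings": []}


def post_to_worklist(study: Study, priors: list[str], ai_result: dict) -> None:
    """Placeholder: add the study to the reading worklist with priors and AI results attached."""
    print(f"{study.accession_number}: {len(priors)} priors, AI result attached")


def on_study_received(study: Study) -> None:
    # Kick off prior matching and AI inference at the same time, so the AI
    # result is ready by the time the study reaches the radiologist's worklist.
    with ThreadPoolExecutor(max_workers=2) as pool:
        priors_future = pool.submit(fetch_priors, study)
        ai_future = pool.submit(run_ai_inference, study)
        post_to_worklist(study, priors_future.result(), ai_future.result())


if __name__ == "__main__":
    on_study_received(Study(accession_number="ACC123", pixel_data=b""))
```

Whether that orchestration lives in the PACS, the AI platform, or a separate workflow engine is a vendor decision; what matters is that the radiologist never has to sit and wait for it.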

Is it ready?

So is AI ready for radiology and radiology ready for AI? Both think they are. The reality is it’s like preparing for your first child by reading books that you feel tell you all about it. The first time you change that diaper from hell or deal with multiple sleepless days and nights you may think “What have I gotten into?” But hearing that same baby's laugh and seeing them smile tends to make it all worthwhile. We just need to get over the baby stage with AI and everything will be all right.

Michael J. Cannavo is known industry-wide as the PACSman. After several decades as an independent PACS consultant, he worked as both a strategic accounts manager and solutions architect with two major PACS vendors. He has now made it back safely from the dark side and is sharing his observations.

His healthcare consulting services for end users include PACS optimization services, system upgrade and proposal reviews, contract reviews, and other areas. The PACSman is also working with imaging and IT vendors to develop market-focused messaging as well as sales training programs. He can be reached at [email protected] or by phone at 407-359-0191.

The comments and observations expressed are those of the author and do not necessarily reflect the opinions of AuntMinnie.com.
