3D development outpaces facilities' readiness

The need for 3D imaging arose from two developments: the evolution of CT scanners from single-slice step-and-shoot machines to the myriad multidetector models available today, and the advent of PACS, which allowed digital viewing of the images.

MDCT scanners generated enormous volumetric datasets for radiologists to view, and those datasets in turn created the need for 3D postprocessing applications to arrange the images in a volumetric viewing format.

But creating this volumetric format was a cumbersome and time-consuming endeavor that took away from patient scanning time, giving rise to dedicated postprocessing workstations. Volume scanning and cardiac CT have produced a need for 3D workstations that can handle the massive amounts of data these studies create.

Cardiac CT imaging alone creates thousands of images and extremely large datasets; multiphase cardiac CT can produce datasets in excess of 5,000 images (2.5 GB). Because of these large datasets, dedicated workstations and software had to be developed to review them expeditiously. These workstations were expensive and did not add to the efficiency of the department; in fact, they caused bottlenecks because they took technologists off the scanning equipment to process the datasets.
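The per-image size implied by those figures is easy to check: 2.5 GB spread across 5,000 images works out to roughly half a megabyte per slice, which matches an uncompressed 512 x 512, 16-bit CT image. A quick back-of-the-envelope sketch (the 512 x 512 matrix and 16-bit depth are assumptions about typical CT, not figures stated in the article):

```python
# Back-of-the-envelope check of the cardiac CT dataset figures cited above.
images = 5000
dataset_bytes = 2.5 * 1024**3          # 2.5 GB, using binary gigabytes

per_image_bytes = dataset_bytes / images
print(f"Implied size per image: {per_image_bytes / 1024:.0f} KB")
# Implied size per image: 524 KB

# Uncompressed pixel data for one 512 x 512 slice at 2 bytes per pixel:
slice_bytes = 512 * 512 * 2
print(f"512x512 16-bit slice:   {slice_bytes / 1024:.0f} KB")
# 512x512 16-bit slice:   512 KB
```

The two numbers agree to within a few percent (the small surplus is plausibly DICOM header overhead), so the article's 2.5 GB figure is internally consistent.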

Many radiologists and cardiologists attended expensive courses on volumetric and multiplanar manipulation and visualization techniques on a multitude of different vendor workstations. They were eager to begin as soon as they returned to their practices, only to discover that using even the most advanced 3D software was time-consuming and tedious.

Most physicians simply do not have time to spend away from their patients doing 3D reconstructions, and without access to skilled technologists, a 3D program can quickly fade.

Most imaging centers and hospitals recognized this inefficiency and moved the workstations to a location remote from the scanning area. Because of the size of the datasets, however, this created a networking problem: moving studies to and from the workstations slowed workflow.

Even with the most robust network, a couple of studies traveling at the same time can cause slowdowns and, in extreme cases, a network shutdown. In response to this problem, the "thin-client" model was developed. In this model, users connect to a Web-based server and can perform 3D processing from workstations or any Internet-capable computer. The end user needs only an Internet connection and a password to gain access to the thin-client server.

With this model, multiple users can access images at the same time without network lag. Depending on the size of the server, as many as 27,000 images can be manipulated at the same time. The quality of the study is not related to the size and type of computer because the information being manipulated is on the thin-client server.
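The essence of the thin-client arrangement described above is that rendering happens on the server while only small requests and results traverse the network. A minimal sketch of that idea, with a toy HTTP server standing in for the 3D server (all names and the JSON response shape are illustrative; real thin-client 3D products expose far richer, vendor-specific Web APIs):

```python
# Toy illustration of the thin-client pattern: the heavy dataset stays on
# the server, and the client exchanges only a small request and reply.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class RenderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server-side "rendering": the 5,000-image dataset never leaves here.
        result = {"study": self.path.strip("/"), "rendering": "MIP"}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RenderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "thin" client: any Internet-capable computer could issue this request.
url = f"http://127.0.0.1:{server.server_address[1]}/cardiac-cta-001"
reply = json.loads(urlopen(url).read())
print(reply["study"], reply["rendering"])  # prints: cardiac-cta-001 MIP
server.shutdown()
```

Because each client holds no pixel data locally, study quality is independent of the client machine, and concurrency is limited only by server capacity, which is exactly the property the article credits with eliminating network lag.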

You can't pick up a medical imaging journal without coming across some "new" clinical finding associated with 3D imaging. The authors of an article in Health Imaging and IT reference a study from the American Journal of Roentgenology stating, "The researchers found that [CT angiography (CTA)] exams performed in a clinical setting indicate that CTA can provide all the information needed for the triage and clinical care of patients with peripheral artery disease." A study by the American College of Radiology Imaging Network (ACRIN) revealed that most patients undergoing virtual colonoscopy would not need a subsequent colonoscopy.

Many cardiac CTA studies have been reported and the conclusions are in: 96% sensitivity, 98% negative predictive value, 52% positive predictive value, and an average cost savings of 40%. These impressive numbers don't even include the time savings for the institution, as well as the patients.

Any imaging center that believes it can use existing staff and space for the 3D function needs to look more closely at scanner throughput and the impact of the time used to create advanced imaging. Assuming it takes an experienced CT technologist or physician 30 to 45 minutes to fully reconstruct a coronary CT study at a workstation, that same technologist could easily perform three to four scans on the CT scanner in the same amount of time. A physician could read several scans, consult with referring physicians, and perform procedures during that same period. Now assume the cost for each study not done is approximately $300 -- the impact of reconstructing one cardiac CT per day could be up to $500,000 a year in lost revenue.
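The revenue math behind that claim can be sketched directly. The scan counts and the $300 figure come from the paragraph above; the 365-day schedule is an assumption for the upper bound:

```python
# Opportunity cost of tying up a technologist for one cardiac CT
# reconstruction per day, using the article's figures.
scans_displaced_per_recon = 4      # upper end of "three to four scans"
revenue_per_scan = 300             # approximate revenue per study not done, USD
days_per_year = 365                # assumption: scanner runs every day

annual_loss = scans_displaced_per_recon * revenue_per_scan * days_per_year
print(f"Lost revenue per year: ${annual_loss:,}")   # $438,000
```

At the upper end of the stated ranges this lands near the article's "up to $500,000" figure; a five-day week or the lower scan count shrinks it considerably, so the number is best read as a ceiling.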

All clinical indications point to the value of 3D processing and image interpretation. The trick is to do it efficiently. The first step is to remove 3D processing from the imaging area and transfer it to a 3D lab setting. Turnaround times improve greatly when techs can focus on the job at hand and not participate in the image capture aspect of a 3D program. They will become much better at image manipulation and "buttonology" when they can concentrate on 3D processing, thus lowering turnaround times even more. The second step is to create a true thin-client model.

A standalone 3D lab with experienced technologists is the answer to these 3D reconstruction workflow headaches. But a clinic with the foresight to set up a 3D lab will face a few challenges. The first is the cost of the workstation and the software associated with specific reconstructions, usually in excess of $200,000; the more robust the service, the higher the cost of the software needed to complete the job.

The second challenge -- one that is often more difficult to overcome -- is the shortage of experienced 3D technologists. Many technologists have worked on some sort of 3D workstation, but few have advanced anatomy and pathology experience. The learning curve for this complicated task is steep, and most technologists take a long time to master it.

A third challenge, and the most difficult one, is the complexity of networking from the modalities to the PACS, to the workstation or server, and then back to the end-user physicians. The infrastructure of the 3D lab network is extremely important. Because of the size of the datasets associated with coronary CT (2,500 to 5,000 images), the size of the "pipe" used in both the backbone and the feeder lines that supply the lab must be carefully considered -- the bigger, the better. For the same reason, image compression is extremely important: transmission times can be significantly reduced if the data is compressed, both incoming and outgoing.

As an example, transmission across a T1 line (1.5 Mbps upstream) takes approximately six seconds per image uncompressed and two seconds per image compressed. This is a significant time savings, especially for urgent cases such as a STAT coronary CT for a distressed patient.
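The relationship between image size, line rate, and compression can be sketched with a simple calculator. The 1.544 Mbps T1 rate is standard; the 512 x 512, 16-bit slice size and the 3:1 lossless ratio are assumptions, since the article does not state the payload it used, and real links add protocol overhead on top of this idealized figure:

```python
# Idealized DICOM transfer-time estimate over a T1 line.
def transfer_seconds(image_bytes: int, link_bps: float,
                     compression_ratio: float = 1.0) -> float:
    """Seconds to send one image of image_bytes over a link_bps line,
    after lossless compression by compression_ratio : 1."""
    return (image_bytes * 8 / compression_ratio) / link_bps

T1_BPS = 1.544e6                    # standard T1 line rate
slice_bytes = 512 * 512 * 2         # uncompressed 512x512, 16-bit slice

print(f"Uncompressed: {transfer_seconds(slice_bytes, T1_BPS):.1f} s")
# Uncompressed: 2.7 s
print(f"3:1 lossless: {transfer_seconds(slice_bytes, T1_BPS, 3.0):.1f} s")
# 3:1 lossless: 0.9 s
```

The idealized times come out lower than the article's six and two seconds, which suggests its figures fold in protocol overhead or larger per-image payloads; either way, the roughly 3:1 improvement from compression holds.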

The DICOM router, which maximizes existing bandwidth, is another tool that can speed transmission times. The final piece of the puzzle is the ability to place the source and reconstructed images back into the initiating PACS. This piece may seem the easiest because both sets of images are DICOM-compliant and should therefore be accepted by any PACS. That is not the case: different PACS vendors accept images in a variety of ways.

The biggest challenge can be the source image data. These images can be recognized as duplicates of studies already in the PACS and rejected, aborting the ensuing transmission. Workarounds must be anticipated and developed to ensure smooth image delivery.

By Gary Kaiser, Ph.D., and Dr. William Shea
AuntMinnie.com contributing writers
June 23, 2008

Gary Kaiser, Ph.D., is the director of operations, 3D Division, at NightHawk Radiology Services. Dr. William Shea is vice president of the 3D Division at NightHawk Radiology Services. Both are based in Austin, TX. For more information, visit www.nighthawkrad.net.


Copyright © 2008 AuntMinnie.com
