The researchers, led by presenter Ming Li, PhD, and Dr. Bradford Wood, created 3D volumetric models of internal patient anatomy by processing medical images with computer software (Vuforia SDK, Vuforia). They aligned these models with the patient's body using a 3D marker.
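In broad terms, this kind of marker-based alignment amounts to applying the tracked marker's pose as a rigid transform to the image-derived model. The following Python sketch is purely illustrative and not the researchers' software; the rotation, translation, and vertex values are hypothetical placeholders, and no Vuforia SDK calls are shown.

```python
# Illustrative sketch (not the study's code): placing a CT-derived 3D model
# into patient/world space using the pose of a tracked 3D marker.
# Assumes the AR tracker has already reported the marker's pose as a
# rotation matrix R and translation vector t, and that the model's vertices
# are expressed in the marker's coordinate frame.
import numpy as np

def align_model_to_patient(vertices_marker, R, t):
    """Rigidly transform model vertices from marker space into world space."""
    vertices_marker = np.asarray(vertices_marker, dtype=float)  # shape (N, 3)
    return vertices_marker @ R.T + t                             # shape (N, 3)

# Hypothetical pose: marker rotated 90 degrees about the z-axis and
# shifted 10 cm along x relative to the camera.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.10, 0.0, 0.0])  # meters

model_vertices = np.array([[0.00, 0.00, 0.00],   # e.g., a lesion center
                           [0.02, 0.01, 0.03]])  # e.g., a skin entry point
print(align_model_to_patient(model_vertices, R, t))
```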
Clinicians who put on a pair of R-7 "smart glasses" (Osterhout Design Group) were able to look at these 3D models and view the planned needle trajectory for an interventional procedure overlaid onto the patient's body.
"The system ... enables the interventional radiologist to see below the skin during needle procedures by wearing simple goggles or alternatively looking through a smartphone in a sterile cover," Wood told AuntMinnie.com.
Most image-guided interventions, including biopsy and ablation, require real-time visualization of a needle or a specific region of the body in relation to the targeted lesion, he said. With the AR technique, the operating clinician had access to much the same imaging data as with conventional methods -- but without having to refer back to a separate display.
Wood and colleagues evaluated their technique on an interventional phantom and found that clinicians who used the AR device for needle insertion hit their targets with high accuracy. The average distance from the needle tip to the target center point was less than 4 mm.
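The reported accuracy figure corresponds to a simple targeting error: the distance from each needle tip to the target's center point, averaged across insertions. The short sketch below illustrates that calculation with made-up placeholder coordinates; it is not the study's data or analysis code.

```python
# Illustrative sketch of the accuracy metric: mean Euclidean distance from
# each needle-tip position to the target center. All coordinates are
# hypothetical placeholders, not measurements from the study.
import numpy as np

def mean_tip_to_target_distance(needle_tips_mm, target_mm):
    tips = np.asarray(needle_tips_mm, dtype=float)  # shape (N, 3), in mm
    return float(np.mean(np.linalg.norm(tips - target_mm, axis=1)))

needle_tips = [[101.2, 50.3, 30.1],
               [ 98.9, 49.5, 29.4],
               [100.5, 51.1, 31.0]]
target = np.array([100.0, 50.0, 30.0])
print(f"Mean tip-to-target distance: "
      f"{mean_tip_to_target_distance(needle_tips, target):.1f} mm")
```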
"Information is power, and it just makes sense to have that information available to the operator when he or she needs it most," Wood said.