Veterans Affairs

Augmented reality surgical navigation system

Combining a holographic image of the surgical area, captured prior to the procedure, with real-time images gives surgeons a remarkable ability to monitor changes

Software & Information Technology | Medical & Biotechnology

The Department of Veterans Affairs is making available a novel tool for enhancing medical surgery. The technology is available, through a patent license agreement, to companies that would make, use, or sell it.

Col. Edward Anderson, 99th Medical Group orthopedic spine surgeon, performs a lumbar microdiscectomy surgery at Nellis Air Force Base in 2018. (Air Force photo)

A major challenge during surgery is for the surgeon to differentiate between diseased tissue and healthy tissue. Currently, surgeons can make decisions based only on their best guess and experience, or use a neuronavigation system to correlate a body structure with images taken before the surgery.

Current neuronavigation systems display the location of a pointer on a screen and hence require the surgeon to look away from the patient. Augmented reality devices offer an alternative view that can help surgeons make decisions without looking away from the patient.

Augmented reality can also benefit intra-operative imaging, the process of taking an image during surgery and comparing it to a previous image. Today this can mean moving the patient to an MRI scanner, which is costly, time-consuming, and risky.

To overcome some of these deficiencies, VA researchers are working to introduce more functional augmented reality into surgical navigation.

Their proposed system consists of three parts: (1) acquisition of any medical image of interest, whether from magnetic resonance imaging, computed tomography, ultrasound, a microscope, or any other device; (2) an augmented reality system with the capability to detect changes in the environment, render holograms, and accurately place holograms relative to the environment; and (3) a computer that can take medical images, convert them into holograms, and place them relative to the patient's body via coregistration.

The coregistration uses information from both the image of interest and the environment, then uses this mutual information to place the image of interest relative to the environment. After that, either the structures of interest from the image can be displayed as a hologram in the real world, or the image and objects of interest can be displayed on a screen.
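As an illustration of the kind of alignment step described above, the sketch below registers two 2D image slices by maximizing their mutual information over a small grid of translations. Everything here is a toy assumption for demonstration: real surgical systems register full 3D multimodal volumes with dedicated libraries, not a brute-force 2D search, and the function names are invented for this example.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate mutual information between two equally sized images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0  # only nonzero joint probabilities contribute
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def best_translation(fixed, moving, search=5):
    """Grid-search integer shifts, keeping the one with the highest MI."""
    best, best_mi = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            mi = mutual_information(fixed, shifted)
            if mi > best_mi:
                best_mi, best = mi, (dy, dx)
    return best

# Toy demo: recover a known displacement of a synthetic image.
rng = np.random.default_rng(0)
fixed = rng.random((64, 64))
moving = np.roll(np.roll(fixed, -3, axis=0), 2, axis=1)
print(best_translation(fixed, moving))  # (3, -2): the shift that undoes the displacement
```

Mutual information is a common choice for this kind of alignment because it rewards statistical agreement between images rather than identical pixel values, which matters when the pre-operative scan and the live view come from different modalities.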

Since the augmented reality system continually updates its map of the objects in its surroundings, it can detect when a structure moves during surgery. If, for example, a nerve is moved to a new location, the system can rearrange the holographic representation, going beyond the initial medical image, to reflect the nerve's updated location.
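In the simplest case, an update like the one described above amounts to applying the detected displacement of a tracked landmark to the hologram anchored to it. The class and coordinates below are hypothetical, purely to make the idea concrete:

```python
import numpy as np

class HologramAnchor:
    """Toy anchor that keeps a hologram attached to a tracked landmark."""

    def __init__(self, position):
        self.position = np.asarray(position, dtype=float)

    def update(self, old_landmark, new_landmark):
        # Shift the hologram by the same displacement the landmark underwent.
        delta = np.asarray(new_landmark, float) - np.asarray(old_landmark, float)
        self.position += delta

# A hologram placed 10 mm lateral and 5 mm superior to a tracked nerve.
anchor = HologramAnchor([10.0, 5.0, 0.0])
# The tracker reports the nerve moved during surgery.
anchor.update([0.0, 0.0, 0.0], [2.0, -1.0, 0.5])
# anchor.position is now (12, 4, 0.5)
```

A real system would compose full rigid or deformable transforms rather than translations, but the principle is the same: the hologram follows the measured motion of the anatomy, not the static pre-operative image.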

The system may also detect changes in internal organs. For example, during neurosurgery the brain can become edematous. The system may detect a change in the size of the brain and correlate the change with the previously acquired medical image. Hence, brain edema can be quantified.
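One way such quantification could work, as a minimal sketch, is to compare the volume of a pre-operative segmentation with an updated intra-operative one. The spherical masks and voxel spacing below are synthetic stand-ins, not clinical data:

```python
import numpy as np

def volume_ml(mask, voxel_spacing_mm=(1.0, 1.0, 1.0)):
    """Volume of a boolean segmentation mask in milliliters."""
    voxel_mm3 = float(np.prod(voxel_spacing_mm))
    return mask.sum() * voxel_mm3 / 1000.0  # 1 mL = 1000 mm^3

# Toy masks: a sphere before surgery, and a slightly larger one after swelling.
zz, yy, xx = np.mgrid[:64, :64, :64]
r2 = (zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2
pre = r2 <= 20 ** 2
post = r2 <= 21 ** 2

swelling = volume_ml(post) - volume_ml(pre)
print(f"estimated edema: {swelling:.1f} mL")
```

The same voxel-counting idea extends to the other measurements the article mentions, such as tracking organ shape over time during a procedure.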

Similarly, the system could detect blood loss during surgery. Image processing and updates to object shapes can also help inform surgeons and other medical staff about real-time cardiac output and lung function during cardiac surgery.

Do you have questions or need more information on a specific technology? Let's talk.

Contact Us