A Vanderbilt engineering professor is working with clinical colleagues to develop and validate an augmented reality (AR) vision guidance system to help surgeons place cochlear implants more precisely. The guidance system leverages emerging artificial intelligence (AI) technology and uses inexpensive, commonly available equipment, making it practical for many operating rooms.

For this project, "Development and validation of vision system for improved placement of cochlear implant electrode arrays," Jack Noble, Ph.D., assistant professor of electrical and computer engineering, has received a $3.1 million grant from the National Institutes of Health. The team includes Benoit Dawant, Ph.D., Cornelius Vanderbilt Professor of Electrical and Computer Engineering, and physician collaborators Aaron Moberly, M.D., Guy M. Maness Professor in Otolaryngology at Vanderbilt University Medical Center, and Robert Labadie, M.D., Ph.D., chair of the Department of Otolaryngology at the Medical University of South Carolina.
“We will build, refine, and test the system thoroughly in a large multi-site laboratory study with both expert and novice surgeons to ensure effectiveness and reliability before performing a clinical trial,” said Noble, also an assistant professor of computer science and biomedical engineering. Noble is an affiliate faculty member of the Vanderbilt Institute for Surgery and Engineering (VISE), an interdisciplinary, trans-institutional institute that brings engineers and physicians together to impact health care. “This project would not have been possible without initial support and equipment provided by VISE. This is another example of how the world-class facilities provided by VISE permit launching trans-institutional, transformational research projects that ultimately improve healthcare,” said Noble.
Cochlear implants help restore hearing for people with severe hearing loss by sending tiny electrical signals to the hearing nerve inside the inner ear (cochlea). Most patients benefit, but results vary widely. A key reason is how well the implant’s thin electrode array is positioned during surgery. If it is not placed optimally, the signals can overlap or miss parts of the nerve, making speech harder to understand.
The system Noble and his team are developing will guide the surgical procedure to help surgeons position the electrodes more precisely. Powered by modern AI, the system will display 3D imaging data directly in the microscope, allowing the surgeon to visualize critical anatomy that is normally hidden. It will also overlay the optimal surgical plan and track the 3D position of surgical tools to check how closely the plan is being followed, providing visual alerts if the tools deviate from the plan.
“Think of it as a smart, real-time overlay in the surgical microscope that aligns what the surgeon sees with the rich information we can extract from the patient’s pre-surgery CT or MRI scans and that provides a step-by-step walkthrough of the procedure with corrective advice when needed,” Noble said. “The goal is to help provide surgeons critical information that they currently must mentally synthesize based on experience. With our system, surgeons with less experience performing cochlear implantation may become instant experts. This can improve hearing outcomes for individual implant recipients, but also may help broaden access to cochlear implants, permitting more of the millions of individuals who could benefit from these devices to receive an implant.”
The grant R01DC022099 is funded by the National Institute on Deafness and Other Communication Disorders (NIDCD).
Contact: brenda.ellis@vanderbilt.edu