Vanderbilt researchers’ novel AI-driven technology aims to revolutionize cochlear implantation
Jack Noble always admired his father’s work as a family practice physician, particularly the simple act of helping people. It’s a sentiment the Vanderbilt engineer has carried into his own work developing innovative technology that aims not only to improve patients’ hearing but also to assist the surgeons performing the procedures.

“I was amazed at how much impact he had on people’s lives,” said Noble, assistant professor of electrical and computer engineering and an affiliate with the Vanderbilt Institute for Surgery and Engineering (VISE). “Coupling that with my interest in engineering and being able to become an engineer in a capacity where I could also help people and contribute to society was kind of a perfect scenario.”

As an undergraduate at Vanderbilt, Noble joined his first research project with a team working to develop minimally invasive cochlear implantation techniques. Data collected from the implantations revealed that how the implants were positioned in the inner ear could affect hearing outcomes, which gave the team an idea: create an augmented reality (AR) vision guidance system to help surgeons place cochlear implants more precisely.

Jack Noble

Earlier this year, Noble and his team received a $3.1 million National Institutes of Health/National Institute on Deafness and Other Communication Disorders (NIDCD) grant to develop and validate a vision system for improved placement of cochlear implant electrode arrays.

Cochlear implants help restore hearing for people with severe hearing loss by sending tiny electrical signals to the hearing nerve inside the inner ear (cochlea). Most patients benefit, but results vary widely. A key reason is how well the implant’s thin electrode array is positioned during surgery. If it is not placed optimally, the signals can overlap or miss parts of the nerve, making speech harder to understand.

“That ability to understand speech can really impact day-to-day quality of life for people who receive these implants,” said Noble, who is also assistant professor of computer science, biomedical engineering, otolaryngology – head and neck surgery, and hearing and speech sciences.

The system being developed by Noble and his team will help surgeons position the electrodes more precisely by guiding the surgical procedure. Powered by modern artificial intelligence, it will display 3D imaging data directly in the microscope view, allowing the surgeon to visualize critical anatomy that is normally hidden. It will also overlay the optimal surgical plan and track the 3D position of surgical tools to check how closely the plan is being followed, providing visual alerts if the tools deviate from it.
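As a rough illustration of the tool-tracking step described above, a guidance system of this kind typically compares the tracked tool tip against the planned trajectory and raises an alert when the deviation exceeds a tolerance. The sketch below is purely illustrative: the function names, coordinates, and the 0.5 mm threshold are assumptions, not details of the Vanderbilt system.

```python
import math

def point_to_segment_distance(p, a, b):
    """Shortest distance from tool-tip point p to the planned path segment a->b.
    All points are (x, y, z) coordinates in millimeters."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    # Clamp the projection so we measure against the segment, not the infinite line.
    t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def check_deviation(tool_tip, plan_start, plan_end, threshold_mm=0.5):
    """Return (distance_mm, alert): alert is True when the tracked tool
    strays farther than threshold_mm from the planned insertion path."""
    d = point_to_segment_distance(tool_tip, plan_start, plan_end)
    return d, d > threshold_mm

# Example: a tool tip 0.3 mm off a straight planned path -> no alert
plan_start = (0.0, 0.0, 0.0)
plan_end = (0.0, 0.0, 10.0)
tool_tip = (0.3, 0.0, 5.0)
dist, alert = check_deviation(tool_tip, plan_start, plan_end)
```

In a real system the tool pose would come from the AR tracking hardware at video rate, and the "plan" would be the patient-specific trajectory computed from pre-operative imaging; the geometric check itself stays this simple.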

Overall, the team is re-engineering the phases of the cochlear implant procedure using augmented reality and AI technology: customizing pre-operative plans, guiding implant placement during surgery, and personalizing post-operative device programming and rehabilitation. That includes developing comprehensive patient-specific cochlear implant models.

Sub-optimal stimulation of the auditory nerves accounts for much of the variability in cochlear implant outcomes. Existing approaches for estimating, on a patient-specific basis, how the electrodes stimulate the nerves have not been reliable enough for audiologists to consistently improve outcomes through programming adjustments. The new models will enable next-generation programming strategies that use computational simulations of implant performance to determine programming settings.

“Dr. Noble has been a leader in cochlear implant research for many years, driving innovations that are reshaping the field,” said Benoit Dawant, a team member and Cornelius Vanderbilt Professor of Electrical and Computer Engineering. “His new image-guided technology will help surgeons achieve more precise and consistent implant placement, while his digital twin models will enable personalized programming tailored to each patient’s anatomy and neural health. His vision and long-standing dedication are transforming every phase of cochlear implant procedures, and I expect this work will greatly improve many CI recipients’ quality of life.”

Other team members include René H. Gifford, former professor of hearing and speech sciences at Vanderbilt, now chief of research and audiology at Hearts for Hearing in Oklahoma City, Oklahoma; and physician collaborators Aaron Moberly, M.D., Guy M. Maness Professor in Otolaryngology at VUMC, and Robert Labadie, M.D., Ph.D., chair of the Department of Otolaryngology at the Medical University of South Carolina.

“The concept Dr. Noble proposed is straightforward: use augmented reality to see below the anatomical surface, giving surgeons tools to complete surgery more accurately and efficiently,” Labadie said. “However, putting this into practice requires aligning patient-specific anatomy from pre-op imaging with intraoperative anatomy, based on the relatively small volumes of that anatomy visible through an operating microscope, which the surgeon moves regularly. Successfully executing that task is quite difficult, but based on our track record, I am bullish that it can be achieved.”