Starkey’s CTO envisions hearing aids becoming human enhancement devices

Achin Bhowmik, chief technology officer at Starkey, spent 17 years at Intel before joining the Minnesota hearing aid maker in 2017. His specialty at Intel was ‘computational perception,’ a field he describes as the intersection of cognitive neuroscience and artificial intelligence.
The idea was to figure out how humans perceive the world and translate that into ways of helping robots sense the world as we do. So how does someone who spent almost two decades studying robots end up designing hearing aids? As he sees it, it's not as big a leap as you might think.
“At Starkey, my focus didn't change. It's still computational perception. And still it's a lot about figuring out how humans sense and understand the world, but the focus is now on how we help humans sense and perceive better when we have sensory degradation,” Bhowmik told FastForward.
Today, he is working to continually improve and expand what hearing aids can do, not only to help people hear better, but also to enhance how they experience and interact with the world.
The doors of perception
When Bhowmik joined Starkey in 2017, he created a slide that has been his guide for leading engineering at the company. “I identified three areas we wanted to focus on: better hearing, health monitoring and personal assistance. In short, the vision was, could we make hearing aids multifunctional devices, much like what Apple did with the iPhone,” he said. As he points out, before Apple came along, the phone was basically for phone calls, and Apple redesigned it to handle many other tasks.
How successful has he been in achieving that early vision? It all comes together in Starkey's Edge AI hearing aids. On the core hearing function, his team began investing in AI and deep neural network technology, and AI is now improving the user experience in ways that simply weren't possible in 2017. Back then, someone wearing a hearing aid in a restaurant couldn't easily distinguish between background noise and the voices of people at their table. Today, AI-powered hearing aids can reduce background noise while enhancing the voices the wearer wants to hear, a huge improvement over devices that simply amplified every sound in the environment equally.
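To make the idea concrete, here is a minimal, hypothetical sketch of the general technique behind this kind of noise reduction: mask-based speech enhancement, in which the spectrum of the incoming audio is weighted to keep speech-dominated frequency bins and suppress noise-dominated ones. In a shipping hearing aid, a trained deep neural network would predict that mask on-device; the simple energy threshold below, and every function name and parameter in it, is an illustrative stand-in, not Starkey's implementation.

```python
# Sketch of mask-based speech enhancement (illustrative only, not Starkey's code).
# A real product would have a trained neural network predict the mask per bin;
# this stand-in uses a crude spectral gate based on relative energy.
import numpy as np
from scipy.signal import stft, istft

def enhance(audio: np.ndarray, sr: int = 16000, floor_db: float = -40.0) -> np.ndarray:
    """Suppress low-energy (presumed noise) time-frequency bins, keep the rest."""
    _, _, spec = stft(audio, fs=sr, nperseg=512)
    magnitude = np.abs(spec)
    ref = magnitude.max() + 1e-12
    # Keep bins within `floor_db` of the loudest bin; zero out everything else.
    mask = (20 * np.log10(magnitude / ref + 1e-12) > floor_db).astype(float)
    _, clean = istft(spec * mask, fs=sr, nperseg=512)
    return clean

if __name__ == "__main__":
    sr = 16000
    tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)   # stand-in for a voice
    noisy = tone + 0.05 * np.random.randn(sr)             # add background noise
    print(enhance(noisy, sr).shape)
```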
Bhowmik says the health monitoring capabilities in today’s Starkey hearing aids compare with those of the Apple Watch. Like the Apple Watch, Starkey’s aids can track how many steps you’ve taken or detect a fall. They can also assess your balance and whether you’re at risk of falling, something that’s particularly important for elderly users.

What’s more, the devices not only measure physical activity, they also gauge cognitive engagement by monitoring whether the user is isolated or regularly engaging with other people. “It's extremely important that older patients with hearing loss be socially engaged,” he said. In fact, his own mother wears his company’s hearing aids.
“So if my mother, who's wearing hearing aids, is becoming socially isolated, I want to know that. And with machine learning classification, we're able to not just monitor physical activity, but also cognitive activity, and then alert people if they seem isolated.”
This is particularly important because social isolation is linked to dementia. A Johns Hopkins study released in early 2023 found that socially isolated older adults had a 27% higher chance of developing dementia.
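For a sense of how that kind of monitoring could work in principle, the sketch below combines assumed motion and conversation features into a daily engagement score and flags a sustained drop. Every feature name, weight and threshold here is an invented assumption for illustration; Starkey has not published the details of its classifiers.

```python
# Hypothetical sketch of combining motion features (from an inertial sensor) and
# conversation features (from the microphones) to flag possible social isolation.
# All thresholds, weights and the alerting rule are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class DailySummary:
    steps: int                 # step count estimated from accelerometer data
    speech_minutes: float      # minutes of detected conversational speech
    distinct_voices: int       # rough count of other speakers heard

def engagement_score(day: DailySummary) -> float:
    """Blend physical and social activity into a 0-1 score (illustrative weights)."""
    physical = min(day.steps / 6000, 1.0)
    social = (min(day.speech_minutes / 60, 1.0) * 0.7
              + min(day.distinct_voices / 3, 1.0) * 0.3)
    return 0.4 * physical + 0.6 * social

def isolation_alert(history: list[DailySummary], threshold: float = 0.3) -> bool:
    """Alert a caregiver if engagement stays low for a full week."""
    recent = history[-7:]
    return len(recent) == 7 and all(engagement_score(d) < threshold for d in recent)

week = [DailySummary(steps=900, speech_minutes=5, distinct_voices=0) for _ in range(7)]
print(isolation_alert(week))   # True: a week of low activity would trigger an alert
```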
Making it personal
Finally, there is the notion of the hearing aid as personal assistant. Advances in generative AI enable users to simply talk to the device and get answers back in their ear that only they can hear. The feature uses proprietary models along with partnerships, such as the one the company has with OpenAI. Like any AI assistant, it can answer a broad range of questions, from the weather forecast to the latest sports scores, and it can also help translate a foreign language.
That level of functionality requires transforming the device from one that does all of its hearing enhancement locally on the aid into one that can tap external computing resources. That in turn means connecting the hearing aid to an Apple or Android smartphone via Bluetooth, something that can be set up when the person is fitted with the device.
“If you're going to do language translation or run a large language model for a natural human interface to the world of information, we run that in the cloud. So it is a distributed computing platform,” he said.
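A rough sketch of that kind of routing decision appears below. The task names and the on-device versus cloud split are illustrative assumptions about how such a distributed platform might divide its work, not a description of Starkey's actual architecture.

```python
# Illustrative sketch of an edge/cloud split: latency-critical hearing enhancement
# stays on the aid, while assistant queries are relayed through the paired phone
# to a cloud-hosted language model. Class names, task names and the split itself
# are invented for this example.
from enum import Enum, auto

class Tier(Enum):
    ON_DEVICE = auto()   # runs on the hearing aid's own processor
    CLOUD = auto()       # relayed over Bluetooth to the phone, then to the cloud

def route(task: str) -> Tier:
    """Decide where a task should run based on latency and compute needs."""
    local_tasks = {"noise_suppression", "gain_control", "fall_detection"}
    return Tier.ON_DEVICE if task in local_tasks else Tier.CLOUD

for task in ["noise_suppression", "translate_spanish", "weather_forecast"]:
    print(f"{task:20s} -> {route(task).name}")
```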

To show what the device is capable of, even for people without hearing loss, Bhowmik wears a Starkey hearing aid as he moves through his day. “I don't have hearing loss as I’ve told people publicly, and yet I'm using my device — and you can't even see it now,” he said. (I actually couldn’t see it until he pulled it out of his ear and showed it to me.)
He added, “Imagine if I have a nearly invisible product that does not get in the way of social engagement, and yet this device can listen to me. It has microphones. It can talk to me privately, so only I can hear. What's a better interface for a cloud-based assistant than a device like this?” he asked.
Security and HIPAA compliance
A medical device connected to your smartphone, one that shares personal information and uses microphones to listen to the wearer and their environment, has to meet far more stringent security requirements than a consumer product. “We have to architect how we collect, curate and use data in a completely different way from if you are simply using a consumer electronic device,” Bhowmik said.
“Our devices are all HIPAA compliant for all the data that's collected from the user. It's sitting on a HIPAA-compliant cloud. Data is not accessible or available to other apps or capabilities from other companies. So we have to guardrail it and be HIPAA-compliant, secure and private.”
Sometimes, whether the challenge is security or something else, a startup can provide an innovative solution, Bhowmik says, and Starkey actively works with startups that can help advance the kinds of technologies the company is developing.
“We cannot invent it all by ourselves. Startups, everyone who is working on innovative stuff in areas like low power, distributed computing or real time signal processing — we like to work with them,” he said.
For the past seven years, Bhowmik and his team have been working to architect hearing aids that do more than amplify sound. His ultimate goal is to make these devices indispensable, not just for people with hearing loss, but for anyone who wants access to information whenever they need it.
“I see a time when your in-ear device becomes a ubiquitous part of you, not because you have hearing loss, but because you cannot live without it,” he said.
Featured photo courtesy of Starkey.