Widex introduces AI-enabled hearing aids

Update: June 11, 2021

Widex has introduced My Sound, a portfolio of AI features including a new solution that can intelligently customise the company’s Widex MOMENT hearing aids based on a user’s activity and listening intent.

Widex was the first company to enable user-driven sound personalisation by leveraging artificial intelligence in hearing aids. My Sound marks the third generation of that AI technology, which the company claims vastly improves the usability of the AI solution by drawing on the extensive data gathered from the previous two generations.

The solution combines artificial intelligence with users’ personal real-world experience to deliver a higher level of automated customisation. Through AI modelling and clustering of data collected via the Widex SoundSense Learn AI engine, well-matched sound-profile recommendations for an individual user can now be based on the intent, needs, and preferences of thousands of users in similar real-world situations.
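
Widex has not published implementation details for this clustering step, so the snippet below is only a minimal sketch of the general idea: anonymised preference data from one activity-and-intent context is clustered, and the centroids nearest a new user’s current settings become the candidate profiles. The band count, the data layout, and the use of k-means are all assumptions for illustration, not Widex’s method.

```python
# Illustrative sketch only (assumed data layout and algorithm), showing how
# logged user preferences for one activity/intent context could be clustered
# into representative sound profiles and matched to a new user.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical anonymised records: each row is a user's preferred equaliser
# gains (low/mid/high bands, in dB) logged for the activity "dining" with
# the intent "conversation".
preference_db = rng.normal(loc=[2.0, 4.0, 6.0], scale=1.5, size=(5000, 3))

# Group the logged preferences into a handful of representative sound profiles.
model = KMeans(n_clusters=4, n_init=10, random_state=0).fit(preference_db)

# For a new user in the same context, surface the two profile centroids
# closest to their current hearing-aid settings as recommendations.
current_settings = np.array([1.0, 3.5, 7.0])
distances = np.linalg.norm(model.cluster_centers_ - current_settings, axis=1)
top_two = model.cluster_centers_[np.argsort(distances)[:2]]

for i, profile in enumerate(top_two, start=1):
    print(f"Recommendation {i}: low={profile[0]:+.1f} dB, "
          f"mid={profile[1]:+.1f} dB, high={profile[2]:+.1f} dB")
```

Offering two nearby candidates rather than a single answer mirrors the behaviour described later in the article, where the user listens to both recommendations before choosing.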

“Widex currently leads the industry by combining AI and human intelligence to create natural sound experiences and foster social participation through better hearing,” said Jodi Sasaki-Miraglia, AuD, Widex’s Director of Professional Training and Education. “Once Widex Moment is fit properly by a local licensed hearing care professional, the user can, if necessary, customise their hearing aids with ease, choosing from multiple AI features. Plus, our latest generation delivers results in just seconds, putting control and intelligent personalisation into the hands of every user.”

My Sound is integrated into the Widex MOMENT app and uses AI to draw on cloud-based data from Widex users worldwide, making sound-profile recommendations based on an individual user’s current activity and listening intent. Users launch My Sound from the app and begin by selecting their activity, such as dining, then choosing their intent, such as socialising, conversation, or enjoying music.

Based on the user’s selections, Widex can then draw on tens of thousands of real-life data points, reflecting the preferences and listening situations of other Widex users who have used the app previously. In a matter of seconds, the user is then presented with two recommendations, which can both be listened to before selecting the settings that sound best. In the event neither recommendation meets the individual user’s needs, they can launch SoundSense Learn from the same screen to further personalise their hearing experience through that solution’s sophisticated A/B testing process.
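
The article describes SoundSense Learn only as an A/B testing process, so the sketch below is a simplified stand-in rather than the actual algorithm: the user repeatedly compares two candidate settings, and the preferred one seeds the next, narrower pair. The three-band gains, round count, and shrinking search step are all assumptions made for the example.

```python
# Minimal A/B comparison loop (illustrative assumptions, not SoundSense Learn
# itself): each round the listener picks the better of two candidate settings,
# and the search narrows around that choice.
import random

def propose_pair(best, step):
    """Return two candidate 3-band gain settings perturbed around the current best."""
    a = [g + random.uniform(-step, step) for g in best]
    b = [g + random.uniform(-step, step) for g in best]
    return a, b

def ab_personalise(initial, user_prefers, rounds=6, step=3.0):
    """Run a simple A/B loop; user_prefers(a, b) returns the chosen setting."""
    best = list(initial)
    for _ in range(rounds):
        a, b = propose_pair(best, step)
        best = user_prefers(a, b)  # in the app this would be a short listening test
        step *= 0.7                # narrow the search range each round
    return best

# Demo with a simulated listener who always prefers the candidate closer to a
# hidden "ideal" profile (a stand-in for a real person's judgement).
target = [2.0, 5.0, 8.0]

def simulated_listener(a, b):
    dist = lambda s: sum((x - t) ** 2 for x, t in zip(s, target))
    return a if dist(a) <= dist(b) else b

print(ab_personalise([0.0, 0.0, 0.0], simulated_listener))
```

In practice a system like this would choose each comparison more intelligently than random perturbation, but the loop captures the basic idea of refining a personal profile from a series of paired listening choices.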