Abstract
Despite the technological advancement of modern hearing aids (HA), many users abandon their devices due to a lack of personalization. This is caused by limited hearing health care resources, which result in users receiving only a default 'one size fits all' setting. However, the emergence of smartphone-connected HA enables the devices to learn behavioral patterns inferred from user interactions and the corresponding soundscape. Such data could enable the adaptation of settings to individual user needs depending on the acoustic environment. In our pilot study, we examine how two test subjects adjust their HA settings, and identify the main behavioral patterns that help explain their needs and preferences in different auditory conditions. Subsequently, we outline the possibilities and challenges of learning the contextual preferences of HA users. Finally, we consider how to encompass these aspects in the design of intelligent interfaces that enable smartphone-connected HA to continuously adapt their settings to context-dependent user needs.