Mapping auditory percepts into visual interfaces for hearing-impaired users
Abstract
Auditory-visual interfaces for hearing aid users have received limited attention in HCI research. We explore how to personalize audiological parameters by transforming auditory percepts into visual interfaces. In a pilot study (N = 10) we investigate the interaction patterns of smartphone-connected hearing aids. We sketch out a visual interface based on two audiological parameters, brightness and directionality. We discuss how text labels and contrasting colors help users navigate an auditory interface, and how exploring an auditory interface may enhance the user experience of hearing aids. The study indicates that contextual preferences appear to reflect cognitive differences in auditory processing. Based on the findings, we propose four items to consider when designing auditory interfaces: 1) using a map to visualize audiological parameters, 2) applying visual metaphors to turn auditory preferences into actionable interface parameters, 3) supporting user navigation with visual markers, and 4) capturing user intents when learning contextual preferences.