MONITOR - Multimodal presentation and exploration of time-dependent real-time data by blind people
Previous research on information visualization, including mobile HCI, lacks concepts for making the monitoring and analysis of time-dependent real-time data streams accessible to blind people. Real-time data occurs in many occupational and recreational contexts, such as manufacturing, logistics, sports, assisted living, nursing, transportation, and navigation, at frequencies of seconds, minutes, or hours. Without the ability to respond in time, many blind people are excluded from key activities in these contexts, because auditory presentation methods serialize the information and tactile presentation methods require time-consuming exploration with the hands or other extremities.
We propose combining auditory and haptic modalities to support non-visual perception of dynamic data: input via gestures and speech, output via voice, sound, and vibration. We are conducting studies on the intelligibility of different output and input modalities for analyzing time-dependent real-time data streams with mobile devices and wearables. Particular attention is paid to the perceptual offset (the time between a data change occurring and that change being perceived), to different output rhythms and patterns, and to the appropriate data update rate for each modality. In addition, combinations of these modalities will be investigated and tested with regard to perception and comprehensibility.
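To make the notions of perceptual offset and per-modality update rate concrete, here is a minimal Python sketch. The class names, update intervals, and event timings are hypothetical illustrations, not artifacts of the project: each output channel polls a data stream at its own fixed rate, so a slower channel presents a change later, and that delay is exactly the perceptual offset defined above.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """A single change in the monitored data stream."""
    value: float
    occurred_at: float  # seconds since start of the stream

@dataclass
class ModalityChannel:
    """A non-visual output channel (e.g. sound or vibration) that
    emits updates at its own fixed rate."""
    name: str
    update_interval: float  # seconds between output updates
    offsets: list = field(default_factory=list)

    def present(self, event: Event, now: float) -> None:
        # Perceptual offset: time between the data change occurring
        # and the moment the channel actually presents it.
        self.offsets.append(now - event.occurred_at)

def simulate(channel: ModalityChannel, events: list, duration: float):
    """Replay a recorded stream against one channel and return the
    mean perceptual offset."""
    t, pending = 0.0, sorted(events, key=lambda e: e.occurred_at)
    while t <= duration:
        due = [e for e in pending if e.occurred_at <= t]
        if due:
            channel.present(due[-1], t)  # newest change wins; older ones are superseded
            pending = [e for e in pending if e.occurred_at > t]
        t += channel.update_interval
    return sum(channel.offsets) / len(channel.offsets) if channel.offsets else None

# Hypothetical stream: three data changes at 0.3 s, 1.1 s, and 2.7 s.
events = [Event(1.2, 0.3), Event(0.8, 1.1), Event(1.9, 2.7)]
sound = ModalityChannel("sound", update_interval=0.25)         # fast channel
vibration = ModalityChannel("vibration", update_interval=1.0)  # slow channel
print("mean offset, sound:", simulate(sound, events, 4.0))          # ~0.13 s
print("mean offset, vibration:", simulate(vibration, events, 4.0))  # ~0.63 s
```

In this toy run the slower vibration channel shows a mean offset roughly five times that of the sound channel; this is the kind of trade-off between update rate and timeliness that the planned studies are meant to quantify for each modality.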
Aims of the project
The primary goal of this work is to design and develop a multimodal, non-visual approach to exploring real-time data streams with mobile devices and wearables, enabling blind people to analyze such data independently. The sub-goals are:
- Investigate blind people's perception and interpretation of multimodal, non-visual data presentation using sonification and haptification under temporal constraints
- Investigate how blind people can interactively, effectively, and efficiently explore presentations of time-dependent real-time data streams through combinations of voice input and gestures
- Investigate to what extent a multimodal presentation of time-dependent real-time data streams influences the usability of user interfaces for sighted people, and how sighted people can benefit from an accessible multimodal (visual and non-visual) presentation of such data streams
Information about the project
Project duration: 01.03.2024 to 28.02.2026
Funding: Deutsche Forschungsgemeinschaft (DFG, German Research Foundation), project number 514258978
Employees and collaborators
Research Associate
Ms Emma Franziska Etzold, M.Sc.
Visiting address:
Andreas-Pfitzmann-Bau, Room 1059
Nöthnitzer Str. 46
01187 Dresden