Student Projects
Table of contents
- LaTeX Template for Bachelor's or Master's Theses
- Ideas for New Theses
- Gesture recognition for controlling common Office applications for blind and severely visually impaired people
- Accessible teaching: checklist-based self-assessment on the accessibility of course materials, sensitization of lecturers
- Supporting People on the Autism Spectrum at Work by Improving Work Organization and Reducing Stimuli
- Multimodal presentation and exploration of time-dependent real-time data by blind people
- Conversational Agents in Mental Health Care
- Accessible Maps for Buildings
- Diversity-Sensitive Human-Computer Interaction and AI
- Social Assistance Robots ― Usability and Software Variability
- Improving Presence Lecturing for Blind and Visually Impaired People
- Physical Information Visualisation
- Completed Theses
Are you interested in writing your Bachelor's, Master's or Diploma thesis here at HCI? Please have a look at the following list of possible topics or fields (each with a responsible contact) or browse through the theses in progress and the finished work. Feel free to have a look at our research projects, too.
LaTeX Template for Bachelor's or Master's Theses
We offer a LaTeX template to support you in writing your bachelor's, master's or diploma thesis. The template can be used equally for German and English theses and comprises common elements, such as a title page and the confirmation statement, as well as the commonly used styles and formats. You can download, clone and fork the template from its GitHub repository TUD-HCI-Thesis.
Furthermore, you will find information on creating accessible documents on the website of the Working Group Services Disability and Studies (AG SBS). For those who want to create accessible PDF documents using Word, PowerPoint or InDesign in particular, instructions are available directly in the AG SBS download section.
Ideas for New Theses
Gesture recognition for controlling common Office applications for blind and severely visually impaired people
Contact person: Bela Usabaev
Keywords: Human-Computer Interaction, Haptification
In order to navigate the visual desktop interface, blind and severely visually impaired people use screen readers such as NVDA to output the displayed content using speech. A Braille display or a standard keyboard is usually used for input.
On tablets and cell phones, blind users control applications (apps) with gestures such as “tap” and “swipe”, which correspond to the “Enter”, “Tab” or “Arrow” keys, and receive feedback by voice (or in Braille via a Braille display).
In the field of education, tactile graphics are often used to convey graphically coded complex facts, such as maps or diagrams, through the sense of touch and explanatory voice output. With the help of computer vision, it is possible to implement gesture control on a tactile graphic via a camera image.
As part of developing a voice assistant for tactile graphics in the project TactonomDuo, we want to investigate which gestures are useful for common Office programs that are frequently used in everyday professional life, such as Excel, Calendar and Explorer, and to develop and evaluate a gesture vocabulary for controlling these applications.
Link to project web site: TactonomDuo
The following topics are of interest in this context:
- Gesture recognition from a camera image with the Raspberry Pi
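As a starting point, the following minimal sketch illustrates the idea: it assumes a camera accessible as a standard video device on the Raspberry Pi, tracks the index fingertip with MediaPipe Hands, and maps a simple dwell gesture to an "Enter" key press via pynput. The gesture, thresholds and key mapping are illustrative assumptions and not part of the TactonomDuo design.

```python
# Minimal sketch: dwell-to-"Enter" gesture recognition from a camera image.
# The dwell gesture, thresholds and key mapping are illustrative assumptions.
from collections import deque

import cv2
import mediapipe as mp
from pynput.keyboard import Controller, Key

DWELL_FRAMES = 15      # roughly 0.5 s at 30 fps (assumption)
DWELL_RADIUS = 0.02    # max fingertip movement in normalised image coordinates

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
keyboard = Controller()
recent = deque(maxlen=DWELL_FRAMES)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        tip = result.multi_hand_landmarks[0].landmark[
            mp.solutions.hands.HandLandmark.INDEX_FINGER_TIP]
        recent.append((tip.x, tip.y))
        # A "dwell": the fingertip stays within a small radius for N frames.
        if len(recent) == DWELL_FRAMES and max(
            abs(x - recent[0][0]) + abs(y - recent[0][1]) for x, y in recent
        ) < DWELL_RADIUS:
            keyboard.press(Key.enter)    # hand the gesture over as a key event
            keyboard.release(Key.enter)
            recent.clear()
    else:
        recent.clear()
cap.release()
```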
Accessible teaching: checklist-based self-assessment on the accessibility of course materials, sensitization of lecturers
Contact Person: Emma Etzold
Learning management systems (LMS) such as OPAL allow teachers to make their course materials available online. They can choose from a variety of options, from uploading files that students can download to self-programmed website content embedded in the LMS. Teachers are responsible for how accessible their materials are. This means that they decide whether and to what extent, for example, alternative texts for images or video transcriptions exist. Depending on the scope and type of materials, different accessibility criteria are relevant.
In Germany, public institutions, including universities such as the TUD, are legally required to make their digital content accessible. This means that they must consider the criteria of the Barrierefreie-Informationstechnik-Verordnung (BITV). These criteria are usually difficult to understand and are primarily aimed at experts such as web developers or accessibility specialists. However, many university teaching staff are not among these specialists and know the specific guidelines only partially or not at all. Nevertheless, many of them strive to increase the accessibility of their course content and to make learning accessible to all students.

To support them, the chair has developed a concept and a prototype for a checklist-based self-assessment of a course's accessibility. The first step is to create an "accessibility statement" that informs students about how accessible or inaccessible the course materials are. This should also create an opportunity to report existing barriers directly to the person responsible, so that they can be removed as quickly as possible or alternative access to the information can be provided. We also want to use this checklist to sensitize and motivate lecturers regarding accessibility. That is why, in addition to the checklist, there is a handbook that teachers can use to learn about the individual criteria in an easily understandable way and to see how to implement them in an LMS such as OPAL.
The following topics and questions are of special interest for theses on this topic:
- What checklists already exist for specific file formats?
- How easy are such checklists to understand for non-specialists?
- How can BITV criteria be implemented in an LMS (OPAL, ILIAS, Moodle) to create accessible online courses?
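To make the checklist concept described above more concrete, the following minimal sketch shows how checklist items could be modelled and summarised into an accessibility statement. The criteria, scoring and wording are hypothetical examples, not the chair's prototype.

```python
# Hypothetical sketch of a checklist-based self-assessment; the criteria and
# the wording of the generated statement are illustrative examples only.
from dataclasses import dataclass

@dataclass
class CheckItem:
    criterion: str        # e.g. a simplified BITV/WCAG criterion
    applies: bool         # is the criterion relevant for the course materials?
    fulfilled: bool       # did the teacher confirm it is met?

def accessibility_statement(course: str, items: list[CheckItem]) -> str:
    relevant = [i for i in items if i.applies]
    met = [i for i in relevant if i.fulfilled]
    open_issues = [i.criterion for i in relevant if not i.fulfilled]
    lines = [f"Accessibility statement for '{course}':",
             f"{len(met)} of {len(relevant)} relevant criteria are fulfilled."]
    if open_issues:
        lines.append("Known barriers: " + "; ".join(open_issues) + ".")
        lines.append("Please report further barriers to the course owner.")
    return "\n".join(lines)

print(accessibility_statement("Example course", [
    CheckItem("Images have alternative texts", True, False),
    CheckItem("Videos have transcripts", True, True),
    CheckItem("PDFs are tagged", False, False),   # not applicable here
]))
```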
Supporting People on the Autism Spectrum at Work by Improving Work Organization and Reducing Stimuli
Mental illness and developmental disorders are often related to a reduced ability to work and adapt, which frequently leads to reduced employability. Compared to people without autism, people with autism are much more likely to be unemployed or to work in jobs with low qualification requirements, even when they have a good vocational or school education. Challenging working conditions such as time pressure, information overload, high communicative and social demands, sensory overstimulation, and the demand for a high degree of flexibility are the major reasons for the low employment rate of people with autism in the primary labor market. So far, only a few technical aids are available that effectively support people with autism in their everyday work.
The field of research comprises the development of technical assistance systems to support task and time management on the one hand, and tools to support stimulus reduction on the other. The aim is to develop an adaptive noise cancellation method with ANC headphones that, for example, detects the context of a specific conversation situation and adaptively filters out irrelevant signals. Concepts for the use of data glasses, e.g. for person identification or emotion recognition, will be developed as well. In addition, haptic, natural interaction modalities using tangibles are to be used to improve task and schedule planning and thus enable structured work with minimal disruptions.
Theses in this research field should address at least one of the following three problem areas:
- Stimulus Reduction: reduction of stimuli to avoid overstimulation
- Communication: support of verbal and textual communication
- Task Management: support of structuring and prioritizing in task and time management
Depending on the specific scientific question, the focus of this work can be on technical components and their possible applications as well as on concepts for supporting people on the autism spectrum at work.
Project website with further information: https://tu-dresden.de/ing/informatik/ai/mci/forschung/forschungsgebiete/autark?set_language=en
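As a toy illustration of the stimulus-reduction idea mentioned above (pass signals that are likely speech, attenuate everything else), the following sketch gates audio frames by the share of their energy in the typical speech band. Band limits, threshold and attenuation factor are assumptions, and the sketch is far simpler than an adaptive ANC method.

```python
# Toy sketch of context-dependent stimulus reduction: frames whose spectral
# energy lies mostly outside the speech band are attenuated. Band limits,
# threshold and attenuation are assumptions, not the project's ANC approach.
import numpy as np

def speech_gate(signal, sample_rate, frame_ms=30, low=300.0, high=3400.0,
                band_share_threshold=0.6, attenuation=0.1):
    frame_len = int(sample_rate * frame_ms / 1000)
    out = signal.astype(float).copy()
    window = np.hanning(frame_len)
    freqs = np.fft.rfftfreq(frame_len, 1.0 / sample_rate)
    in_band = (freqs >= low) & (freqs <= high)
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = out[start:start + frame_len]
        power = np.abs(np.fft.rfft(frame * window)) ** 2
        share = power[in_band].sum() / (power.sum() + 1e-12)
        if share < band_share_threshold:        # frame is probably not speech
            out[start:start + frame_len] *= attenuation
    return out

# Example: a 100 Hz machine hum is attenuated because its energy
# lies below the assumed speech band.
sr = 16000
t = np.arange(sr) / sr
quieter = speech_gate(np.sin(2 * np.pi * 100 * t), sr)
```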
Multimodal presentation and exploration of time-dependent real-time data by blind people
Contact Person: Emma Etzold
Keywords: Human-Computer Interaction, multimodal data presentation, Haptification, Sonification
The visualisation of dynamic data is a standard method for analysing data and quickly deriving information from it. When monitoring time-dependent real-time data streams (e.g., machine monitoring in a production context), visualisations (e.g., dashboards) are often used to get a quick overview of the data volume and the temporal progress. Most data representations are purely visual and rely on visual abilities to convey information quickly. However, a purely visual data representation excludes blind people from professional data monitoring.
In this research area, multimodality is used to enable greater inclusion of blind people in data monitoring. The aim is to develop a multimodal, non-visual method for exploring real-time data streams using off-the-shelf mobile devices and wearables, enabling blind people to analyse them independently.
In this context, the following research questions are interesting:
- Which (multimodal) approaches of data presentation are equally suitable for blind and sighted people? (Information/data visualisation)
- Which non-visual output media can be usefully combined with each other? (Perception and interpretation of output media)
- How can the user’s attention be focused on certain events (e.g., exceeding thresholds)? (human-activity-recognition)
- Which input modalities can be usefully combined with each other? (interaction)
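A minimal sonification sketch for such a stream is shown below: every value of the data stream is mapped to a short tone whose pitch follows the value, and crossing a threshold adds a distinct alert beep. The frequency range, tone length and alert design are assumptions for illustration only.

```python
# Minimal sonification sketch: map each value of a data stream to a short tone
# whose pitch follows the value; exceeding a threshold adds an alert beep.
# Frequency range, tone length and alert design are illustrative assumptions.
import numpy as np

def sonify(values, threshold, sample_rate=44100, tone_ms=150,
           f_low=220.0, f_high=880.0):
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    n = int(sample_rate * tone_ms / 1000)
    t = np.arange(n) / sample_rate
    chunks = []
    for v in values:
        freq = f_low + (v - lo) / span * (f_high - f_low)
        chunks.append(0.4 * np.sin(2 * np.pi * freq * t))
        if v > threshold:                      # draw attention to the event
            chunks.append(0.6 * np.sin(2 * np.pi * 1760.0 * t[: n // 3]))
    return np.concatenate(chunks)

# e.g. write the result to a WAV file or route it to a haptic/braille channel
audio = sonify([3.0, 3.2, 3.1, 4.8, 3.0], threshold=4.0)
```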
Conversational Agents in Mental Health Care
Contact Person: Julian Striegl
With rising numbers of people with mental health problems worldwide and an already struggling healthcare and mental health system, new solutions have to be found to improve the mental wellbeing of people of all age groups. In the past, computerized therapy approaches – such as application-based and internet-based cognitive behavioral therapy (CBT) – have already been investigated and have been shown to be effective for several forms of mental disorders, such as depression, anxiety disorders or obsessive-compulsive disorder.
In the last decade, research interest in chatbot-based CBT has grown, with voice assistant-based CBT (VA-based CBT) being a possible next step towards scalable, low-threshold, easy-to-use and accessible solutions for mental health support. The central question is: how can we create conversational agents (such as chatbots and voice assistants) that are able to support the mental wellbeing of users through methods grounded in psychotherapy, positive psychology and mental health coaching?
Subject areas in this field of research include: artificial intelligence and machine learning, natural language processing, voice user interface design and interaction design combined with elements of clinical psychology and psychotherapy. Furthermore, several research questions can be derived from the central topic:
- How can we design suitable voice user interfaces that take the diverse needs of people with different communication needs and capabilities into account in order to provide accessible, low threshold therapy support?
- How does the usability and acceptance of VA-based CBT compare to chatbot-based CBT?
- Which methods from fields such as CBT and positive psychology are suitable to be presented/administered via conversational agents?
- How can we support salutogenesis through chatbots and voice assistants?
- How does an emotion-sensitive dialog management affect the acceptance of and adherence to VA-based CBT?
- How does the effectiveness of VA-based CBT compare to alternative application-based and internet-based CBT approaches?
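As a lightweight illustration of the dialog-management aspect, the following sketch implements a single rule-based, emotion-sensitive turn: a keyword heuristic estimates the user's mood and selects the next CBT-style prompt accordingly. Keywords, prompts and flow are hypothetical placeholders and not clinically validated content.

```python
# Toy rule-based, emotion-sensitive dialog turn for a CBT-style chatbot.
# Keywords, prompts and flow are hypothetical placeholders, not validated
# therapeutic content.
NEGATIVE = {"sad", "anxious", "stressed", "tired", "overwhelmed"}
POSITIVE = {"good", "happy", "calm", "better", "fine"}

PROMPTS = {
    "negative": "That sounds hard. Which thought went through your mind just "
                "before you started feeling this way?",
    "positive": "Nice to hear. What contributed to that, and how could you "
                "repeat it tomorrow?",
    "neutral":  "Thanks for sharing. Could you describe the situation in a "
                "bit more detail?",
}

def estimate_mood(utterance: str) -> str:
    words = set(utterance.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

def next_turn(utterance: str) -> str:
    return PROMPTS[estimate_mood(utterance)]

print(next_turn("I feel really stressed about work"))
```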
Accessible Maps for Buildings
Keywords: Accessibility, Indoor Maps
The research field Accessible Maps for Buildings addresses different problems in the field of indoor maps, such as the acquisition of building data and its subsequent use. On the one hand, the research ranges from using computer vision to transform escape plans into digital building plans, to SLAM methods used to improve maps. On the other hand, it covers user-centered design to generate maps for specific user groups or to develop routing applications.
Cartography of Buildings
Contact Persons: Jan Schmalfuß-Schwarz, Julian Striegl
The research area of the cartography of buildings deals with the generation, modelling, improvement and updating of indoor map data. In this regard, map formats are evaluated and developed which, in addition to the purely geometric representation of the building, also store information about accessibility; approaches for capturing time-critical (dynamic) barriers in indoor maps are discussed as well. Moreover, the focus lies on solutions that make it possible to extract indoor maps from various source formats, such as architectural building plans or escape plans, using image processing and deep learning methods. These solutions have to be examined with regard to their accuracy and optimized if necessary; to this end, approaches from robotics as well as from crowdsourcing are used.
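One possible way to think about such a map format is sketched below: ordinary indoor geometry is annotated with accessibility attributes and a validity period for time-critical (dynamic) barriers. The attribute names and structure are assumptions for illustration, not a finished format.

```python
# Hypothetical sketch of an indoor map feature with accessibility attributes
# and a validity period for a dynamic barrier; names are illustrative only.
feature = {
    "type": "Feature",
    "geometry": {"type": "LineString",
                 "coordinates": [[13.723, 51.025], [13.724, 51.025]]},
    "properties": {
        "kind": "corridor",
        "level": 2,
        "accessibility": {
            "step_free": True,
            "min_width_cm": 140,
            "tactile_guidance": False,
        },
        # a dynamic barrier, e.g. construction work blocking the corridor
        "barrier": {"type": "construction",
                    "valid_from": "2024-05-01",
                    "valid_until": "2024-05-14"},
    },
}
```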
Usage of Map Data
Contact Persons: Julian Striegl, Christin Engel
In addition to the creation of visual building maps, the usage of such maps is itself a diverse topic. Both the generation of physical maps and their mobile use are examined using various modalities. On the one hand, the focus is on the generation of user-centered maps that address the specific needs of people with disabilities. This includes the generation of barrier-free maps (for example, tactile or physical maps for haptic exploration). On the other hand, the goal of the research field is to explore mobile interaction techniques with digital maps to ensure user-friendliness for a variety of use cases, contexts and user groups. This includes – among other topics – the design and creation of voice user interfaces and the accessible representation of (and interaction with) digital map data using screen readers. In this context, the use of location-based services in connection with indoor maps is also examined.
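Building on map data annotated in this way, accessibility-aware routing can be reduced to searching only over edges a given user can actually use. The following minimal sketch uses networkx; the graph, attributes and user requirement are hypothetical.

```python
# Minimal sketch of accessibility-aware indoor routing: only edges the user
# can use (here: step-free) are considered. Graph and attributes are invented.
import networkx as nx

G = nx.Graph()
G.add_edge("entrance", "elevator", length=20, step_free=True)
G.add_edge("entrance", "stairs_2F", length=8, step_free=False)
G.add_edge("elevator", "room_204", length=15, step_free=True)
G.add_edge("stairs_2F", "room_204", length=10, step_free=True)

def accessible_route(graph, start, goal, needs_step_free=True):
    usable = graph.edge_subgraph(
        (u, v) for u, v, data in graph.edges(data=True)
        if data["step_free"] or not needs_step_free)
    return nx.shortest_path(usable, start, goal, weight="length")

print(accessible_route(G, "entrance", "room_204"))          # via the elevator
print(accessible_route(G, "entrance", "room_204", False))   # may use the stairs
```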
Diversity-Sensitive Human-Computer Interaction and AI
Contact Person: Claudia Loitsch
Keywords: human-computer interaction, user modeling, adaptive systems, human-centered design, diversity-sensitive design, user experience, user research, AI
The penetration of digitalization into all areas of life and all age groups means that we now perform almost all tasks with some form of computer assistance. People, with their individual experiences, expectations, abilities and needs, therefore play an even more important role in the development of computer-based solutions and user interfaces. The overall user experience and digital participation have an increasing economic and social relevance.
This topic investigates how diversity, or the variety of people, can be taken into account in the human-centered design of computer-based systems and AI.
The idea is that computer-based systems and applications should adapt to our individual needs, abilities, and mental models to help us perform everyday tasks as efficiently as possible and, increasingly, in enjoyable or fun ways. This requires that digital applications and user interfaces become more intelligent and can adapt to our individual characteristics, abilities or life situations. It is also increasingly important that key technologies such as artificial intelligence take diversity into account, in particular that training data is fair and balanced, and does not promote discrimination and exclusion.
Topics will be developed with you depending on your interests. A wide range of questions can be considered in this topic area, such as:
- How do different expressions of personality traits (e.g., Big Five/OCEAN Model) affect digital design solutions?
- Which perceptual or behavioural psychological aspects, design guidelines and design decisions can be applied for diverse target groups (e.g. elderly people, people with a migration background, people with cognitive impairments)?
- How can diversity be achieved and detected in training data for machine learning and AI?
- Which user models address the diversity of groups or individuals in a way that leads to a better user experience in adaptable or adaptive applications and user interfaces?
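As a small illustration of the training-data question, the following sketch reports how groups are represented in a labelled dataset and flags groups that fall below a chosen minimum share. The attribute, threshold and data are hypothetical.

```python
# Toy representation check for training data: report group shares and flag
# groups below a minimum share. Attribute, threshold and data are invented.
from collections import Counter

def representation_report(samples, attribute, min_share=0.2):
    counts = Counter(s[attribute] for s in samples)
    total = sum(counts.values())
    report = {}
    for group, count in counts.items():
        share = count / total
        report[group] = (round(share, 2),
                         "under-represented" if share < min_share else "ok")
    return report

training_data = [
    {"age_group": "18-30"}, {"age_group": "18-30"}, {"age_group": "18-30"},
    {"age_group": "31-64"}, {"age_group": "31-64"},
    {"age_group": "65+"},
]
print(representation_report(training_data, "age_group"))
```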
Social Assistance Robots ― Usability and Software Variability
Contact Person: David Gollasch
Keywords: Human-Robot Interaction, Robotic Cooperation, Emotion Recognition and Modelling, Software Variability, Software Ecosystems, Voice Interaction, Gesture Control, Smart Home, Adaptivity, Accessibility, Assistive Systems
Assistive technology improves or enables participation in society as well as coping with everyday life for many people. The trend towards an aging society and the desire for greater inclusion make the field of developing powerful assistive technology increasingly significant. A subset of such technology are social assistance robots (SARs), which are developed for a variety of tasks and purposes. One key issue in this regard is how to efficiently develop software for those assistive robots so that people with special needs are able to interact well with them. Thus, we need to design feasible and reliable development methodologies that lead to appropriate human-robot interaction.
This gives rise to a variety of interesting research questions. Here are some examples:
- What should diversity-sensitive voice interaction look like (related fields: voice user interfaces, conversational user interfaces, artificial intelligence)?
- How to implement assistance robot activities that dynamically match specific requirements (related fields: adaptivity, UI software variability, artificial intelligence)?
- What input modalities should be supported by an assistance robot (related fields: gesture control, emotion recognition and modelling)?
- How to embed the assistance robot into its environment (related fields: smart home integration, context-sensitivity, ambient assisted living, ADL recognition)?
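One way to approach the software-variability question is to describe a robot's interaction capabilities as a feature configuration that is resolved against an individual user profile. The features, profile attributes and selection rules below are hypothetical.

```python
# Hypothetical sketch: interaction modalities of a social assistance robot as
# a feature configuration, resolved against an individual user profile.
from dataclasses import dataclass

@dataclass
class RobotFeatures:
    voice_output: bool = True
    voice_input: bool = True
    gesture_input: bool = False
    touch_display: bool = True

@dataclass
class UserProfile:
    hearing_impaired: bool = False
    low_vision: bool = False
    prefers_speech: bool = True

def select_modalities(features: RobotFeatures, user: UserProfile) -> list[str]:
    modalities = []
    if features.voice_output and not user.hearing_impaired:
        modalities.append("speech output")
    if features.touch_display and not user.low_vision:
        modalities.append("touch display")
    if features.voice_input and user.prefers_speech:
        modalities.append("speech input")
    if features.gesture_input:
        modalities.append("gesture input")
    return modalities or ["haptic fallback"]   # always offer at least one channel

print(select_modalities(RobotFeatures(), UserProfile(low_vision=True)))
```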
Improving Presence Lecturing for Blind and Visually Impaired People
Contact Person: Jens Voegler
During lectures, exercises and seminars, interactive whiteboards and slide shows are used quite regularly. By using the latest Web technologies, the accessibility of these media for blind and partially sighted people can be improved.
Physical Information Visualisation
Contact Person: Meinhardt Branig
Physical Information Visualisation for Big Data Exploration for Blind and Visually Impaired People.
Completed Theses
Theses are listed below as soon as they are recorded in the TUD Research Information System.
Segmentierung und Vektorisierung von Objektsilhouetten mittels TOF-System.
Bachelor's thesis. Lukas Förster. Supervised by Jens Bornschein.
Taktiles Zeichnen mittels Gesteneingabe.
Diploma thesis. Kai Beyrich. Supervised by Jens Bornschein.
Taktiler Druck von Brailleschrift in SVG-Bildern.
Bachelor's thesis. Dennis Körte. Supervised by Denise Bornschein.
Tool zur interaktiven Erkundung von taktilen Diagrammen auf einem zweidimensionalen Brailledisplay.
Bachelor's thesis. Constantin Lorz. Supervised by Denise Bornschein.
Erzeugung von zugänglichen audio-taktilen Diagrammen.
Master's thesis. Michael Jobst. Supervised by Christin Engel.
Interaktionskonzepte für taktile Diagramme.
Bachelor's thesis. Maximilian Letter. Supervised by Christin Engel.
Entwurf von Autorenwerkzeugen zur Unterstützung der Ersteller von zugänglichen Screenshots.
Bachelor's thesis. Philipp Rabe. Supervised by Christin Engel.
Interaktive, zugängliche Diagramme in SVG für blinde Menschen.
Bachelor's thesis. Sven Zedlick. Supervised by Christin Engel and Meinhardt Branig.
Vektorbasiertes Zeichnen mit direkter taktiler Rückmeldung.
Bachelor's thesis. Stephanie Schöne. Supervised by Jens Bornschein.
Personas und Benutzerprofile für Menschen mit Dyslexie.
Master's thesis. Supervised by Claudia Loitsch.
Konzeption für den Zugang zu Tafelbildern auf interaktiven Whiteboards (IWB) unter Verwendung der Stiftplatte Hyperbraille.
Bachelor's thesis. Supervised by Jens Voegler.
Design and Development of a Haptic Rein for a Guide Dog Robot.
Bachelor's thesis. Björn Einert. Supervised by Limin Zeng.
Generierung von Wegbeschreibungen in Gebäuden für blinde Fußgänger.
Diploma thesis. Philipp Thöricht. Supervised by Prof. Gerhard Weber.
Visualisierung von Prüfberichten zur Bewertung der Qualität taktiler Grafiken.
Bachelor's thesis. Philipp Dienst. Supervised by Jens Bornschein and Denise Prescher.
Erweiterung eines taktilen Arbeitsplatzes für blinde Nutzer mittels TOF-System.
Bachelor's thesis. Markus Roth. Supervised by Jens Bornschein.
Generalisierung eines 3D-Kartenmodells zur Unterstützung der Mobilität blinder Menschen.
Bachelor's thesis. Laura Eichler. Supervised by Jens Bornschein.
Towards an All-inclusive Indoor Navigation.
Großer Beleg. Florian Städtler. Supervised by Martin Spindler.
Indoor Topological Mapping with a 3D ToF camera.
Bachelor's thesis. Tino Noeres. Supervised by Limin Zeng.
Towards an All-inclusive Indoor Navigation: a Bluetooth Beacon based Proximity Sensing.
Großer Beleg. Florian Städtler. Supervised by Limin Zeng.
Automatisierte Optimierung von SVG-Grafiken für eine taktile Ausgabe.
Master's thesis. Yuan Liu. Supervised by Jens Bornschein.
Barrierefreier Editor zur Festlegung einer Lesereihenfolge in SVG-Bildern.
Bachelor's thesis. Duc Anh Pham. Supervised by Jens Bornschein.
Braille-Text-Rendering mit Stylesheet-Unterstützung für ein taktiles Flächendisplay.
Bachelor's thesis. Alexander Karpinski. Supervised by Jens Bornschein.
Webgestützter Remote-Client zur Visualisierung und Prüfung von Anwendungsoberflächen.
Bachelor's thesis. Michael Koller. Supervised by Michael Schmidt and Jens Bornschein.
Fachbücher mit den Händen lesen – Daisy-Reader für ein taktiles Flächendisplay.
Diploma thesis. Ghiath Makhoul. Supervised by Jens Bornschein.
SVG-Graphenzeichner für taktile Funktionsgraphen.
Großer Beleg. Gregor Harlan. Supervised by Jens Bornschein.
Audio-taktiles Spiel für Blinde.
Großer Beleg. Erik Schulze. Supervised by Denise Prescher.
Orientierungshilfen für die Erkundung taktiler Grafiken.
Diploma thesis. Mandy Gerlach. Supervised by Denise Prescher.
Multimodales Feedback zur Unterstützung blinder Fußgänger innerhalb von Gebäuden (Multimodal Feedback to Help Blind Pedestrians in Buildings).
Diploma thesis. Alexander Fickel. Supervised by Limin Zeng.
Haptische Verfahren zur Wahrnehmung von Hindernissen durch blinde Fußgänger (Haptic Interactions for Obstacle Detection by Blind Pedestrians).
Großer Beleg. Alexander Fickel. Supervised by Limin Zeng.
- Enhancing the Accuracy of Obstacle Detection by 3D Time of Flight Cameras (Großer Beleg)
- Geographic Annotations for Accessible Personalized Cartography (Diploma thesis)
- An Interactive Collaborative Platform of Accessible Geographical Content in a Mobile Environment (Diploma thesis)
- Design and Evaluation of Tactile Map Symbols for the Visually Impaired (Großer Beleg)
- Detection and Non-Visual Representation of Objects based on a 3D TOF Range Camera (Diploma thesis)
- Annotationseditor zur Unterstützung der videobasierten Auswertung von Usability-Studien (Großer Beleg)
- Haptischer Editor für den Geometrieunterricht (Diploma thesis)
- Stylesheet für die flächige Ausgabe von Braille (Diploma thesis)
- Untersuchung haptischer Interaktion (Großer Beleg)
- Defizite des QTI-2-Standards und Onyx bezüglich der Barrierefreiheit (Großer Beleg)
- Vergleich von nicht-visuellen Interaktionsmechanismen basierend auf taktiler und auditiver Ausgabe (Großer Beleg)
- Prüfkriterien für Grafiken in für blinde Nutzer aufbereiteten Fachbüchern (Bachelor's thesis)
- Audiohaptischer Zugang zu mathematischen Dokumenten (Beleg)
- Werkzeuge zur barrierefreien Aufbereitung von Vorlesungsaufzeichnungen (Beleg)
- Barrierefreie Multimediainhalte im Web mit HTML5 (Beleg)
- DAISY-based Accessible Location Based Service (Beleg)
- Videovorlesungen in Lernplattformen (Diploma thesis)
- Das Smartphone zur Umgebungserkundung für Blinde und Sehbehinderte (Beleg)
- Barrierefreier personalisierbarer Stadtführer für den mobilen Einsatz (Beleg)
- Sonifikation in Umsteigegebäuden (Beleg)
- Smartphones sehen 3D (Beleg)
- Routenführung durch Sonifikation für Blinde und Sehbehinderte (Diploma thesis)
- Routenplanung in Gebäuden für blinde und sehbehinderte Fußgänger (Diploma thesis)
- Erweiterung des DAISY-Hörbuchformats zur Beschreibung von Karten & Lageplänen (Großer Beleg/Studienarbeit)
- Barrierefreier mobiler Annotationseditor für Indoor-Lagepläne (Diploma thesis)
- Navigationssystem für blinde Fußgänger und ÖPNV-Nutzer (Diploma thesis)
- Automatische Klassifikation von Landmarken und Hindernissen für blinde Fußgänger in Innenräumen mittels Ultraschall (Master's thesis)