Research Topics
[MA] A pipeline for improving few-shot classification of medical images using knowledge distillation
In recent years, medicine has advanced significantly through the integration of deep learning and computer vision techniques. However, training deep learning models requires large amounts of labeled data, which are time-consuming and expensive to collect. Moreover, manual labeling introduces errors that can degrade a model's performance. This master's thesis focuses on improving classification performance on medical images using knowledge distillation. To this end, I will develop a comprehensive framework that integrates various knowledge distillation methods (e.g., hint-based distillation, attention transfer, contrastive distillation, and relational knowledge distillation) and strategies (e.g., self-distillation) for different kinds of datasets, with a focus on extensibility and reusability. Given the limited availability of labeled training data, the pipeline aims to enhance the classification performance on medical images, especially for small datasets. This research is highly relevant for applied AI, as it has the potential to substantially improve the diagnosis and detection of various medical conditions, ultimately leading to better patient outcomes without imposing an additional labeling burden on medical staff.
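As background, the classical logit-matching form of knowledge distillation underlying such methods can be sketched as follows: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. This is a minimal NumPy illustration, not the thesis framework itself; the function names, temperature value, and example logits are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as in Hinton et al.'s formulation.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))

# Illustrative logits for a 3-class problem (hypothetical values).
teacher = [4.0, 1.0, 0.2]
student = [3.5, 1.5, 0.1]
loss = distillation_loss(student, teacher)
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the (scarce) ground-truth labels, which is what makes the approach attractive in low-label medical settings.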
Supervisor: Karsten Wendt