U-Net has become an indispensable component in medical image segmentation. A defining characteristic of U-Net is that it produces multi-scale features; these features capture hidden representations under different views, which helps improve semantic segmentation performance. Knowledge distillation, e.g., feature distillation or logit distillation, is a mechanism for efficiently compressing models: feature distillation guides the student's feature learning by transferring feature information from the teacher. To supervise and distill these multi-scale features, we propose Multi-scale Feature Distillation (MFD). MFD uses the teacher's predicted logits as the distillation target and the student's multi-scale features from different layers as the supervision target. Decoupling logit distillation has become a trend: the original logit distillation can be divided into target-class and non-target-class components, which often play different roles in feature distillation and logit distillation. We therefore introduce Decoupled Multi-scale Distillation (DMD), which assigns the two components to different branches: feature distillation uses the non-target classes, while logit distillation uses the target classes. Experiments on different datasets demonstrate that DMD is effective.
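
The target/non-target decoupling described above follows the spirit of Decoupled Knowledge Distillation (Zhao et al., listed among the similar titles below). The PyTorch sketch that follows is a minimal illustration rather than the authors' implementation: it shows one way to split softened class probabilities into a target-class term and a renormalized non-target-class term. The function names, the temperature value, and the assignment of the two KL terms to the logit and feature branches are assumptions made for illustration only.

import torch
import torch.nn.functional as F

def split_target_nontarget(logits, labels, temperature=4.0):
    # Softened class probabilities, shape (N, C).
    probs = F.softmax(logits / temperature, dim=1)
    target_mask = F.one_hot(labels, probs.size(1)).bool()
    # Binary distribution over {target class, all non-target classes}.
    p_target = probs[target_mask].unsqueeze(1)             # (N, 1)
    binary = torch.cat([p_target, 1.0 - p_target], dim=1)  # (N, 2)
    # Distribution over the non-target classes only, renormalized.
    nontarget = probs.masked_fill(target_mask, 0.0)
    nontarget = nontarget / nontarget.sum(dim=1, keepdim=True)
    return binary, nontarget

def decoupled_kd_terms(student_logits, teacher_logits, labels, T=4.0):
    s_bin, s_nt = split_target_nontarget(student_logits, labels, T)
    t_bin, t_nt = split_target_nontarget(teacher_logits, labels, T)
    # Target-class term: a candidate loss for the logit-distillation branch.
    tckd = F.kl_div(s_bin.clamp_min(1e-8).log(), t_bin, reduction="batchmean") * T * T
    # Non-target-class term: a candidate loss for supervising the multi-scale
    # feature branch (student features projected to class logits beforehand).
    nckd = F.kl_div(s_nt.clamp_min(1e-8).log(), t_nt, reduction="batchmean") * T * T
    return tckd, nckd

# For segmentation, per-pixel logits of shape (N, C, H, W) would first be
# flattened to (N*H*W, C) before calling decoupled_kd_terms.
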


    Title:

    Decoupled multi-scale distillation for medical image segmentation


    Contributors:
    Qin, Chuan (Editor) / Zhou, Huiyu (Editor) / Zhang, Dingwen (Author) / Yu, Xiangchun (Author)

    Conference:

    International Conference on Image Processing and Artificial Intelligence (ICIPAI 2024) ; 2024 ; Suzhou, China


    Published in:

    Proc. SPIE ; 13213


    Publication date:

    19.07.2024





    Media type:

    Article (Conference)


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Scale Decoupled Distillation

    Wei, Shicai / Luo, Chunbo / Luo, Yang | ArXiv | 2024

    Open access

    Decoupled Knowledge Distillation

    Zhao, Borui / Cui, Quan / Song, Renjie et al. | ArXiv | 2022

    Open access

    Empowering Object Detection: Unleashing the Potential of Decoupled and Interactive Distillation

    Qian, Fulan / Hong, Jiacheng / Yan, Huanqian et al. | IEEE | 2025


    Color image segmentation using multi-scale clustering

    Kehtarnavaz, N. / Monaco, J. / Nimtschek, J. et al. | IEEE | 1998


    Color Image Segmentation Using Multi-Scale Clustering

    Kehtarnavaz, N. / Monaco, J. / Nimtschek, J. et al. | British Library Conference Proceedings | 1998