Class Discriminative Knowledge Distillation.
IEEE Trans. Emerg. Top. Comput. Intell., April 2025
Enhancing Pre-Trained Model-Based Class-Incremental Learning through Neural Collapse.
CoRR, April 2025
Neural Collapse Inspired Knowledge Distillation.
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-25), February 2025
Siamese Transformer Networks for Few-shot Image Classification.
CoRR, 2024
Knowledge Distillation via Token-Level Relationship Graph Based on the Big Data Technologies.
Big Data Res., 2024
Grouped Logit Distillation Enhanced with Superclass Awareness for Efficient Knowledge Transfer.
Proceedings of the 27th European Conference on Artificial Intelligence (ECAI 2024), Santiago de Compostela, Spain, October 2024
You Only Need Less Attention at Each Stage in Vision Transformers.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024
Knowledge Distillation via Token-level Relationship Graph.
CoRR, 2023
Generating Pseudo-labels Adaptively for Few-shot Model-Agnostic Meta-Learning.
Proceedings of the 34th British Machine Vision Conference (BMVC 2023), 2023
Class-aware Information for Logit-based Knowledge Distillation.
CoRR, 2022