2024
Mixed Distillation Helps Smaller Language Models Reason Better.
Findings of the Association for Computational Linguistics: EMNLP 2024, 2024

2023
Mixed Distillation Helps Smaller Language Model Better Reasoning.
CoRR, 2023

2020
Chinese medical named entity recognition based on multi-granularity semantic dictionary and multimodal tree.
J. Biomed. Informatics, 2020

Dynamic knowledge graph based fake-review detection.
Appl. Intell., 2020

2019
Fusional Recognition for Depressive Tendency With Multi-Modal Feature.
IEEE Access, 2019