Yuzhe Zi
Bibliography
2024
RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models.
CoRR, 2024.
Proceedings of the 2024 Joint International Conference on Computational Linguistics, 2024.