Yongyu Mu

According to our database, Yongyu Mu authored at least 14 papers between 2020 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
SLAM: Towards Efficient Multilingual Reasoning via Selective Language Alignment.
Proceedings of the 31st International Conference on Computational Linguistics, 2025

2024
LRHP: Learning Representations for Human Preferences via Preference Pairs.
CoRR, 2024

RoVRM: A Robust Visual Reward Model Optimized via Auxiliary Textual Preference Data.
CoRR, 2024

Cross-layer Attention Sharing for Large Language Models.
CoRR, 2024

Large Language Models are Parallel Multilingual Learners.
CoRR, 2024

Revealing the Parallel Multilingual Learning within Large Language Models.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

Translate-and-Revise: Boosting Large Language Models for Constrained Translation.
Chinese Computational Linguistics: 23rd China National Conference, 2024

Hybrid Alignment Training for Large Language Models.
Findings of the Association for Computational Linguistics, 2024

2023
Augmenting Large Language Model Translators via Translation Memories.
Findings of the Association for Computational Linguistics: ACL 2023, 2023

2022
Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection.
Findings of the Association for Computational Linguistics: EMNLP 2022, 2022

2021
The NiuTrans System for the WMT21 Efficiency Task.
CoRR, 2021

The NiuTrans Machine Translation Systems for WMT21.
Proceedings of the Sixth Conference on Machine Translation, 2021

The NiuTrans System for the WMT 2021 Efficiency Task.
Proceedings of the Sixth Conference on Machine Translation, 2021

2020
The NiuTrans Machine Translation Systems for WMT20.
Proceedings of the Fifth Conference on Machine Translation, 2020

