Minchan Jeong
According to our database, Minchan Jeong authored at least 14 papers between 2021 and 2024.
Bibliography
2024
BAPO: Base-Anchored Preference Optimization for Personalized Alignment in Large Language Models.
CoRR, 2024
FedDr+: Stabilizing Dot-regression with Global Feature Distillation for Federated Learning.
CoRR, 2024
Proceedings of the Innovative Mobile and Internet Services in Ubiquitous Computing, 2024
BAPO: Base-Anchored Preference Optimization for Overcoming Forgetting in Large Language Models Personalization.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2024, 2024
FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024
Hard Prompts Made Interpretable: Sparse Entropy Regularization for Prompt Tuning with RL.
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024
2023
CoRR, 2023
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2023, 2023
Revisiting Intermediate Layer Distillation for Compressing Language Models: An Overfitting Perspective.
Proceedings of the Findings of the Association for Computational Linguistics: EACL 2023, 2023
Toward Risk-based Optimistic Exploration for Cooperative Multi-Agent Reinforcement Learning.
Proceedings of the 2023 International Conference on Autonomous Agents and Multiagent Systems, 2023
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2023
2022
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022
2021
Preservation of the Global Knowledge by Not-True Self Knowledge Distillation in Federated Learning.
CoRR, 2021