Young Jin Kim

ORCID: 0000-0003-2976-3047

Affiliations:
  • Georgia Institute of Technology, Atlanta, GA, USA


According to our database, Young Jin Kim authored at least 21 papers between 2016 and 2024.

Timeline

[Chart: publications per year — 2016: 1, 2017: 1, 2018: 1, 2019: 2, 2020: 1, 2021: 1, 2022: 6, 2023: 5, 2024: 3]

Bibliography

2024
GRIN: GRadient-INformed MoE.
CoRR, 2024

Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

A Paradigm Shift in Machine Translation: Boosting Translation Performance of Large Language Models.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

2023
Mixture of Quantized Experts (MoQE): Complementary Effect of Low-bit Quantization and Robustness.
CoRR, 2023

Task-Based MoE for Multitask Multilingual Machine Translation.
CoRR, 2023

FineQuant: Unlocking Efficiency with Fine-Grained Weight-Only Quantization for LLMs.
CoRR, 2023

How Good Are GPT Models at Machine Translation? A Comprehensive Evaluation.
CoRR, 2023

AutoMoE: Heterogeneous Mixture-of-Experts with Adaptive Computation for Efficient Neural Machine Translation.
Proceedings of the Findings of the Association for Computational Linguistics: ACL 2023, 2023

2022
Who Says Elephants Can't Run: Bringing Large Scale MoE Models into Cloud Scale Production.
CoRR, 2022

AutoMoE: Neural Architecture Search for Efficient Sparsely Activated Transformers.
CoRR, 2022

Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers.
CoRR, 2022

Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers.
Proceedings of the International Conference on Machine Learning, 2022

Taming Sparsely Activated Transformer with Stochastic Experts.
Proceedings of the Tenth International Conference on Learning Representations, 2022

Fast Vocabulary Projection Method via Clustering for Multilingual Machine Translation on GPU.
Proceedings of the 15th biennial conference of the Association for Machine Translation in the Americas (Volume 1: Research Track), 2022

2021
Scalable and Efficient MoE Training for Multitask Multilingual Models.
CoRR, 2021

2020
FastFormers: Highly Efficient Transformer Models for Natural Language Understanding.
CoRR, 2020

2019
Time- and space-parallel simulation of air traffic networks.
Simul., 2019

From Research to Production and Back: Ludicrously Fast Neural Machine Translation.
Proceedings of the 3rd Workshop on Neural Generation and Translation@EMNLP-IJCNLP 2019, 2019

2018
A deep learning and parallel simulation methodology for air traffic management.
PhD thesis, 2018

2017
Time-parallel simulation of air traffic networks.
Proceedings of the 2017 Winter Simulation Conference, 2017

2016
Creation of a decision-support methodology for selecting more-electric aircraft subsystem technologies.
Proceedings of the Annual IEEE Systems Conference, 2016
