2022
On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022

AlphaTuning: Quantization-Aware Parameter-Efficient Adaptation of Large-Scale Pre-Trained Language Models.
Findings of the Association for Computational Linguistics: EMNLP 2022, 2022

Keep Me Updated! Memory Management in Long-term Conversations.
Findings of the Association for Computational Linguistics: EMNLP 2022, 2022

2021
What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021

2020
CareCall: A Call-Based Active Monitoring Dialog Agent for Managing COVID-19 Pandemic.
CoRR, 2020

ClovaCall: Korean Goal-Oriented Dialog Speech Corpus for Automatic Speech Recognition of Contact Centers.
Proceedings of the 21st Annual Conference of the International Speech Communication Association, 2020

2019
Tripartite Heterogeneous Graph Propagation for Large-scale Social Recommendation.
Proceedings of ACM RecSys 2019 Late-Breaking Results co-located with the 13th ACM Conference on Recommender Systems, 2019

2018
NSML: Meet the MLaaS platform with a real-world case study.
CoRR, 2018

CHOPT: Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms.
CoRR, 2018

2017
NSML: A Machine Learning Platform That Enables You to Focus on Your Models.
CoRR, 2017