Dahyun Kim

ORCID: 0000-0002-0820-4214

Affiliations:
  • Upstage AI, South Korea
  • Gwangju Institute of Science and Technology (GIST), South Korea (former)


According to our database, Dahyun Kim authored at least 15 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
Representing the Under-Represented: Cultural and Core Capability Benchmarks for Developing Thai Large Language Models.
CoRR, 2024

1 Trillion Token (1TT) Platform: A Novel Framework for Efficient Data Sharing and Compensation in Large Language Models.
CoRR, 2024

Evalverse: Unified and Accessible Library for Large Language Model Evaluation.
CoRR, 2024

Long-Tailed Recognition on Binary Networks by Calibrating A Pre-trained Model.
CoRR, 2024

Dataverse: Open-Source ETL (Extract, Transform, Load) Pipeline for Large Language Models.
CoRR, 2024

sDPO: Don't Use Your Data All at Once.
CoRR, 2024

Model-Based Data-Centric AI: Bridging the Divide Between Academic Ideals and Industrial Pragmatism.
CoRR, 2024

SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling.
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track, 2024

Open Ko-LLM Leaderboard: Evaluating Large Language Models in Korean with Ko-H5 Benchmark.
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024

2022
Online Continual Learning on Class Incremental Blurry Task Configuration with Anytime Inference.
Proceedings of the Tenth International Conference on Learning Representations, 2022

Unsupervised Representation Learning for Binary Networks by Joint Classifier Learning.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022

2021
Self-Supervised Learning for Binary Networks by Joint Classifier Training.
CoRR, 2021

BNAS v2: Learning Architectures for Binary Networks with Empirical Improvements.
CoRR, 2021

2020
Learning Architectures for Binary Networks.
Computer Vision - ECCV 2020, 2020

2019
Incremental Learning with Maximum Entropy Regularization: Rethinking Forgetting and Intransigence.
CoRR, 2019
