Dong Hyeon Jeon

According to our database, Dong Hyeon Jeon authored at least 12 papers between 2021 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
SLM as Guardian: Pioneering AI Safety with Small Language Models.
CoRR, 2024

Taxonomy and Analysis of Sensitive User Queries in Generative AI Search.
CoRR, 2024

SLM as Guardian: Pioneering AI Safety with Small Language Model.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

RADCoT: Retrieval-Augmented Distillation to Specialization Models for Generating Chain-of-Thoughts in Query Expansion.
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), 2024

2023
ConcatPlexer: Additional Dim1 Batching for Faster ViTs.
CoRR, 2023

AADiff: Audio-Aligned Video Synthesis with Text-to-Image Diffusion.
CoRR, 2023

Leveraging Off-the-shelf Diffusion Model for Multi-attribute Fashion Image Manipulation.
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2023

MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data.
Findings of the Association for Computational Linguistics: EACL 2023, 2023

Unifying Vision-Language Representation Space with Single-Tower Transformer.
Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence, 2023

2022
SISER: Semantic-Infused Selective Graph Reasoning for Fact Verification.
Proceedings of the 29th International Conference on Computational Linguistics, 2022

LM-BFF-MS: Improving Few-Shot Fine-tuning of Language Models based on Multiple Soft Demonstration Memory.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), 2022

2021
What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021
