Longze Chen

According to our database, Longze Chen authored at least 13 papers in 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
IP-MOT: Instance Prompt Learning for Cross-Domain Multi-Object Tracking.
CoRR, 2024

PersonaMath: Enhancing Math Reasoning through Persona-Driven Data Augmentation.
CoRR, 2024

The Imperative of Conversation Analysis in the Era of LLMs: A Survey of Tasks, Techniques, and Trends.
CoRR, 2024

MMEvol: Empowering Multimodal Large Language Models with Evol-Instruct.
CoRR, 2024

Hierarchical Context Pruning: Optimizing Real-World Code Completion with Repository-Level Pretrained Code LLMs.
CoRR, 2024

Leave No Document Behind: Benchmarking Long-Context LLMs with Extended Multi-Doc QA.
CoRR, 2024

Long Context is Not Long at All: A Prospector of Long-Dependency Data for Large Language Models.
CoRR, 2024

DEEM: Diffusion Models Serve as the Eyes of Large Language Models for Image Perception.
CoRR, 2024

DEFT: Distribution-guided Efficient Fine-Tuning for Human Alignment.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2024, 2024

Leave No Document Behind: Benchmarking Long-Context LLMs with Extended Multi-Doc QA.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024

Ruler: A Model-Agnostic Method to Control Generated Length for Large Language Models.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2024, 2024

Marathon: A Race Through the Realm of Long Context with Large Language Models.
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024

Long Context is Not Long at All: A Prospector of Long-Dependency Data for Large Language Models.
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024
