Chi-Min Chan

Orcid: 0009-0006-0218-3412

According to our database, Chi-Min Chan authored at least 16 papers between 2021 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
PIP: Perturbation-based Iterative Pruning for Large Language Models.
CoRR, January 2025

2024
EVA: An Embodied World Model for Future Video Anticipation.
CoRR, 2024

HiPrompt: Tuning-free Higher-Resolution Generation with Hierarchical MLLM Prompts.
CoRR, 2024

AgentMonitor: A Plug-and-Play Framework for Predictive and Secure Multi-Agent Systems.
CoRR, 2024

Importance Weighting Can Help Large Language Models Self-Improve.
CoRR, 2024

RQ-RAG: Learning to Refine Queries for Retrieval Augmented Generation.
CoRR, 2024

AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

ChatEval: Towards Better LLM-based Evaluators through Multi-Agent Debate.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

2023
Parameter-efficient fine-tuning of large-scale pre-trained language models.
Nat. Mach. Intell., March 2023

AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors in Agents.
CoRR, 2023

Arbitrary Few Parameters are Good Enough for Adapting Large-scale Pre-trained Language Models.
CoRR, 2023

Exploring the Impact of Model Scaling on Parameter-Efficient Tuning.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

Plug-and-Play Document Modules for Pre-trained Models.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023

2022
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models.
CoRR, 2022

On Transferability of Prompt Tuning for Natural Language Processing.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022

2021
On Transferability of Prompt Tuning for Natural Language Understanding.
CoRR, 2021
