Keivan Alizadeh-Vahid

Orcid: 0009-0001-2703-6674

Affiliations:
  • University of Washington, USA


According to our database, Keivan Alizadeh-Vahid authored at least 15 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2024
SALSA: Soup-based Alignment Learning for Stronger Adaptation in RLHF.
CoRR, 2024

Computational Bottlenecks of Training Small-scale Large Language Models.
CoRR, 2024

Duo-LLM: A Framework for Studying Adaptive Computation in Large Language Models.
CoRR, 2024

GSM-Symbolic: Understanding the Limitations of Mathematical Reasoning in Large Language Models.
CoRR, 2024

Scaling Smart: Accelerating Large Language Model Pre-training with Small Model Initialization.
CoRR, 2024

eDKM: An Efficient and Accurate Train-Time Weight Clustering for Large Language Models.
IEEE Comput. Archit. Lett., 2024

ReLU Strikes Back: Exploiting Activation Sparsity in Large Language Models.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

LLM in a flash: Efficient Large Language Model Inference with Limited Memory.
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024

2023
FLUID: A Unified Evaluation Framework for Flexible Sequential Data.
Trans. Mach. Learn. Res., 2023

LLM in a flash: Efficient Large Language Model Inference with Limited Memory.
CoRR, 2023

2022
DKM: Differentiable k-Means Clustering Layer for Neural Network Compression.
Proceedings of the Tenth International Conference on Learning Representations, 2022

2020
Recurrent Poisson Factorization for Temporal Recommendation.
IEEE Trans. Knowl. Data Eng., 2020

In the Wild: From ML Models to Pragmatic ML Systems.
CoRR, 2020

Butterfly Transform: An Efficient FFT Based Neural Architecture Design.
Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020

2019
Butterfly Transform: An Efficient FFT Based Neural Architecture Design.
CoRR, 2019
