Hongsun Jang

ORCID: 0000-0003-4291-6124

According to our database, Hongsun Jang authored at least 8 papers between 2023 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2024
Smart-Infinity: Fast Large Language Model Training using Near-Storage Processing on a Real System.
Proceedings of the IEEE International Symposium on High-Performance Computer Architecture, 2024

Pipette: Automatic Fine-Grained Large Language Model Training Configurator for Real-World Clusters.
Proceedings of the Design, Automation & Test in Europe Conference & Exhibition, 2024

PeerAiD: Improving Adversarial Distillation from a Specialized Peer Tutor.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024

GraNNDis: Fast Distributed Graph Neural Network Training Framework for Multi-Server Clusters.
Proceedings of the 2024 International Conference on Parallel Architectures and Compilation Techniques, 2024

2023
GraNNDis: Efficient Unified Distributed Training Framework for Deep GNNs on Large Clusters.
CoRR, 2023

Pipe-BD: Pipelined Parallel Blockwise Distillation.
Proceedings of the Design, Automation & Test in Europe Conference & Exhibition, 2023

Fast Adversarial Training with Dynamic Batch-level Attack Control.
Proceedings of the 60th ACM/IEEE Design Automation Conference, 2023

Optimus-CC: Efficient Large NLP Model Training with 3D Parallelism Aware Communication Compression.
Proceedings of the 28th ACM International Conference on Architectural Support for Programming Languages and Operating Systems, 2023
