Defang Chen

ORCID: 0000-0003-0833-7401

Affiliations:
  • Zhejiang University, Hangzhou, Zhejiang, CN


According to our database, Defang Chen authored at least 37 papers between 2019 and 2024.


Bibliography

2024
Output Regularization With Cluster-Based Soft Targets.
IEEE Trans. Neural Networks Learn. Syst., August, 2024

Multi-exit self-distillation with appropriate teachers.
Frontiers Inf. Technol. Electron. Eng., March, 2024

Online adversarial knowledge distillation for graph neural networks.
Expert Syst. Appl., March, 2024

Simple and Fast Distillation of Diffusion Models.
CoRR, 2024

Conditional Image Synthesis with Diffusion Models: A Survey.
CoRR, 2024

Knowledge Distillation with Refined Logits.
CoRR, 2024

Knowledge Translation: A New Pathway for Model Compression.
CoRR, 2024

On the Trajectory Regularity of ODE-based Diffusion Sampling.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

Fast ODE-based Sampling for Diffusion Models in Around 5 Steps.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024

Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding.
Proceedings of the 33rd ACM International Conference on Information and Knowledge Management, 2024

2023
Collaborative Semantic Aggregation and Calibration for Federated Domain Generalization.
IEEE Trans. Knowl. Data Eng., December, 2023

Finalizing your reference list with machine learning.
J. Ambient Intell. Humaniz. Comput., November, 2023

Online cross-layer knowledge distillation on graph neural networks with deep supervision.
Neural Comput. Appl., October, 2023

SemCKD: Semantic Calibration for Cross-Layer Knowledge Distillation.
IEEE Trans. Knowl. Data Eng., June, 2023

Domain-Specific Bias Filtering for Single Labeled Domain Generalization.
Int. J. Comput. Vis., 2023

A Geometric Perspective on Diffusion Models.
CoRR, 2023

Knowledge Distillation with Deep Supervision.
Proceedings of the International Joint Conference on Neural Networks, 2023

Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning.
Proceedings of the IEEE International Conference on Multimedia and Expo, 2023

Accelerating Diffusion Sampling with Classifier-based Feature Distillation.
Proceedings of the IEEE International Conference on Multimedia and Expo, 2023

Holistic Weighted Distillation for Semantic Segmentation.
Proceedings of the IEEE International Conference on Multimedia and Expo, 2023

Customizing Synthetic Data for Data-Free Student Learning.
Proceedings of the IEEE International Conference on Multimedia and Expo, 2023

2022
JointE: Jointly utilizing 1D and 2D convolution for knowledge graph embedding.
Knowl. Based Syst., 2022

Improving Knowledge Graph Embedding via Iterative Self-Semantic Knowledge Distillation.
CoRR, 2022

Deeply-Supervised Knowledge Distillation.
CoRR, 2022

Collaborative Knowledge Distillation for Heterogeneous Information Network Embedding.
Proceedings of the ACM Web Conference (WWW '22), 2022

Label-Efficient Domain Generalization via Collaborative Exploration and Generalization.
Proceedings of the 30th ACM International Conference on Multimedia (MM '22), 2022

Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks.
Proceedings of the International Joint Conference on Neural Networks, 2022

Confidence-Aware Multi-Teacher Knowledge Distillation.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, 2022

Knowledge Distillation with the Reused Teacher Classifier.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022

2021
Online Adversarial Distillation for Graph Neural Networks.
CoRR, 2021

Do We Need to Directly Access the Source Datasets for Domain Generalization?
CoRR, 2021

Exploring the Connection between Knowledge Distillation and Logits Matching.
CoRR, 2021

Distilling Holistic Knowledge with Graph Neural Networks.
Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021

Cross-Layer Distillation with Semantic Calibration.
Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, 2021

2020
Online Knowledge Distillation via Multi-branch Diversity Enhancement.
Proceedings of the 15th Asian Conference on Computer Vision (ACCV), 2020

Online Knowledge Distillation with Diverse Peers.
Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence, 2020

2019
HAHE: Hierarchical Attentive Heterogeneous Information Network Embedding.
CoRR, 2019

