ZLST
Defang Chen

陈德仿 · Postdoctoral Researcher

Zhejiang University

Latest

  • Simple and Fast Distillation of Diffusion Models
  • On the Trajectory Regularity of ODE-based Diffusion Sampling
  • Online Adversarial Knowledge Distillation for Graph Neural Networks
  • Fast ODE-based Sampling for Diffusion Models in Around 5 Steps
  • Accelerating Diffusion Sampling with Classifier-based Feature Distillation
  • Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning
  • Customizing Synthetic Data for Data-Free Student Learning
  • Holistic Weighted Distillation for Semantic Segmentation
  • Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision
  • Knowledge Distillation with Deep Supervision
  • Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks
  • Knowledge Distillation with the Reused Teacher Classifier
  • Collaborative Knowledge Distillation for Heterogeneous Information Network Embedding
  • Confidence-Aware Multi-Teacher Knowledge Distillation
  • JointE: Jointly Utilizing 1D And 2D Convolution for Knowledge Graph Embedding
  • Distilling Holistic Knowledge with Graph Neural Networks
  • Cross-Layer Distillation with Semantic Calibration
  • Online Knowledge Distillation with Diverse Peers

© 2025 Defang Chen. This work is licensed under CC BY-NC-ND 4.0.

Published with Hugo Blox Builder — the free, open source website builder that empowers creators.
