Knowledge Distillation
Collaborative Knowledge Distillation for Heterogeneous Information Network Embedding
Learning low-dimensional representations for Heterogeneous Information Networks (HINs) has drawn increasing attention recently for its …
Can Wang, Sheng Zhou, Kang Yu, Defang Chen, Bolang Li, Yan Feng, Chun Chen
PDF · Cite · DOI
Confidence-Aware Multi-Teacher Knowledge Distillation
Knowledge distillation was initially introduced to utilize additional supervision from a single teacher model for the student model …
Hailin Zhang, Defang Chen, Can Wang
PDF · Cite · DOI
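The confidence-aware idea concerns how the soft targets of several teachers are combined. As a rough illustration only, and not the weighting scheme proposed in this paper, a multi-teacher distillation loss in PyTorch might merge temperature-scaled teacher distributions with per-teacher weights, where the weights stand in for any confidence measure:

import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, weights, T=4.0):
    # Temperature-scaled student log-probabilities.
    student_log_probs = F.log_softmax(student_logits / T, dim=1)
    # Merge teacher distributions with per-teacher weights; the weights are
    # a hypothetical placeholder for any confidence-based scheme and are
    # assumed to sum to 1.
    merged_teacher_probs = sum(
        w * F.softmax(t / T, dim=1)
        for w, t in zip(weights, teacher_logits_list)
    )
    # KL divergence from the merged teacher distribution to the student,
    # scaled by T^2 as in standard knowledge distillation.
    return F.kl_div(student_log_probs, merged_teacher_probs,
                    reduction="batchmean") * T * T

With uniform weights this reduces to plain teacher averaging; the paper's contribution lies in how the weights are chosen.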
Distilling Holistic Knowledge with Graph Neural Networks
Knowledge Distillation (KD) aims to transfer knowledge from a larger, well-optimized teacher network to a smaller learnable student. …
Sheng Zhou, Yucheng Wang, Defang Chen, Jiawei Chen, Xin Wang, Can Wang, Jiajun Bu
PDF · Cite · Code · DOI
Cross-Layer Distillation with Semantic Calibration
Knowledge distillation is a technique to enhance the generalization ability of a student model by exploiting outputs from a teacher …
Defang Chen, Jianping Mei, Yuan Zhang, Can Wang, Zhe Wang, Yan Feng, Chun Chen
PDF · Cite · Code · DOI
Online Knowledge Distillation with Diverse Peers
Distillation is an effective knowledge-transfer technique that uses predicted distributions of a powerful teacher model as soft targets …
Defang Chen, Jianping Mei, Can Wang, Yan Feng, Chun Chen
PDF · Cite · DOI
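These papers build on the classic soft-target loss that the abstract above references: the student matches the teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal PyTorch sketch of that standard formulation, where alpha and T are illustrative hyperparameters rather than values from any of these papers:

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, alpha=0.9, T=4.0):
    # Hard-label cross-entropy on the student's own predictions.
    ce = F.cross_entropy(student_logits, labels)
    # Soft targets: the teacher's temperature-softened distribution.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        soft_targets,
        reduction="batchmean",
    ) * T * T  # T^2 keeps gradient magnitudes comparable across temperatures
    return (1 - alpha) * ce + alpha * kd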