ZLST
Knowledge Distillation
Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding
Knowledge Graph Embedding (KGE), which projects entities and relations into continuous vector spaces, has garnered significant …
Yichen Liu, Jiawei Chen, Yan Feng, Can Wang
PDF · Cite · DOI
Distillation Matters: Empowering Sequential Recommenders to Match the Performance of Large Language Models
Owing to their powerful semantic reasoning capabilities, Large Language Models (LLMs) have been effectively utilized as recommenders, …
Yu Cui, Feng Liu, Pengbo Wang, Bohao Wang, Heng Tang, Yi Wan, Jun Wang, Jiawei Chen
PDF · Cite · Code · DOI