Rankformer: A Graph Transformer for Recommendation based on Ranking Objective

Abstract

Recommender Systems (RS) aim to generate a personalized ranked list of items for each user and are evaluated with ranking metrics. Although personalized ranking is fundamental to RS, this property is often overlooked in the design of model architectures. To address this issue, we propose Rankformer, a ranking-inspired recommendation model. Rankformer's architecture is inspired by the gradient of the ranking objective and takes the form of a distinctive (graph) transformer: it leverages global information from all users and items to produce more informative representations, and it employs specific attention weights to steer the evolution of embeddings towards better ranking performance. We further develop an acceleration algorithm for Rankformer that reduces its complexity to linear in the number of positive instances. Extensive experiments demonstrate that Rankformer outperforms state-of-the-art methods. The code is available at https://github.com/StupidThree/Rankformer.
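To give a rough sense of the idea described above, the following is a minimal, hypothetical sketch of a global attention layer in which every user embedding attends to all item embeddings (and vice versa). The class name, parameters, and the simple softmax weighting are illustrative assumptions, not the paper's actual formulation; see the linked repository for the real implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalAttentionLayer(nn.Module):
    # Illustrative global-attention update: each user embedding aggregates
    # information from all item embeddings (and each item from all users),
    # rather than only from local graph neighbors.
    def __init__(self, dim: int, temperature: float = 1.0):
        super().__init__()
        self.temperature = temperature
        self.proj = nn.Linear(dim, dim, bias=False)

    def forward(self, user_emb: torch.Tensor, item_emb: torch.Tensor):
        # Similarity scores between every user and every item: (n_users, n_items).
        scores = user_emb @ item_emb.t() / self.temperature
        # Users aggregate all item embeddings, weighted by attention.
        user_out = F.softmax(scores, dim=1) @ self.proj(item_emb)
        # Items aggregate all user embeddings, weighted by attention.
        item_out = F.softmax(scores.t(), dim=1) @ self.proj(user_emb)
        # Residual connection keeps the previous embeddings in the mix.
        return user_emb + user_out, item_emb + item_out

# Tiny usage example with random embeddings (hypothetical sizes).
users, items, dim = 4, 6, 8
layer = GlobalAttentionLayer(dim)
u, i = torch.randn(users, dim), torch.randn(items, dim)
u_next, i_next = layer(u, i)
print(u_next.shape, i_next.shape)  # torch.Size([4, 8]) torch.Size([6, 8])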

Publication
In Proceedings of the ACM Web Conference 2025
Sirui Chen
Student

Sirui Chen is a Ph.D. student in the ZLST Lab, supervised by Profs. Can Wang and Jiawei Chen.

Shen Han
Student

Shen Han is currently a Master's student in the ZLST Lab, where he is supervised by Prof. Jiawei Chen.

Jiawei Chen
Researcher
Bohao Wang
Student

Bohao Wang is a third-year Ph.D. student, supervised by Profs. Chun Chen, Can Wang, and Jiawei Chen.

Yan Feng
Associate Professor
Chun Chen
Academician
Can Wang
Professor