SELFRec is a Python framework for self-supervised recommendation (SSR) that integrates commonly used datasets and metrics and implements many state-of-the-art SSR models. It has a lightweight architecture and user-friendly interfaces that facilitate model implementation and evaluation.
Founder and principal contributor: @Coder-Yu @xiaxin1998
Supported by: @AIhongzhi (A/Prof. Hongzhi Yin, UQ)
This repo is released with our survey paper on self-supervised learning for recommender systems. We organized a tutorial on self-supervised recommendation at WWW'22. Visit the tutorial page for more information.
- Fast execution: SELFRec is developed with Python 3.7+, TensorFlow 1.14+ and PyTorch 1.7+. All models run on GPUs. In particular, we optimize the time-consuming item-ranking procedure, drastically reducing ranking time to seconds (less than 10 seconds at a scale of 10,000 users × 50,000 items).
- Easy configuration: SELFRec provides a set of simple and high-level interfaces, by which new SSR models can be easily added in a plug-and-play fashion.
- Highly Modularized: SELFRec is divided into multiple discrete and independent modules/layers. This design decouples model design from the other procedures, so users only need to focus on the logic of their own method, which streamlines development.
- SSR-Specific: SELFRec is designed specifically for SSR. It provides dedicated modules and interfaces for data augmentation and self-supervised tasks, enabling rapid development (a sketch of such an augmentation is shown after this list).
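To give a concrete flavour of what such an augmentation module does, here is a generic sketch of edge dropout on a sparse user-item graph, the kind of graph augmentation popularized by SGL. It is an illustrative example rather than SELFRec's own code; the function name and signature are made up for this sketch.

```python
import numpy as np
import scipy.sparse as sp

# Generic SGL-style edge dropout: NOT SELFRec's implementation, just a sketch.
def edge_dropout(adj: sp.coo_matrix, drop_rate: float = 0.1) -> sp.coo_matrix:
    """Randomly drop a fraction of edges from a sparse interaction graph.

    Two independently corrupted views produced this way can be fed to a
    contrastive self-supervised task alongside the recommendation loss.
    """
    keep = np.random.rand(adj.nnz) >= drop_rate  # keep each edge with prob 1 - drop_rate
    return sp.coo_matrix((adj.data[keep], (adj.row[keep], adj.col[keep])), shape=adj.shape)
```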
numba==0.53.1
numpy==1.20.3
scipy==1.6.2
tensorflow==1.14.0
torch>=1.7.0
- Configure the xx.conf file in the conf directory, where xx is the name of the model you want to run.
- Run main.py and choose the model you want to run (a programmatic sketch follows below).
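For example, running a model non-interactively could look like the sketch below. The entry class `SELFRec` and the config loader `ModelConf` are assumptions about the repo layout; check main.py for the exact imports.

```python
# Minimal sketch of driving SELFRec programmatically; the imported names are
# assumptions -- mirror what main.py actually does in your copy of the repo.
from SELFRec import SELFRec        # assumed entry class
from util.conf import ModelConf    # assumed config loader

if __name__ == '__main__':
    conf = ModelConf('./conf/SimGCL.conf')  # conf file of the model to run
    rec = SELFRec(conf)
    rec.execute()                           # train and evaluate
```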
General hyperparameter settings are: batch_size: 2048, emb_size: 64, learning rate: 0.001, L2 reg: 0.0001.
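An illustrative conf snippet with these defaults is shown below; the key names and dataset paths are hypothetical placeholders, so copy the exact keys from one of the shipped conf files instead.

```
# Hypothetical keys -- use an existing file in conf/ as the template.
model.name=SimGCL
training.set=./dataset/yelp2018/train.txt
test.set=./dataset/yelp2018/test.txt
batch_size=2048
emb_size=64
learning_rate=0.001
l2_reg=0.0001
```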
Model | Recall@20 | NDCG@20 | Hyperparameter settings |
---|---|---|---|
MF | 0.0543 | 0.0445 | |
LightGCN | 0.0639 | 0.0525 | layer=3 |
NCL | 0.0670 | 0.0562 | layer=3, ssl_reg=1e-6, proto_reg=1e-7, tau=0.05, hyper_layers=1, alpha=1.5, num_clusters=2000 |
SGL | 0.0675 | 0.0555 | λ=0.1, ρ=0.1, tau=0.2, layer=3 |
MixGCF | 0.0691 | 0.0577 | layer=3, n_negs=64 |
DirectAU | 0.0695 | 0.0583 | 𝛾=2, layer=3 |
SimGCL | 0.0721 | 0.0601 | λ=0.5, eps=0.1, tau=0.2, layer=3 |
XSimGCL | 0.0723 | 0.0604 | λ=0.2, eps=0.2, l∗=1, tau=0.15, layer=3 |
- Create a .conf file for your model in the directory named conf.
- Make your model inherit the proper base class.
- Reimplement the following functions (a skeleton is sketched after this list).
- build(), train(), save(), predict()
- Register your model in main.py.
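A minimal skeleton following these steps might look like the sketch below. The base class `GraphRecommender` and the constructor signature are assumptions; mirror an existing model in the repo for the exact interface.

```python
# Sketch of a new SSR model plugged into SELFRec; the base class and constructor
# signature are assumptions -- copy them from an existing model instead.
from base.graph_recommender import GraphRecommender  # assumed base class

class MyModel(GraphRecommender):
    def __init__(self, conf, training_set, test_set):
        super().__init__(conf, training_set, test_set)

    def build(self):
        # create user/item embeddings, the encoder and the self-supervised task
        ...

    def train(self):
        # optimization loop: recommendation loss + self-supervised loss
        ...

    def save(self):
        # persist the best user/item representations
        ...

    def predict(self, u):
        # return ranking scores over all items for user u
        ...
```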
Dataset | Users | Items | Ratings | Rating Scale | Density | Social Users | Social Links | Link Type
---|---|---|---|---|---|---|---|---
Douban | 2,848 | 39,586 | 894,887 | [1, 5] | 0.794% | 2,848 | 35,770 | Trust
LastFM | 1,892 | 17,632 | 92,834 | implicit | 0.27% | 1,892 | 25,434 | Trust
Yelp | 19,539 | 21,266 | 450,884 | implicit | 0.11% | 19,539 | 864,157 | Trust
Amazon-Book | 52,463 | 91,599 | 2,984,108 | implicit | 0.11% | - | - | -
@article{yu2022self,
title={Self-Supervised Learning for Recommender Systems: A Survey},
author={Yu, Junliang and Yin, Hongzhi and Xia, Xin and Chen, Tong and Li, Jundong and Huang, Zi},
journal={arXiv preprint arXiv:2203.15876},
year={2022}
}