
SEM: Switchable Excitation Module for Self-attention Mechanism


This repository contains the implementation of "SEM: Switchable Excitation Module for Self-attention Mechanism" [paper] on the CIFAR-10 and CIFAR-100 datasets.

Introduction

SEM is a self-attention module that automatically selects and integrates attention operators to compute attention maps.
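The core idea can be illustrated with a minimal sketch: given candidate per-channel attention vectors produced by different operators, SEM combines them using learned "switch" weights. The function and argument names below are hypothetical illustrations, not the repository's API, and the softmax-weighted combination is an assumption about how the integration step might look.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sem_combine(attention_maps, switch_logits):
    """Blend candidate attention maps with switch weights.

    attention_maps: one per-channel attention vector per candidate
    operator (e.g. excitations built from average- or max-pooled
    statistics); switch_logits: learned scores deciding how much
    each operator contributes. Both names are illustrative.
    """
    w = softmax(switch_logits)
    channels = len(attention_maps[0])
    return [
        sum(w[k] * attention_maps[k][c] for k in range(len(w)))
        for c in range(channels)
    ]
```

With equal switch logits the operators are averaged; as one logit grows, the module effectively "switches" to that operator.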

Requirements

Python and PyTorch; install the dependencies with:

pip install -r requirements.txt

Usage

python run.py --dataset cifar100 --block-name bottleneck --depth 164 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4
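For reference, the --schedule and --gamma flags follow the common step-decay convention (also used by bearpaw's framework): the learning rate is multiplied by gamma at each listed epoch. A small sketch of that rule, assuming a base learning rate of 0.1 (the base rate is an assumption, not read from run.py):

```python
def lr_at_epoch(epoch, base_lr=0.1, schedule=(81, 122), gamma=0.1):
    # Multiply the learning rate by gamma once per milestone passed.
    lr = base_lr
    for milestone in schedule:
        if epoch >= milestone:
            lr *= gamma
    return lr
```

So with the flags above, training runs at 0.1 for epochs 0-80, 0.01 for epochs 81-121, and 0.001 until epoch 164.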

Results

| Model     | Dataset   | original | SEM   |
|-----------|-----------|----------|-------|
| ResNet164 | CIFAR-10  | 93.39    | 94.95 |
| ResNet164 | CIFAR-100 | 74.30    | 76.76 |

Citing SEM

@article{zhong2022switchable,
  title={Switchable Self-attention Module},
  author={Zhong, Shanshan and Wen, Wushao and Qin, Jinghui},
  journal={arXiv preprint arXiv:2209.05680},
  year={2022}
}

Acknowledgments

Many thanks to bearpaw for his simple and clean PyTorch framework for image classification tasks.
