CPFT

(unofficial implementation) Code for Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning (EMNLP 2021)

Requirements

  • PyTorch >= 1.7.1
  • tokenizers==0.9.3
  • scikit-learn
  • transformers==4.10.2

Process

  1. Environment setup
pip install -r ./pretrain/requirements.txt
  2. Pretraining (MLM loss + unsupervised contrastive loss)
cd pretrain
bash scripts/pretrain.sh base 64
  3. Fine-tuning (supervised contrastive loss + CLS classification loss)
  • Note: make sure 'pretrained_model_path' in the fine-tuning shell scripts points to the checkpoint produced in step 2
cd finetune
bash scripts/finetune_HWU64_5shot.sh base 16
bash scripts/finetune_HWU64_10shot.sh base 16
bash scripts/finetune_CLINC150_10shot.sh base 16
bash scripts/finetune_BANKING77_10shot.sh base 16
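To make the two objectives above concrete, here is a minimal PyTorch sketch of the two contrastive terms. It assumes a SimCSE-style setup for the unsupervised loss (two dropout views of the same batch as positives); the function names and temperature value are illustrative, not the repo's actual code:

```python
# Hedged sketch of the contrastive objectives, not the repo's implementation.
import torch
import torch.nn.functional as F

def unsup_contrastive_loss(z1, z2, temp=0.05):
    """Unsupervised contrastive loss over two views z1, z2 (B, D) of the
    same batch; the matching row in the other view is the positive."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temp                     # (B, B) cosine similarities
    labels = torch.arange(z1.size(0))            # positive is the diagonal
    return F.cross_entropy(sim, labels)

def sup_contrastive_loss(z, labels, temp=0.05):
    """Supervised contrastive loss: all same-label examples in the batch
    are positives for each anchor; the anchor itself is excluded."""
    z = F.normalize(z, dim=-1)
    sim = z @ z.t() / temp
    B = z.size(0)
    eye = torch.eye(B, dtype=torch.bool)
    pos = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~eye
    # log-softmax over all non-self pairs for each anchor
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(eye, float('-inf')), dim=1, keepdim=True)
    # average log-prob over each anchor's positives (skip anchors with none)
    n_pos = pos.sum(1)
    loss = -(log_prob * pos).sum(1) / n_pos.clamp(min=1)
    return loss[n_pos > 0].mean()
```

In the pretraining stage the total loss would be the MLM loss plus the unsupervised term; in fine-tuning, the cross-entropy classification loss plus the supervised term.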

References

  • Few-Shot Intent Detection via Contrastive Pre-Training and Fine-Tuning, EMNLP 2021

Q&A

If you encounter any problems, open an issue in the GitHub repo.
