
O-LoRA

  • This repo releases our implementation of the O-LoRA model.
  • It is built on the pretrained T5-large model and fine-tuned on our data.


Setup

You can install the required libraries by running

pip install -r requirements.txt

You also need to download the t5-large model from Hugging Face, place it in the folder named initial_model, and rename the model folder to 't5-large'.

LLaMA2 in HF format is also supported. Put your LLaMA2 HF model in the initial_model folder and rename the model folder to 'llama'.
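For reference, a minimal sketch of downloading the checkpoints into the expected layout, assuming the huggingface-cli tool from the huggingface_hub package is installed (the LLaMA2 repo id below is only an example; those weights are gated and require approved access):

# a minimal sketch, assuming huggingface-cli (from huggingface_hub) is installed
mkdir -p initial_model
# t5-large is public:
huggingface-cli download t5-large --local-dir initial_model/t5-large
# LLaMA2 is gated; this repo id is an example and requires access:
huggingface-cli download meta-llama/Llama-2-7b-hf --local-dir initial_model/llama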

Training and Evaluation

For t5-large:

You can reproduce our experiments for task orders 1, 2, and 3 by simply running:

order1:

bash scripts/order_1.sh > logs_and_outputs/order_1/logs/train_and_infer.log 2>&1 &

order2:

bash scripts/order_2.sh > logs_and_outputs/order_2/logs/train_and_infer.log 2>&1 &

order3:

bash scripts/order_3.sh > logs_and_outputs/order_3/logs/train_and_infer.log 2>&1 &
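The redirection targets above must already exist; if your checkout does not include the log directories, create them first (a small convenience snippet, not part of the released scripts):

mkdir -p logs_and_outputs/order_1/logs logs_and_outputs/order_2/logs logs_and_outputs/order_3/logs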

The model you have trained will be saved in logs_and_outputs/order_N/outputs, where N is the order you ran (1, 2, or 3).

The results for each task will be saved in logs_and_outputs/order_N/outputs/TASK_NAME/predict_results.json.

You can also check the logs produced during training and inference in logs_and_outputs/order_N/logs/train_and_infer.log.
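For example, to follow a run and inspect a finished task's metrics (a hedged illustration; TASK_NAME stands for whichever task directory the run created):

# follow the combined training/inference log as it is written
tail -f logs_and_outputs/order_1/logs/train_and_infer.log
# pretty-print the saved metrics for one finished task
python -m json.tool logs_and_outputs/order_1/outputs/TASK_NAME/predict_results.json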

For LLaMA2:

order1:

bash scripts_llama/order_1.sh > logs_and_outputs_llama/order_1/logs/train_and_infer.log 2>&1 &

order2:

bash scripts_llama/order_2.sh > logs_and_outputs_llama/order_2/logs/train_and_infer.log 2>&1 &

order3:

bash scripts_llama/order_3.sh > logs_and_outputs_llama/order_3/logs/train_and_infer.log 2>&1 &
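As with the T5 runs, make sure the log directories exist before launching, and monitor progress the same way (a convenience snippet, not part of the released scripts):

mkdir -p logs_and_outputs_llama/order_1/logs logs_and_outputs_llama/order_2/logs logs_and_outputs_llama/order_3/logs
tail -f logs_and_outputs_llama/order_1/logs/train_and_infer.log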

Citation

@article{wang2023orthogonal,
  title={Orthogonal Subspace Learning for Language Model Continual Learning},
  author={Wang, Xiao and Chen, Tianze and Ge, Qiming and Xia, Han and Bao, Rong and Zheng, Rui and Zhang, Qi and Gui, Tao and Huang, Xuanjing},
  journal={arXiv preprint arXiv:2310.14152},
  year={2023}
}
