Commit

reduce batch size for lora_llama2
dlwh committed Feb 13, 2024
1 parent c70eedc commit 8c18358
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion config/lora_llama2.yaml
config/lora_llama2.yaml

@@ -9,7 +9,7 @@ trainer:
   project: "levanter-lora"
   tags: ["lora", "llama2"]
   num_train_steps: 5000 # tune to suit your needs
-  train_batch_size: 128
+  train_batch_size: 64

   # if using model parallelism, this is useful:
   tensor_parallel_axes: ["mlp", "heads"]
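For context, the affected portion of `config/lora_llama2.yaml` after this commit would read roughly as follows. This is a sketch reconstructed from the hunk above: only the lines visible in the diff are confirmed, and the indentation and the elided keys (marked `# ...`) are assumptions.

```yaml
trainer:
  # ... other trainer keys not shown in the diff ...
  project: "levanter-lora"
  tags: ["lora", "llama2"]
  num_train_steps: 5000 # tune to suit your needs
  train_batch_size: 64  # reduced from 128 in this commit

  # if using model parallelism, this is useful:
  tensor_parallel_axes: ["mlp", "heads"]
```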
