Error when saving the model during fine-tuning #2
Comments
I looked at this thread and downgraded some packages; whichever packages then raised errors, I upgraded, and after that it worked. Based on my experiments, it should be peft==0.12.0.
Thanks for the explanation. Could you share your environment configuration? There are too many package versions, and trying them one by one is too time-consuming.
peft==0.12.0
Setting save_safetensors=False in Seq2SeqTrainingArguments can help.
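For reference, a minimal sketch of that workaround, assuming a standard transformers Seq2SeqTrainer setup; output_dir and the other values are placeholders:

```python
from transformers import Seq2SeqTrainingArguments

# save_safetensors=False makes the Trainer fall back to torch.save
# (pytorch_model.bin) instead of safetensors, which sidesteps the
# shared-tensor check at save time.
training_args = Seq2SeqTrainingArguments(
    output_dir="./finetuned-model",     # placeholder path
    per_device_train_batch_size=8,      # placeholder value
    save_safetensors=False,             # the workaround suggested above
)
```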
@shuaijiang As in the title, the following error is raised when saving the fine-tuned model:

Some tensors share memory, this will lead to duplicate memory on disk and potential differences when loading them again: {failing}. A potential way to correctly save your model is to use save_model. More information at https://huggingface.co/docs/safetensors/torch_shared_tensors

How can this be solved?
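For context on the quoted error: safetensors refuses to write a state dict in which several entries point at the same underlying storage (for example a tied embedding and lm_head). A small, self-contained sketch that reproduces the message and shows the save_model alternative the error suggests; the model and file names here are made up for illustration:

```python
import torch
from safetensors.torch import save_file, save_model

class TiedModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = torch.nn.Embedding(10, 4)
        self.lm_head = torch.nn.Linear(4, 10, bias=False)
        self.lm_head.weight = self.embed.weight  # weight tying: two names, one storage

model = TiedModel()

# save_file rejects state dicts whose tensors share memory and raises the
# "Some tensors share memory ..." RuntimeError quoted in this issue.
try:
    save_file(model.state_dict(), "tied.safetensors")
except RuntimeError as err:
    print(err)

# save_model deduplicates shared tensors before writing, so it succeeds.
save_model(model, "tied.safetensors")
```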