Checklist
- [x] I have searched related issues but cannot get the expected help.
- [x] I have read the FAQ documentation but cannot get the expected help.
Hi, I have pretrained a model A (e.g. a ResNet) for 5 epochs, and I would like to load its parameters into model B (e.g. ResNet+FPN) and fine-tune B for 20 epochs. When I load the parameters from A into B with `resume_from`, training starts at epoch 5 (the 6th epoch), presumably because A's .pth file includes meta info marking it as the 5th-epoch checkpoint. How can I make the downstream model start training from epoch 0 without inheriting the training meta info from A?
This discussion was converted from issue #2464 on December 26, 2022 08:16.
Hope to get some suggestions, thanks!
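One workaround, if your framework distinguishes between resuming and plain weight loading (as MMDetection-style configs do with `load_from` vs. `resume_from`), is to load weights only. Alternatively, you can strip the training meta from the .pth file itself. Below is a minimal sketch of the second approach; it assumes the checkpoint is a dict with a `state_dict` key plus meta entries such as `meta` and `optimizer` (the mmcv-style layout) — the helper name `strip_training_meta` and the exact keys are assumptions you should check against your own checkpoint file.

```python
def strip_training_meta(checkpoint: dict) -> dict:
    """Return a new checkpoint dict containing only the model weights.

    Drops entries such as 'meta' (epoch/iteration counters) and
    'optimizer' state, so resuming from the result starts at epoch 0.
    """
    return {"state_dict": checkpoint["state_dict"]}


# Usage with PyTorch (paths are placeholders):
# import torch
# ckpt = torch.load("model_a.pth", map_location="cpu")
# torch.save(strip_training_meta(ckpt), "model_a_weights_only.pth")
```

After saving the weights-only file, pointing the fine-tuning config of model B at it should no longer carry over A's epoch counter.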