Hi, we use PyTorch Lightning for training, which saves the checkpoint together with other hyperparameters in a `*.ckpt` file. So you first need to extract the model's state dict from the `.ckpt` file (so that it matches my `.pth` checkpoints). Here is the code for this operation:
```python
import torch
from collections import OrderedDict

ckpt = torch.load('your_checkpoint.ckpt')  # replace with your .ckpt path
new_state_dict = OrderedDict()
for k in ckpt['state_dict']:
    print(k)
    # keep only the weights of the wrapped model (keys with the 'model.' prefix)
    if k[:6] != 'model.':
        continue
    name = k[6:]  # strip the 'model.' prefix
    new_state_dict[name] = ckpt['state_dict'][k]

your_model.load_state_dict(new_state_dict, strict=False)  # your model instance
```
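To see the key-renaming step in isolation, here is a minimal, self-contained sketch of the same prefix-stripping logic on a dummy state dict (the key names below are made up for illustration; in practice the dict comes from `ckpt['state_dict']`):

```python
from collections import OrderedDict

# Dummy stand-in for ckpt['state_dict'] as saved by PyTorch Lightning:
# the wrapped model's weights carry a 'model.' prefix, while other
# entries (e.g. a loss module) do not and should be skipped.
state_dict = OrderedDict([
    ('model.conv1.weight', 'w1'),
    ('model.fc.bias', 'b1'),
    ('loss_fn.weight', 'ignored'),
])

new_state_dict = OrderedDict()
for k, v in state_dict.items():
    if not k.startswith('model.'):
        continue  # not part of the wrapped model
    new_state_dict[k[len('model.'):]] = v  # strip the 'model.' prefix

print(list(new_state_dict))  # → ['conv1.weight', 'fc.bias']
```

The resulting dict can then be passed to `load_state_dict` on the bare model, exactly as in the snippet above.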
It seems that checkpoints are saved as `.ckpt` during training but `.pth` files are used during testing. Is this normal?
This will cause the weights to fail to load.