Hi, I'd like to ask: what value does the loss usually converge to during the pretraining stage? I'm running MAE-B pretraining on my own data, and the loss stops decreasing at around 0.2. When I overlay the reconstruction results on the original images, they all look quite blurry.
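For reference, here is a minimal sketch of producing such an overlay with a facebookresearch/mae-style API. The checkpoint path and the input batch are placeholders, not taken from this thread:

```python
import torch
import models_mae  # from the facebookresearch/mae repo; assumed available

# Placeholder input; substitute your own batch, normalized the same way
# as during pretraining.
imgs = torch.randn(1, 3, 224, 224)

model = models_mae.mae_vit_base_patch16()
ckpt = torch.load('checkpoint.pth', map_location='cpu')  # hypothetical path
model.load_state_dict(ckpt['model'], strict=False)
model.eval()

with torch.no_grad():
    loss, pred, mask = model(imgs, mask_ratio=0.75)

recon = model.unpatchify(pred)  # predicted patch sequence -> image, [N, 3, H, W]

# Expand the per-patch mask to pixel space: 1 = masked (reconstructed), 0 = visible.
p = model.patch_embed.patch_size[0]
mask = mask.unsqueeze(-1).repeat(1, 1, p * p * 3)
mask = model.unpatchify(mask)

# Keep visible patches from the input, paste reconstructions into masked ones.
overlay = imgs * (1 - mask) + recon * mask
```

Note that if pretraining used `--norm_pix_loss`, `pred` lives in a per-patch normalized space, so pasting it back directly will look washed out or blurry unless the per-patch mean and variance of the target are added back first.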
@hanfeng0409 Hi, how large is your dataset, and after how many epochs did the loss converge?
Thanks for the reply. The dataset has roughly 3,000 samples. I set 1,200 epochs and have actually run about 200 so far: in the first 30 epochs the loss dropped quickly from 0.8 to about 0.2, and over the following 100 epochs it has oscillated around 0.18.
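For context on what an absolute value like 0.18 means, here is a sketch of how the reconstruction loss is computed in the facebookresearch/mae reference implementation (assuming this repo follows it); `imgs`, `pred`, and `mask` follow the shapes from the forward pass above:

```python
# Sketch of MAE's forward_loss: MSE on (optionally per-patch normalized)
# pixels, averaged only over masked patches.
norm_pix_loss = True  # assumed flag; matches --norm_pix_loss in the reference repo

target = model.patchify(imgs)  # [N, L, p*p*3]
if norm_pix_loss:
    mean = target.mean(dim=-1, keepdim=True)
    var = target.var(dim=-1, keepdim=True)
    target = (target - mean) / (var + 1e-6) ** 0.5

loss = (pred - target) ** 2
loss = loss.mean(dim=-1)                 # mean loss per patch, [N, L]
loss = (loss * mask).sum() / mask.sum()  # average over masked patches only
```

Because the target may be per-patch normalized and the average runs only over masked patches, absolute plateau values like 0.2 or 0.18 are only comparable between runs with the same `norm_pix_loss` setting and mask ratio.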