positive_loss is much larger than negative_loss #15

Open
heiyuxiaokai opened this issue Oct 11, 2019 · 1 comment

heiyuxiaokai commented Oct 11, 2019

Train log:
2019-10-11 16:29:21,673 maskrcnn_benchmark.trainer INFO: eta: 1 day, 0:40:30 iter: 260 loss: 3.8217 (3.8392) negative_loss: 0.0326 (0.0354) positive_loss: 3.7731 (3.8038) time: 1.3759 (1.4870) data: 0.0050 (0.0068) lr: 0.000680 max mem: 7173
2019-10-11 16:29:48,522 maskrcnn_benchmark.trainer INFO: eta: 1 day, 0:29:44 iter: 280 loss: 3.7343 (3.8322) negative_loss: 0.0513 (0.0364) positive_loss: 3.6920 (3.7958) time: 1.2558 (1.4766) data: 0.0049 (0.0067) lr: 0.000707 max mem: 7173
2019-10-11 16:30:17,056 maskrcnn_benchmark.trainer INFO: eta: 1 day, 0:25:56 iter: 300 loss: 3.5909 (3.8169) negative_loss: 0.0517 (0.0395) positive_loss: 3.5172 (3.7775) time: 1.1965 (1.4733) data: 0.0047 (0.0066) lr: 0.000733 max mem: 7173
Is this normal?

@zhangxiaosong18 (Owner) commented

Yes. RetinaNet and FreeAnchor initialize the bias of the classification head so that the classifier predicts low scores at the start of training; since nearly all anchors are negatives (background), the loss each one contributes is close to zero, so the summed negative loss is very small compared with the positive loss.
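
For reference, this is the prior-probability initialization the RetinaNet paper describes: the bias of the final classification conv is set to -log((1 - π) / π) with π = 0.01, so sigmoid(bias) ≈ 0.01. A minimal PyTorch sketch; the layer shapes (`num_anchors`, `num_classes`, 256 channels) are illustrative assumptions, not copied from this repo:

```python
import math

import torch.nn as nn

# Illustrative shapes (typical COCO RetinaNet values) -- assumptions.
num_anchors, num_classes = 9, 80

# Final classification conv of the detection head.
cls_logits = nn.Conv2d(256, num_anchors * num_classes,
                       kernel_size=3, stride=1, padding=1)

# Prior probability pi = 0.01: every anchor starts out predicting
# roughly a 1% chance of being foreground.
prior_prob = 0.01
nn.init.normal_(cls_logits.weight, std=0.01)
# sigmoid(bias) == prior_prob, so initial scores are ~0.01 and the huge
# pool of negative (background) anchors contributes almost no loss.
nn.init.constant_(cls_logits.bias, -math.log((1 - prior_prob) / prior_prob))
```

With this initialization the negative loss starts near zero by design, while the positive loss starts large (positives are initially scored ~0.01 instead of ~1), which matches the ratio in the log above.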
