
Difference between code and paper #38

Open
lizleo opened this issue Dec 17, 2020 · 1 comment


lizleo commented Dec 17, 2020

When [equation], so [equation], then [equation] (the three equations, originally embedded as images, are missing from this copy),
but in your code https://github.com/zhangxiaosong18/FreeAnchor/blob/master/maskrcnn_benchmark/modeling/rpn/free_anchor_loss.py#L161 you only use the probability of the matched target class (matched_cls_prob in your code). That means you ignore the predicted probabilities of all the other classes that do not match the target class, and I think that differs from retinanet_cls_loss defined in https://github.com/zhangxiaosong18/FreeAnchor/blob/master/maskrcnn_benchmark/modeling/rpn/retinanet_loss.py#L142.

I tried to rewrite the code that calculates matched_cls_prob as below:

# build one-hot targets on the same device as the predictions
labels_mul = torch.zeros(len(labels_), self.num_classes, device=cls_prob_.device)
labels_mul[torch.arange(len(labels_)), labels_] = 1

# one copy of the target per matched (top-k) anchor
labels_mul = labels_mul.unsqueeze(1).repeat(1, self.pre_anchor_topk, 1)

# BCE over all classes; exp(-loss) is the joint probability over every class
loss_mul_class = nn.BCELoss(reduction="none")(cls_prob_[matched], labels_mul).sum(dim=-1)
matched_cls_prob = (-loss_mul_class).exp()
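As a self-contained illustration (pure Python with hypothetical numbers, not the repository's actual tensors), the two formulations give different values whenever a non-target class has nonzero predicted probability: the code's version keeps only the target-class probability, while exp(-BCE summed over classes) multiplies in a (1 - p) factor for every other class.

```python
import math

# hypothetical predicted probabilities for one anchor over 3 classes,
# with a one-hot target for class 1
p = [0.1, 0.8, 0.3]
target = 1

# formulation in free_anchor_loss.py: probability of the target class only
matched_cls_prob = p[target]  # 0.8

# BCE summed over all classes, then exp(-loss):
# equivalent to prod_k p[k]^b[k] * (1 - p[k])^(1 - b[k])
bce = -sum(
    (1.0 if k == target else 0.0) * math.log(p[k])
    + (0.0 if k == target else 1.0) * math.log(1.0 - p[k])
    for k in range(len(p))
)
bce_prob = math.exp(-bce)  # 0.8 * 0.9 * 0.7 = 0.504

print(matched_cls_prob, bce_prob)
```

So the BCE-based version also penalizes confident predictions on the non-matching classes, which is the difference being asked about.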

Did I get it wrong? @zhangxiaosong18

liuyang-ict commented

You know, if b_{i}^{cls}[k] = 1, the only way that can happen is k = i, because b_{i}^{cls} is a one-hot label.
