Cannot get similar AP on K2C #4
That is weird, because 4k iterations of source-domain pretraining alone can reach 46.4 mAP on the target-domain test set. Can you upload your complete training log?
I really appreciate your quick response. In addition, I ran a few experiments to make sure my dataset is well prepared; the results are as follows:
In the s2c and k2c experiments I used the default config files that you provided, but the results are not as good as yours. Is there any additional setting I have to apply? I also tried f2c and k2b, and those results are worse than plain Faster R-CNN. Do I need to tune some parameters, and do you have any tips for parameter tuning? Here are the log files for the four experiments (the k2bdd and f2c config settings are also included in the log files).
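As a general note on parameter tuning, detectron2-based projects usually let you override individual config keys on top of the YAML file, which makes quick sweeps easier. The sketch below assumes this repo follows detectron2's standard config conventions; the YAML path is a placeholder and only standard detectron2 solver keys are shown.

# Minimal sketch of a detectron2-style config override for a tuning sweep.
# Assumptions: the project follows detectron2's config conventions, and any
# project-specific keys are registered on cfg before merging its YAML
# (otherwise merge_from_file will reject unknown keys). The path is a placeholder.
from detectron2.config import get_cfg

cfg = get_cfg()
cfg.merge_from_file("configs/pt_k2c.yaml")   # placeholder config path
cfg.merge_from_list([
    "SOLVER.BASE_LR", "0.008",        # e.g. halve the LR when using fewer GPUs
    "SOLVER.IMS_PER_BATCH", "8",      # keep batch size and LR scaled together
    "SOLVER.MAX_ITER", "30000",
])
print(cfg.SOLVER.BASE_LR, cfg.SOLVER.IMS_PER_BATCH, cfg.SOLVER.MAX_ITER)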
@merlinarer Does the evaluation metric (AP) represent AP@0.5 or AP@0.5:0.95 for the car class?
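For reference, detectron2's "copypaste" evaluation line reports COCO-style AP (averaged over IoU 0.50:0.95) together with AP50 (IoU 0.50) and AP75 (IoU 0.75). Below is a minimal sketch, not taken from the PT repo, of how the two per-class numbers can be computed with pycocotools; the file paths and the car category index are placeholders.

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: COCO-format ground truth and detector outputs.
coco_gt = COCO("cityscapes_car_val.json")
coco_dt = coco_gt.loadRes("k2c_predictions.json")

ev = COCOeval(coco_gt, coco_dt, iouType="bbox")
ev.evaluate()
ev.accumulate()

# ev.eval["precision"] has shape [T, R, K, A, M]:
# T = 10 IoU thresholds (0.50:0.05:0.95), R = 101 recall points,
# K = categories, A = area ranges (0 = "all"), M = max-detection limits (2 = 100).
prec = ev.eval["precision"]
car = 0  # placeholder: index of the "car" category in ev.params.catIds

p50 = prec[0, :, car, 0, 2]     # IoU 0.50 only            -> AP@0.5 (AP50)
pall = prec[:, :, car, 0, 2]    # averaged over 0.50:0.95  -> COCO-style AP

ap50 = p50[p50 > -1].mean()
ap = pall[pall > -1].mean()
print(f"car AP@0.5 = {ap50:.4f}, AP@0.5:0.95 = {ap:.4f}")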
@merlinarer Do you have a plan to release the training log for K2C? I also have trouble reaching the final result with PT.
@thesuperorange Hello! The previous links are no longer valid. Do you still have the pretrained model './vgg16_caffe.pth'? I am in urgent need of it and would be very grateful for a reply.
I used the same config files to run Probabilistic Teacher on KITTI to Cityscapes (k2c).
According to your paper, the AP for k2c improves from 40.3 to 60.2.
However, I got only 23.8 for AP50.
By the way, on the README page it seems that only the f2c model weights and log files are provided (the k2c link points to the same files as f2c).
Thanks for your kind help!
The end of my training log is as follows:
[10/16 02:43:47] d2.evaluation.testing INFO: copypaste: AP,AP50,AP75
[10/16 02:43:47] d2.evaluation.testing INFO: copypaste: 9.6286,23.8045,7.0624
[10/16 02:43:47] d2.utils.events INFO: eta: 0:00:00 iter: 29999 total_loss: 3.475 loss_cls: 0.04451 loss_box_reg: 0.3731 loss_rpn_cls: 0.03362 loss_rpn_loc: 0.324 loss_cls_sup: 0.04147 loss_box_reg_sup: 0.3772 loss_rpn_cls_sup: 0.03775 loss_rpn_loc_sup: 0.3196 loss_cls_unsup: 0.292 loss_box_reg_unsup: 1.025 loss_rpn_cls_unsup: 0.3071 loss_rpn_loc_unsup: 1.081 time: 6.9902 data_time: 1.4791 lr: 0.016 max_mem: 19448M