About parameter on Pascal. #3

Closed · prettydong opened this issue Jul 25, 2022 · 10 comments

@prettydong

No description provided.

@prettydong (author) commented Jul 25, 2022

Your work is very interesting!
Would you share the precise hyperparameters you used on Pascal with me? Thank you!

prettydong changed the title from "About mAP" to "About parameter on Pascal." on Jul 25, 2022
@prettydong (author)

I used delta_rel 0.2, lr 1e-5, the IMAGENET1K_V2 weights for the ResNet-50 backbone, and 15 epochs.
Then I get a test mAP of 89.124 😊

@jameslahm

Hi, is this under the "End-to-End" training setting?

@prettydong (author)

> Hi, is this under the "End-to-End" training setting?

Yes, 'end2end' is the default setting.
I ran the code on a PC with a single RTX 3060 12GB, not the 4 GPUs used in the paper.
Also, it seems that the learning rate of the last linear layer is different from that of the ResNet backbone.

@jameslahm commented Jul 26, 2022

It seems that the paper also uses a single GPU on Pascal and only uses 4 GPUs on OpenImages.

> Also, it seems that the learning rate of the last linear layer is different from that of the ResNet backbone.

Yes, the learning rate of the last linear layer is 10 times larger than that of the ResNet backbone.
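
For readers reproducing this, the 10x split is typically expressed with PyTorch optimizer parameter groups. The sketch below is illustrative only: the `backbone` and `classifier` names, the SGD choice, and the base learning rate are assumptions for this example, not the repository's exact code.

```python
import torch
import torchvision

# Illustrative sketch of a 10x learning-rate split between the ResNet-50
# backbone and the final linear layer (names and values are assumptions,
# not necessarily what this repository uses).
backbone = torchvision.models.resnet50(weights=None)
backbone.fc = torch.nn.Identity()        # drop the ImageNet classification head
classifier = torch.nn.Linear(2048, 20)   # e.g. 20 classes for Pascal VOC

base_lr = 1e-5  # value discussed above; the optimizer choice here is only an example
optimizer = torch.optim.SGD(
    [
        {"params": backbone.parameters(), "lr": base_lr},
        {"params": classifier.parameters(), "lr": base_lr * 10},  # 10x for the last linear layer
    ],
    momentum=0.9,
)
```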

@jameslahm

> I used delta_rel 0.2, lr 1e-5, the IMAGENET1K_V2 weights for the ResNet-50 backbone, and 15 epochs.
> Then I get a test mAP of 89.124 😊

Hi @prettydong, did you use the LL-R method? I tried these hyperparameters under LL-R and only got a test mAP of 87.795. Did you change anything else? Thanks!

@prettydong (author) commented Jul 27, 2022

> > I used delta_rel 0.2, lr 1e-5, the IMAGENET1K_V2 weights for the ResNet-50 backbone, and 15 epochs. Then I get a test mAP of 89.124 😊
>
> Hi @prettydong, did you use the LL-R method? I tried these hyperparameters under LL-R and only got a test mAP of 87.795. Did you change anything else? Thanks!

I used LL-R with delta_rel 0.2 and lr 1e-5; the other parameters were the defaults. With that I got about 88 mAP. PyTorch warned me that I was using the old backbone weights, so I switched to IMAGENET1K_V2, which resizes images to 232; see the details in pytorch/vision#3995 (comment).
-> models.py:22: feature_extractor = torchvision.models.resnet50(weights=ResNet50_Weights.DEFAULT)
This backbone takes more time to train, and the best performance occurs around epoch 12-13, where mAP reaches 89.1.
I plan to do some incremental work on this paper. 😂 I'd be happy to discuss the details of the paper with you.
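
For context, the change described above maps onto the torchvision weights API roughly as follows. Whether the training pipeline actually uses the weights' bundled transforms is an assumption, so treat this as a sketch rather than the repository's exact code.

```python
import torchvision
from torchvision.models import ResNet50_Weights

# Sketch of the models.py change described above. In recent torchvision
# releases ResNet50_Weights.DEFAULT resolves to IMAGENET1K_V2, whose
# bundled preprocessing resizes images to 232 before the 224 center crop.
weights = ResNet50_Weights.DEFAULT  # i.e. ResNet50_Weights.IMAGENET1K_V2
feature_extractor = torchvision.models.resnet50(weights=weights)

# The weights object carries the matching preprocessing transforms
# (the training code may apply its own augmentation instead):
preprocess = weights.transforms()
print(preprocess)  # shows crop_size=[224], resize_size=[232], ...
```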

@youngwk (collaborator) commented Jul 29, 2022

Thanks for your interest.
For LL-R, I used delta_rel=0.1 and lr=2.5e-3 with the SGD optimizer.
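
For readers unfamiliar with the hyperparameter, delta_rel in LL-R controls how quickly the fraction of rejected large losses grows per epoch. The sketch below is a rough, generic illustration of that idea under assumed semantics (per-batch rejection of the largest losses among assumed-negative labels); the repository's actual implementation may differ.

```python
import torch
import torch.nn.functional as F

def ll_r_style_loss(logits, targets, epoch, delta_rel=0.1):
    """Rough illustration of LL-R-style large-loss rejection (not the repo's exact code).

    `targets` is a multi-hot float tensor where unobserved labels are filled
    with 0 (assumed negative). Each epoch the fraction of the largest losses
    rejected among those assumed negatives grows by `delta_rel`.
    """
    loss = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    rej_rate = min(delta_rel * epoch, 0.9)       # rejection fraction grows each epoch
    neg_loss = loss * (1.0 - targets)            # losses of assumed-negative labels only
    num_neg = int((1.0 - targets).sum().item())
    k = int(rej_rate * num_neg)
    if k > 0:
        threshold = torch.topk(neg_loss.flatten(), k).values.min()
        keep = (neg_loss < threshold) | (targets == 1)  # drop the k largest negative losses
        loss = loss * keep
    return loss.mean()
```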

@prettydong (author)

Thanks!

youngwk closed this as completed on Jul 29, 2022
@jameslahm

> > > I used delta_rel 0.2, lr 1e-5, the IMAGENET1K_V2 weights for the ResNet-50 backbone, and 15 epochs. Then I get a test mAP of 89.124 😊
> >
> > Hi @prettydong, did you use the LL-R method? I tried these hyperparameters under LL-R and only got a test mAP of 87.795. Did you change anything else? Thanks!
>
> I used LL-R with delta_rel 0.2 and lr 1e-5; the other parameters were the defaults. With that I got about 88 mAP. PyTorch warned me that I was using the old backbone weights, so I switched to IMAGENET1K_V2, which resizes images to 232; see the details in pytorch/vision#3995 (comment). In models.py:22: feature_extractor = torchvision.models.resnet50(weights=ResNet50_Weights.DEFAULT). This backbone takes more time to train, and the best performance occurs around epoch 12-13, where mAP reaches 89.1. I plan to do some incremental work on this paper. 😂 I'd be happy to discuss the details of the paper with you.

Thanks! I'd be happy too!
