Experiment on DenseNet with dilation #139

Open
Xiangyu-CAS opened this issue Jan 24, 2018 · 5 comments

@Xiangyu-CAS

Hi, recently I have carried out some experiments based on a DenseNet backbone. To produce feature maps at stride 8, the dilation (à trous) trick was used.

However, the results are really confusing: most of the refinement stages are not working. In the original implementation the loss decreases from stage 1 to stage 6, but in my experiment with dilation the loss stays nearly constant from stage 3 to stage 6.

Did you carry out any experiments using dilation and run into the same issue?
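
To be concrete about the trick (this is only a rough PyTorch sketch for illustration; the actual experiment was done in Caffe, and torchvision's DenseNet-121 layout is assumed here): the stride-2 pooling in the last two transition blocks is removed and the 3x3 convolutions in the following dense blocks are dilated, so the backbone output stays at stride 8 instead of 32.

```python
import torch.nn as nn
import torchvision

def densenet121_stride8():
    """Rough sketch of the dilation / a-trous trick on torchvision's DenseNet-121:
    drop the 2x downsampling in transition2/3 and dilate the 3x3 convolutions in
    the dense blocks that follow, so the output feature map stays at stride 8."""
    features = torchvision.models.densenet121().features
    for trans, block, rate in [("transition2", "denseblock3", 2),
                               ("transition3", "denseblock4", 4)]:
        getattr(features, trans).pool = nn.Identity()     # remove the stride-2 pooling
        for m in getattr(features, block).modules():
            if isinstance(m, nn.Conv2d) and m.kernel_size == (3, 3):
                m.dilation = (rate, rate)                 # preserve the receptive field
                m.padding = (rate, rate)                  # keep the spatial size unchanged
    return features
```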

Original:

I0824 06:42:20.602033 8724 solver.cpp:228] Iteration 55075, loss = 685.986
I0824 06:42:20.602083 8724 solver.cpp:244] Train net output #0: loss_stage1_L1 = 112.87 (* 1 = 112.87 loss)
I0824 06:42:20.602092 8724 solver.cpp:244] Train net output #1: loss_stage1_L2 = 33.2826 (* 1 = 33.2826 loss)
I0824 06:42:20.602097 8724 solver.cpp:244] Train net output #2: loss_stage2_L1 = 87.7518 (* 1 = 87.7518 loss)
I0824 06:42:20.602102 8724 solver.cpp:244] Train net output #3: loss_stage2_L2 = 27.0587 (* 1 = 27.0587 loss)
I0824 06:42:20.602108 8724 solver.cpp:244] Train net output #4: loss_stage3_L1 = 83.2692 (* 1 = 83.2692 loss)
I0824 06:42:20.602111 8724 solver.cpp:244] Train net output #5: loss_stage3_L2 = 25.0962 (* 1 = 25.0962 loss)
I0824 06:42:20.602116 8724 solver.cpp:244] Train net output #6: loss_stage4_L1 = 81.7719 (* 1 = 81.7719 loss)
I0824 06:42:20.602121 8724 solver.cpp:244] Train net output #7: loss_stage4_L2 = 24.2401 (* 1 = 24.2401 loss)
I0824 06:42:20.602126 8724 solver.cpp:244] Train net output #8: loss_stage5_L1 = 81.4189 (* 1 = 81.4189 loss)
I0824 06:42:20.602130 8724 solver.cpp:244] Train net output #9: loss_stage5_L2 = 23.8976 (* 1 = 23.8976 loss)
I0824 06:42:20.602136 8724 solver.cpp:244] Train net output #10: loss_stage6_L1 = 81.3313 (* 1 = 81.3313 loss)
I0824 06:42:20.602141 8724 solver.cpp:244] Train net output #11: loss_stage6_L2 = 23.9978 (* 1 = 23.9978 loss)

DenseNet + dilation:

I0831 21:17:55.839206 31806 solver.cpp:228] Iteration 80925, loss = 822.671
I0831 21:17:55.839253 31806 solver.cpp:244] Train net output #0: loss_stage1_L1 = 114.009 (* 1 = 114.009 loss)
I0831 21:17:55.839259 31806 solver.cpp:244] Train net output #1: loss_stage1_L2 = 29.0028 (* 1 = 29.0028 loss)
I0831 21:17:55.839264 31806 solver.cpp:244] Train net output #2: loss_stage2_L1 = 107.842 (* 1 = 107.842 loss)
I0831 21:17:55.839309 31806 solver.cpp:244] Train net output #3: loss_stage2_L2 = 27.9593 (* 1 = 27.9593 loss)
I0831 21:17:55.839316 31806 solver.cpp:244] Train net output #4: loss_stage3_L1 = 107.783 (* 1 = 107.783 loss)
I0831 21:17:55.839321 31806 solver.cpp:244] Train net output #5: loss_stage3_L2 = 27.661 (* 1 = 27.661 loss)
I0831 21:17:55.839326 31806 solver.cpp:244] Train net output #6: loss_stage4_L1 = 108.645 (* 1 = 108.645 loss)
I0831 21:17:55.839331 31806 solver.cpp:244] Train net output #7: loss_stage4_L2 = 27.7962 (* 1 = 27.7962 loss)
I0831 21:17:55.839336 31806 solver.cpp:244] Train net output #8: loss_stage5_L1 = 108.079 (* 1 = 108.079 loss)
I0831 21:17:55.839340 31806 solver.cpp:244] Train net output #9: loss_stage5_L2 = 27.6918 (* 1 = 27.6918 loss)
I0831 21:17:55.839345 31806 solver.cpp:244] Train net output #10: loss_stage6_L1 = 108.332 (* 1 = 108.332 loss)
I0831 21:17:55.839350 31806 solver.cpp:244] Train net output #11: loss_stage6_L2 = 27.8711 (* 1 = 27.8711 loss)
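
For reference (not part of the original logs), a small hypothetical helper like the one below sums the L1 + L2 branch losses per stage from lines like the excerpts above, which makes the stage-wise behaviour easier to see.

```python
import re
from collections import defaultdict

# Matches entries such as "loss_stage3_L1 = 83.2692" in the Caffe training log.
PATTERN = re.compile(r"loss_stage(\d+)_L\d+ = ([\d.]+)")

def stage_losses(log_lines):
    """Sum the L1 + L2 branch losses for each stage."""
    totals = defaultdict(float)
    for line in log_lines:
        for stage, value in PATTERN.findall(line):
            totals[int(stage)] += float(value)
    return dict(sorted(totals.items()))

# For the "DenseNet + dilation" iteration above this gives roughly
# {1: 143.0, 2: 135.8, 3: 135.4, 4: 136.4, 5: 135.8, 6: 136.2},
# i.e. stages 3-6 barely improve over stage 2, while the original log
# decreases monotonically from stage 1 to stage 6.
```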

@Ai-is-light

@Xiangyu-CAS Would you mind sharing the results of your experiment?

@Xiangyu-CAS
Author

Xiangyu-CAS commented Mar 12, 2018

@Ai-is-light,
ResNet50 (conv4,5 dilation, bn_istrain) + 2 stages: 55% (2644 images)
ResNet50 (conv4,5 dilation, bn_istrain) + 6 stages: 55% (2644 images)
ResNet50 (conv4,5 dilation, bn_freeze) + N stages: failed, <10%

DenseNet121 (conv4,5 dilation, bn_freeze) + 2 stages: 54%
DenseNet121 (conv4,5 dilation, bn_freeze) + 6 stages: 54%
DenseNet121 (conv4,5 dilation, bn_istrain) + 6 stages: 54%

Would you mind sharing yours?
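
For clarity, bn_istrain / bn_freeze above refer to whether the backbone's batch-norm layers keep updating during fine-tuning or are kept fixed at their pretrained statistics. A rough PyTorch analogue (the experiments themselves were in Caffe, where this roughly corresponds to toggling use_global_stats and the BN lr_mult) would be:

```python
import torch.nn as nn

def freeze_bn(backbone: nn.Module):
    """bn_freeze: use the pretrained running statistics and stop updating gamma/beta.
    bn_istrain is simply not calling this, so BN keeps adapting during fine-tuning."""
    for m in backbone.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.eval()                      # forward with stored running mean/var
            for p in m.parameters():
                p.requires_grad = False   # do not update gamma / beta
    # note: model.train() re-enables statistics updates, so re-apply after mode switches
```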

@Ai-is-light

@Xiangyu-CAS I haven't tried ResNet or DenseNet yet, but I will. I have tried the original network modified with dilated convolutions; in my work I need to pay more attention to speed. However, I failed when I tried to use BN layers in stages 2-6. For your result "ResNet50 (conv4,5 dilation, bn_istrain) + 2 stages: 55% (2644 images)", does that mean you used BN layers during training in every layer of stages 2-6? I trained the original network and got about 0.584 mAP on the 2644-image COCO subset, and about 0.543 mAP with dilated convolutions. Now I am trying MobileNet to train a faster model.

[figure: mAP on the COCO dataset]
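
To be concrete about "using BN layers in stages 2-6", a purely hypothetical sketch of one refinement stage with BatchNorm inserted after each convolution is shown below (a CPM-style 7x7 stage layout is assumed; the exact architecture in the experiments above may differ).

```python
import torch.nn as nn

def refinement_stage(in_ch, out_ch, width=128, with_bn=True):
    """One refinement stage: five 7x7 convs followed by two 1x1 convs,
    optionally with BatchNorm after every 7x7 convolution."""
    layers, ch = [], in_ch
    for _ in range(5):
        layers += [nn.Conv2d(ch, width, kernel_size=7, padding=3)]
        if with_bn:
            layers += [nn.BatchNorm2d(width)]
        layers += [nn.ReLU(inplace=True)]
        ch = width
    layers += [nn.Conv2d(ch, width, kernel_size=1), nn.ReLU(inplace=True),
               nn.Conv2d(width, out_ch, kernel_size=1)]   # 1x1 convs to the output maps
    return nn.Sequential(*layers)
```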

@hellojialee

Is the performance you reported obtained on COCO2017 or COCO2014?

@Hanhanhan11

(quoting @Ai-is-light's comment above)

Hi, did you end up using MobileNet to train the model? I have been trying it for a long time. Did you change the crop size to 128 to train this model? How is the performance? Thanks.
