
Results cannot be reproduced? #27

Open
cq0907 opened this issue Nov 12, 2022 · 13 comments

@cq0907

cq0907 commented Nov 12, 2022

I trained a model with this command:
[screenshot of the training command]
After training for 98 epochs, I got the following results:
[screenshot of the training results]
Why are my results so different from yours?
[screenshot of the reported results]

The settings are:
[screenshot of the settings]

@zesenwu23
Contributor

Hi, can you upload the training log for the first 10 epochs?

@cq0907
Author

cq0907 commented Nov 12, 2022

> Hi, can you upload the training log for the first 10 epochs?

Epoch: [2][100/695] Time: 0.272 (0.278) lr:0.030 Loss: 89.4270 (88.8914) iLoss: 1.4270 (0.8914) TLoss: 88.0000 (88.0000) KLoss: 8.2916 (8.5085) Accu: 38.63
Epoch: [2][150/695] Time: 0.275 (0.276) lr:0.030 Loss: 88.7448 (88.8946) iLoss: 0.7448 (0.8946) TLoss: 88.0000 (88.0000) KLoss: 8.2963 (8.5243) Accu: 38.67
Epoch: [2][200/695] Time: 0.273 (0.276) lr:0.030 Loss: 88.8029 (88.8883) iLoss: 0.8029 (0.8883) TLoss: 88.0000 (88.0000) KLoss: 8.3028 (8.6061) Accu: 38.67
Epoch: [2][250/695] Time: 0.273 (0.275) lr:0.030 Loss: 88.3966 (88.8919) iLoss: 0.3966 (0.8919) TLoss: 88.0000 (88.0000) KLoss: 7.6682 (8.7429) Accu: 38.68
Epoch: [2][300/695] Time: 0.273 (0.275) lr:0.030 Loss: 88.9312 (88.8849) iLoss: 0.9312 (0.8849) TLoss: 88.0000 (88.0000) KLoss: 8.1458 (8.8047) Accu: 38.83
Epoch: [2][350/695] Time: 0.273 (0.275) lr:0.030 Loss: 88.9140 (88.8733) iLoss: 0.9140 (0.8733) TLoss: 88.0000 (88.0000) KLoss: 8.9773 (8.8654) Accu: 38.94
Epoch: [2][400/695] Time: 0.273 (0.275) lr:0.030 Loss: 88.6361 (88.8473) iLoss: 0.6361 (0.8473) TLoss: 88.0000 (88.0000) KLoss: 9.0357 (8.8763) Accu: 39.25
Epoch: [2][450/695] Time: 0.282 (0.274) lr:0.030 Loss: 88.4686 (88.8283) iLoss: 0.4686 (0.8283) TLoss: 88.0000 (88.0000) KLoss: 9.5262 (8.8743) Accu: 39.52
Epoch: [2][500/695] Time: 0.273 (0.274) lr:0.030 Loss: 88.6134 (88.8055) iLoss: 0.6134 (0.8055) TLoss: 88.0000 (88.0000) KLoss: 9.1093 (8.8821) Accu: 39.80
Epoch: [2][550/695] Time: 0.273 (0.274) lr:0.030 Loss: 88.6441 (88.7872) iLoss: 0.6441 (0.7872) TLoss: 88.0000 (88.0000) KLoss: 9.5906 (8.8930) Accu: 39.99
Epoch: [2][600/695] Time: 0.272 (0.274) lr:0.030 Loss: 88.1605 (88.7717) iLoss: 0.1605 (0.7717) TLoss: 88.0000 (88.0000) KLoss: 8.2677 (8.9089) Accu: 40.20
Epoch: [2][650/695] Time: 0.273 (0.274) lr:0.030 Loss: 88.3207 (88.7560) iLoss: 0.3207 (0.7560) TLoss: 88.0000 (88.0000) KLoss: 8.3744 (8.9002) Accu: 40.40
Test Epoch: 2
Extracting Gallery Feature...
Extracting Time: 3.557
Extracting Query Feature...
Extracting Time: 5.575
Evaluation Time: 4.138
POOL: Rank-1: 16.51% | Rank-5: 36.94% | Rank-10: 48.36%| Rank-20: 62.79%| mAP: 15.83%| mINP: 7.45%
FC: Rank-1: 21.11% | Rank-5: 45.31% | Rank-10: 59.77%| Rank-20: 75.49%| mAP: 20.87%| mINP: 10.63%
Best Epoch [2]
==> Preparing Data Loader...
3
[18948 18933 18940 ... 17397 17443 17421]
[8563 8548 8563 ... 7494 7504 7498]
Epoch: [3][0/695] Time: 0.716 (0.716) lr:0.040 Loss: 88.6011 (88.6011) iLoss: 0.6011 (0.6011) TLoss: 88.0000 (88.0000) KLoss: 8.6152 (8.6152) Accu: 41.15
Epoch: [3][50/695] Time: 0.273 (0.282) lr:0.040 Loss: 88.3976 (88.6086) iLoss: 0.3976 (0.6086) TLoss: 88.0000 (88.0000) KLoss: 9.9551 (8.8652) Accu: 42.22
Epoch: [3][100/695] Time: 0.272 (0.278) lr:0.040 Loss: 88.9570 (88.6278) iLoss: 0.9570 (0.6278) TLoss: 88.0000 (88.0000) KLoss: 8.7417 (8.9135) Accu: 42.08
Epoch: [3][150/695] Time: 0.274 (0.276) lr:0.040 Loss: 88.3955 (88.6135) iLoss: 0.3955 (0.6135) TLoss: 88.0000 (88.0000) KLoss: 8.9246 (8.9067) Accu: 42.14
Epoch: [3][200/695] Time: 0.274 (0.276) lr:0.040 Loss: 88.6157 (88.6116) iLoss: 0.6157 (0.6116) TLoss: 88.0000 (88.0000) KLoss: 9.3078 (8.9218) Accu: 42.15
Epoch: [3][250/695] Time: 0.274 (0.275) lr:0.040 Loss: 88.8796 (88.6140) iLoss: 0.8796 (0.6140) TLoss: 88.0000 (88.0000) KLoss: 9.9581 (8.9296) Accu: 42.08
Epoch: [3][300/695] Time: 0.274 (0.275) lr:0.040 Loss: 88.3863 (88.6050) iLoss: 0.3863 (0.6050) TLoss: 88.0000 (88.0000) KLoss: 8.9333 (8.9124) Accu: 42.18
Epoch: [3][350/695] Time: 0.273 (0.275) lr:0.040 Loss: 88.2817 (88.5953) iLoss: 0.2817 (0.5953) TLoss: 88.0000 (88.0000) KLoss: 8.7242 (8.9181) Accu: 42.30
Epoch: [3][400/695] Time: 0.273 (0.275) lr:0.040 Loss: 88.3939 (88.5864) iLoss: 0.3939 (0.5864) TLoss: 88.0000 (88.0000) KLoss: 8.1597 (8.9067) Accu: 42.39
Epoch: [3][450/695] Time: 0.274 (0.275) lr:0.040 Loss: 88.2766 (88.5786) iLoss: 0.2766 (0.5786) TLoss: 88.0000 (88.0000) KLoss: 8.1062 (8.8791) Accu: 42.50
Epoch: [3][500/695] Time: 0.273 (0.275) lr:0.040 Loss: 89.1530 (88.5682) iLoss: 1.1530 (0.5682) TLoss: 88.0000 (88.0000) KLoss: 9.7190 (8.8280) Accu: 42.66
Epoch: [3][550/695] Time: 0.274 (0.274) lr:0.040 Loss: 88.3128 (88.5551) iLoss: 0.3128 (0.5551) TLoss: 88.0000 (88.0000) KLoss: 8.3632 (8.7564) Accu: 42.85
Epoch: [3][600/695] Time: 0.274 (0.274) lr:0.040 Loss: 88.4823 (88.5447) iLoss: 0.4823 (0.5447) TLoss: 88.0000 (88.0000) KLoss: 8.5399 (8.7063) Accu: 42.97
Epoch: [3][650/695] Time: 0.272 (0.274) lr:0.040 Loss: 88.4157 (88.5339) iLoss: 0.4157 (0.5339) TLoss: 88.0000 (88.0000) KLoss: 7.1020 (8.6463) Accu: 43.12
==> Preparing Data Loader...
4
[6088 6095 6073 ... 1956 1884 1946]
[2636 2611 2614 ... 774 774 791]
Epoch: [4][0/695] Time: 0.700 (0.700) lr:0.050 Loss: 88.5157 (88.5157) iLoss: 0.5157 (0.5157) TLoss: 88.0000 (88.0000) KLoss: 8.2325 (8.2325) Accu: 43.23
Epoch: [4][50/695] Time: 0.273 (0.282) lr:0.050 Loss: 88.4549 (88.4709) iLoss: 0.4549 (0.4709) TLoss: 88.0000 (88.0000) KLoss: 7.5466 (7.9749) Accu: 43.85
Epoch: [4][100/695] Time: 0.273 (0.278) lr:0.050 Loss: 88.3378 (88.4647) iLoss: 0.3378 (0.4647) TLoss: 88.0000 (88.0000) KLoss: 8.2959 (8.0032) Accu: 44.03
Epoch: [4][150/695] Time: 0.273 (0.276) lr:0.050 Loss: 88.4724 (88.4954) iLoss: 0.4724 (0.4954) TLoss: 88.0000 (88.0000) KLoss: 7.4439 (8.0202) Accu: 43.67
Epoch: [4][200/695] Time: 0.273 (0.276) lr:0.050 Loss: 88.4874 (88.4942) iLoss: 0.4874 (0.4942) TLoss: 88.0000 (88.0000) KLoss: 7.8594 (8.0040) Accu: 43.64
Epoch: [4][250/695] Time: 0.273 (0.275) lr:0.050 Loss: 88.7037 (88.4915) iLoss: 0.7037 (0.4915) TLoss: 88.0000 (88.0000) KLoss: 8.2477 (8.0059) Accu: 43.69
Epoch: [4][300/695] Time: 0.273 (0.275) lr:0.050 Loss: 88.6629 (88.4894) iLoss: 0.6629 (0.4894) TLoss: 88.0000 (88.0000) KLoss: 7.9568 (7.9894) Accu: 43.69
Epoch: [4][350/695] Time: 0.274 (0.275) lr:0.050 Loss: 88.3515 (88.4847) iLoss: 0.3515 (0.4847) TLoss: 88.0000 (88.0000) KLoss: 8.9861 (7.9794) Accu: 43.75
Epoch: [4][400/695] Time: 0.272 (0.275) lr:0.050 Loss: 88.5411 (88.4831) iLoss: 0.5411 (0.4834) TLoss: 88.0000 (87.9997) KLoss: 8.5402 (7.9610) Accu: 43.75
Epoch: [4][450/695] Time: 0.273 (0.274) lr:0.050 Loss: 88.2826 (88.4732) iLoss: 0.2826 (0.4735) TLoss: 88.0000 (87.9997) KLoss: 7.8360 (7.9248) Accu: 43.86
Epoch: [4][500/695] Time: 0.273 (0.274) lr:0.050 Loss: 88.2636 (88.4644) iLoss: 0.2636 (0.4646) TLoss: 88.0000 (87.9998) KLoss: 6.3330 (7.8879) Accu: 43.96
Epoch: [4][550/695] Time: 0.272 (0.274) lr:0.050 Loss: 88.1958 (88.4555) iLoss: 0.1958 (0.4557) TLoss: 88.0000 (87.9998) KLoss: 7.0777 (7.8301) Accu: 44.08
Epoch: [4][600/695] Time: 0.272 (0.274) lr:0.050 Loss: 88.3473 (88.4462) iLoss: 0.3473 (0.4464) TLoss: 88.0000 (87.9998) KLoss: 6.7138 (7.7825) Accu: 44.18
Epoch: [4][650/695] Time: 0.272 (0.274) lr:0.050 Loss: 88.4024 (88.4374) iLoss: 0.4024 (0.4376) TLoss: 88.0000 (87.9998) KLoss: 6.4832 (7.7257) Accu: 44.31
Test Epoch: 4
Extracting Gallery Feature...
Extracting Time: 0.779
Extracting Query Feature...
Extracting Time: 4.350
Evaluation Time: 4.090
POOL: Rank-1: 19.85% | Rank-5: 41.13% | Rank-10: 53.17%| Rank-20: 67.84%| mAP: 18.52%| mINP: 8.58%
FC: Rank-1: 26.16% | Rank-5: 54.25% | Rank-10: 68.26%| Rank-20: 81.72%| mAP: 25.37%| mINP: 13.49%
Best Epoch [4]
==> Preparing Data Loader...
5
[18073 18002 18005 ... 3811 3863 3861]
[7818 7808 7819 ... 1657 1665 1673]
Epoch: [5][0/695] Time: 0.713 (0.713) lr:0.060 Loss: 88.1604 (88.1604) iLoss: 0.1604 (0.1604) TLoss: 88.0000 (88.0000) KLoss: 6.1973 (6.1973) Accu: 47.40
Epoch: [5][50/695] Time: 0.273 (0.281) lr:0.060 Loss: 88.3838 (88.3612) iLoss: 0.3838 (0.3612) TLoss: 88.0000 (88.0000) KLoss: 6.4952 (6.8451) Accu: 45.37
Epoch: [5][100/695] Time: 0.273 (0.277) lr:0.060 Loss: 88.3777 (88.3803) iLoss: 0.3777 (0.3803) TLoss: 88.0000 (88.0000) KLoss: 7.1354 (6.8482) Accu: 45.18
Epoch: [5][150/695] Time: 0.273 (0.276) lr:0.060 Loss: 88.2980 (88.3776) iLoss: 0.2980 (0.3776) TLoss: 88.0000 (88.0000) KLoss: 6.5694 (6.8887) Accu: 45.23
Epoch: [5][200/695] Time: 0.272 (0.275) lr:0.060 Loss: 88.4809 (88.3864) iLoss: 0.4809 (0.3864) TLoss: 88.0000 (88.0000) KLoss: 6.9022 (6.8921) Accu: 45.09
Epoch: [5][250/695] Time: 0.273 (0.275) lr:0.060 Loss: 88.2699 (88.3864) iLoss: 0.2699 (0.3864) TLoss: 88.0000 (88.0000) KLoss: 6.3708 (6.8662) Accu: 45.12
Epoch: [5][300/695] Time: 0.277 (0.275) lr:0.060 Loss: 88.3163 (88.3776) iLoss: 0.3163 (0.3776) TLoss: 88.0000 (88.0000) KLoss: 6.2582 (6.8536) Accu: 45.22
Epoch: [5][350/695] Time: 0.272 (0.275) lr:0.060 Loss: 88.1860 (88.3741) iLoss: 0.1860 (0.3741) TLoss: 88.0000 (88.0000) KLoss: 5.9200 (6.8137) Accu: 45.29
Epoch: [5][400/695] Time: 0.273 (0.274) lr:0.060 Loss: 88.1951 (88.3686) iLoss: 0.1951 (0.3686) TLoss: 88.0000 (88.0000) KLoss: 6.1012 (6.7744) Accu: 45.37
Epoch: [5][450/695] Time: 0.274 (0.274) lr:0.060 Loss: 88.5585 (88.3654) iLoss: 0.5585 (0.3654) TLoss: 88.0000 (88.0000) KLoss: 6.5597 (6.7349) Accu: 45.40
Epoch: [5][500/695] Time: 0.274 (0.274) lr:0.060 Loss: 88.4268 (88.3627) iLoss: 0.4268 (0.3627) TLoss: 88.0000 (88.0000) KLoss: 6.4857 (6.6911) Accu: 45.43
Epoch: [5][550/695] Time: 0.274 (0.274) lr:0.060 Loss: 88.0958 (88.3602) iLoss: 0.0958 (0.3602) TLoss: 88.0000 (88.0000) KLoss: 5.7822 (6.6420) Accu: 45.48
Epoch: [5][600/695] Time: 0.274 (0.274) lr:0.060 Loss: 21.2410 (87.7977) iLoss: 6.9059 (0.3980) TLoss: 14.3351 (87.3997) KLoss: 4.4297 (6.5908) Accu: 45.21
Epoch: [5][650/695] Time: 0.273 (0.274) lr:0.060 Loss: 7.0363 (81.7301) iLoss: 6.2938 (0.8793) TLoss: 0.7425 (80.8508) KLoss: 1.6672 (6.2382) Accu: 41.97
==> Preparing Data Loader...
6
[16271 16340 16342 ... 20425 20415 20423]
[ 7014 7047 7021 ... 10039 10044 10041]
Epoch: [6][0/695] Time: 0.686 (0.686) lr:0.070 Loss: 6.7556 (6.7556) iLoss: 6.0077 (6.0077) TLoss: 0.7479 (0.7479) KLoss: 1.2174 (1.2174) Accu: 0.52
Epoch: [6][50/695] Time: 0.277 (0.282) lr:0.070 Loss: 5.8677 (6.4769) iLoss: 5.1514 (5.7499) TLoss: 0.7163 (0.7270) KLoss: 1.3028 (1.2581) Accu: 1.79
Epoch: [6][100/695] Time: 0.273 (0.278) lr:0.070 Loss: 6.1910 (6.3815) iLoss: 5.4860 (5.6604) TLoss: 0.7050 (0.7212) KLoss: 1.3270 (1.2567) Accu: 2.03
Epoch: [6][150/695] Time: 0.282 (0.276) lr:0.070 Loss: 6.5671 (6.2769) iLoss: 5.8560 (5.5607) TLoss: 0.7111 (0.7163) KLoss: 1.3876 (1.2966) Accu: 2.74
Epoch: [6][200/695] Time: 0.272 (0.276) lr:0.070 Loss: 6.2862 (6.1988) iLoss: 5.5866 (5.4854) TLoss: 0.6996 (0.7134) KLoss: 1.6873 (1.3396) Accu: 3.32
Epoch: [6][250/695] Time: 0.278 (0.275) lr:0.070 Loss: 5.9715 (6.1229) iLoss: 5.2693 (5.4122) TLoss: 0.7022 (0.7108) KLoss: 1.5540 (1.3893) Accu: 4.17
Epoch: [6][300/695] Time: 0.273 (0.275) lr:0.070 Loss: 6.2065 (6.0643) iLoss: 5.5080 (5.3556) TLoss: 0.6984 (0.7087) KLoss: 1.5774 (1.4518) Accu: 5.44
Epoch: [6][350/695] Time: 0.272 (0.275) lr:0.070 Loss: 5.7395 (6.0074) iLoss: 5.0407 (5.3004) TLoss: 0.6989 (0.7070) KLoss: 2.0679 (1.5042) Accu: 6.57
Epoch: [6][400/695] Time: 0.273 (0.274) lr:0.070 Loss: 5.6376 (5.9538) iLoss: 4.9428 (5.2479) TLoss: 0.6948 (0.7059) KLoss: 1.8836 (1.5573) Accu: 7.01
Epoch: [6][450/695] Time: 0.272 (0.274) lr:0.070 Loss: 5.3541 (5.9083) iLoss: 4.6586 (5.2034) TLoss: 0.6954 (0.7048) KLoss: 1.9755 (1.6113) Accu: 7.83
Epoch: [6][500/695] Time: 0.273 (0.274) lr:0.070 Loss: 5.9325 (5.8653) iLoss: 5.2368 (5.1614) TLoss: 0.6957 (0.7039) KLoss: 2.3936 (1.6686) Accu: 8.88
Epoch: [6][550/695] Time: 0.273 (0.274) lr:0.070 Loss: 4.9913 (5.8203) iLoss: 4.2986 (5.1171) TLoss: 0.6927 (0.7031) KLoss: 2.3705 (1.7250) Accu: 9.90
Epoch: [6][600/695] Time: 0.273 (0.274) lr:0.070 Loss: 4.6970 (5.7804) iLoss: 4.0035 (5.0781) TLoss: 0.6935 (0.7024) KLoss: 2.2517 (1.7815) Accu: 11.08
Epoch: [6][650/695] Time: 0.273 (0.274) lr:0.070 Loss: 5.2744 (5.7394) iLoss: 4.5821 (5.0378) TLoss: 0.6923 (0.7017) KLoss: 3.0256 (1.8371) Accu: 12.47
Test Epoch: 6
Extracting Gallery Feature...
Extracting Time: 0.801
Extracting Query Feature...
Extracting Time: 4.326
Evaluation Time: 4.128
POOL: Rank-1: 1.37% | Rank-5: 7.07% | Rank-10: 13.91%| Rank-20: 26.35%| mAP: 3.65%| mINP: 2.11%
FC: Rank-1: 1.05% | Rank-5: 6.68% | Rank-10: 13.96%| Rank-20: 24.93%| mAP: 3.69%| mINP: 2.33%
Best Epoch [4]
==> Preparing Data Loader...
7
[19199 19201 19203 ... 16944 17016 16951]
[8819 8817 8822 ... 7317 7313 7313]
Epoch: [7][0/695] Time: 0.716 (0.716) lr:0.080 Loss: 5.4215 (5.4215) iLoss: 4.7287 (4.7287) TLoss: 0.6928 (0.6928) KLoss: 3.0571 (3.0571) Accu: 30.21
Epoch: [7][50/695] Time: 0.283 (0.282) lr:0.080 Loss: 5.4436 (5.1843) iLoss: 4.7481 (4.4912) TLoss: 0.6955 (0.6932) KLoss: 2.5890 (2.7173) Accu: 28.96
Epoch: [7][100/695] Time: 0.273 (0.278) lr:0.080 Loss: 5.1101 (5.1418) iLoss: 4.4171 (4.4488) TLoss: 0.6930 (0.6930) KLoss: 2.6546 (2.7395) Accu: 29.82
Epoch: [7][150/695] Time: 0.271 (0.276) lr:0.080 Loss: 4.7229 (5.0963) iLoss: 4.0310 (4.4034) TLoss: 0.6919 (0.6929) KLoss: 2.7361 (2.7750) Accu: 30.57
Epoch: [7][200/695] Time: 0.274 (0.275) lr:0.080 Loss: 4.6991 (5.0435) iLoss: 4.0098 (4.3512) TLoss: 0.6893 (0.6923) KLoss: 3.1200 (2.8529) Accu: 32.34
Epoch: [7][250/695] Time: 0.273 (0.275) lr:0.080 Loss: 5.7062 (5.0283) iLoss: 5.0134 (4.3362) TLoss: 0.6928 (0.6920) KLoss: 2.8371 (2.9129) Accu: 33.30
Epoch: [7][300/695] Time: 0.274 (0.275) lr:0.080 Loss: 4.8394 (4.9915) iLoss: 4.1478 (4.2997) TLoss: 0.6916 (0.6918) KLoss: 3.1658 (2.9507) Accu: 34.38
Epoch: [7][350/695] Time: 0.273 (0.274) lr:0.080 Loss: 5.1756 (4.9720) iLoss: 4.4841 (4.2805) TLoss: 0.6915 (0.6915) KLoss: 3.5287 (2.9975) Accu: 35.17
Epoch: [7][400/695] Time: 0.272 (0.274) lr:0.080 Loss: 4.3831 (4.9398) iLoss: 3.6960 (4.2486) TLoss: 0.6871 (0.6912) KLoss: 3.2113 (3.0200) Accu: 36.13
Epoch: [7][450/695] Time: 0.273 (0.274) lr:0.080 Loss: 5.0209 (4.9078) iLoss: 4.3340 (4.2169) TLoss: 0.6869 (0.6909) KLoss: 3.7779 (3.0700) Accu: 37.02
Epoch: [7][500/695] Time: 0.274 (0.274) lr:0.080 Loss: 4.8175 (4.8818) iLoss: 4.1292 (4.1913) TLoss: 0.6883 (0.6906) KLoss: 3.9559 (3.1194) Accu: 37.55
Epoch: [7][550/695] Time: 0.277 (0.274) lr:0.080 Loss: 4.2501 (4.8495) iLoss: 3.5702 (4.1594) TLoss: 0.6799 (0.6901) KLoss: 3.4374 (3.1671) Accu: 38.31
Epoch: [7][600/695] Time: 0.274 (0.274) lr:0.080 Loss: 4.5579 (4.8126) iLoss: 3.8681 (4.1232) TLoss: 0.6898 (0.6895) KLoss: 4.0766 (3.2236) Accu: 39.06
Epoch: [7][650/695] Time: 0.278 (0.274) lr:0.080 Loss: 4.2724 (4.7809) iLoss: 3.5864 (4.0918) TLoss: 0.6860 (0.6890) KLoss: 3.7125 (3.2726) Accu: 39.65
==> Preparing Data Loader...
8
[10436 10428 10497 ... 16615 16608 16626]
[4488 4496 4490 ... 7152 7141 7142]
Epoch: [8][0/695] Time: 0.705 (0.705) lr:0.090 Loss: 4.5153 (4.5153) iLoss: 3.8360 (3.8360) TLoss: 0.6793 (0.6793) KLoss: 4.3656 (4.3656) Accu: 42.71
Epoch: [8][50/695] Time: 0.273 (0.282) lr:0.090 Loss: 4.7377 (4.2681) iLoss: 4.0614 (3.5888) TLoss: 0.6764 (0.6793) KLoss: 4.0565 (4.2724) Accu: 49.27
Epoch: [8][100/695] Time: 0.273 (0.277) lr:0.090 Loss: 4.3135 (4.2672) iLoss: 3.6423 (3.5883) TLoss: 0.6712 (0.6789) KLoss: 3.9947 (4.1619) Accu: 48.81
Epoch: [8][150/695] Time: 0.273 (0.276) lr:0.090 Loss: 3.4615 (4.2268) iLoss: 2.7908 (3.5479) TLoss: 0.6707 (0.6789) KLoss: 4.0730 (4.2252) Accu: 48.83
Epoch: [8][200/695] Time: 0.272 (0.275) lr:0.090 Loss: 4.0212 (4.1892) iLoss: 3.3501 (3.5113) TLoss: 0.6711 (0.6779) KLoss: 4.3166 (4.2836) Accu: 49.45
Epoch: [8][250/695] Time: 0.273 (0.275) lr:0.090 Loss: 4.3676 (4.1472) iLoss: 3.7044 (3.4704) TLoss: 0.6632 (0.6768) KLoss: 4.7292 (4.3411) Accu: 49.94
Epoch: [8][300/695] Time: 0.276 (0.275) lr:0.090 Loss: 4.1010 (4.1313) iLoss: 3.4423 (3.4550) TLoss: 0.6587 (0.6763) KLoss: 4.7060 (4.3611) Accu: 50.03
Epoch: [8][350/695] Time: 0.272 (0.274) lr:0.090 Loss: 4.1517 (4.0975) iLoss: 3.4715 (3.4221) TLoss: 0.6802 (0.6754) KLoss: 4.9141 (4.4090) Accu: 50.49
Epoch: [8][400/695] Time: 0.273 (0.274) lr:0.090 Loss: 3.8435 (4.0659) iLoss: 3.1813 (3.3920) TLoss: 0.6623 (0.6739) KLoss: 5.2494 (4.4566) Accu: 51.01
Epoch: [8][450/695] Time: 0.273 (0.274) lr:0.090 Loss: 3.6860 (4.0329) iLoss: 3.0416 (3.3602) TLoss: 0.6444 (0.6728) KLoss: 4.6557 (4.5093) Accu: 51.46
Epoch: [8][500/695] Time: 0.273 (0.274) lr:0.090 Loss: 3.7771 (3.9927) iLoss: 3.1474 (3.3220) TLoss: 0.6297 (0.6707) KLoss: 4.8703 (4.5578) Accu: 52.09
Epoch: [8][550/695] Time: 0.273 (0.274) lr:0.090 Loss: 3.3321 (3.9598) iLoss: 2.6735 (3.2901) TLoss: 0.6586 (0.6698) KLoss: 5.0266 (4.6114) Accu: 52.57
Epoch: [8][600/695] Time: 0.273 (0.274) lr:0.090 Loss: 3.7681 (3.9360) iLoss: 3.1343 (3.2677) TLoss: 0.6338 (0.6683) KLoss: 5.0208 (4.6551) Accu: 52.88
Epoch: [8][650/695] Time: 0.274 (0.274) lr:0.090 Loss: 3.3970 (3.9080) iLoss: 2.7462 (3.2409) TLoss: 0.6507 (0.6671) KLoss: 5.3225 (4.7055) Accu: 53.19
Test Epoch: 8
Extracting Gallery Feature...
Extracting Time: 0.808
Extracting Query Feature...
Extracting Time: 4.322
Evaluation Time: 4.040
POOL: Rank-1: 3.29% | Rank-5: 11.78% | Rank-10: 22.09%| Rank-20: 40.84%| mAP: 5.78%| mINP: 3.18%
FC: Rank-1: 2.84% | Rank-5: 11.73% | Rank-10: 19.88%| Rank-20: 35.02%| mAP: 5.09%| mINP: 2.43%
Best Epoch [4]
==> Preparing Data Loader...
9
[21687 21680 21688 ... 20999 20988 20992]
[11329 11337 11337 ... 10667 10661 10658]
Epoch: [9][0/695] Time: 0.697 (0.697) lr:0.100 Loss: 3.1833 (3.1833) iLoss: 2.5300 (2.5300) TLoss: 0.6533 (0.6533) KLoss: 5.8047 (5.8047) Accu: 59.90
Epoch: [9][50/695] Time: 0.273 (0.281) lr:0.100 Loss: 4.1611 (3.5205) iLoss: 3.5199 (2.8762) TLoss: 0.6411 (0.6443) KLoss: 5.5921 (5.4184) Accu: 58.81
Epoch: [9][100/695] Time: 0.272 (0.277) lr:0.100 Loss: 3.3032 (3.4773) iLoss: 2.6719 (2.8344) TLoss: 0.6313 (0.6429) KLoss: 5.8251 (5.4112) Accu: 59.59
Epoch: [9][150/695] Time: 0.273 (0.276) lr:0.100 Loss: 3.6473 (3.4583) iLoss: 3.0196 (2.8168) TLoss: 0.6276 (0.6415) KLoss: 5.5682 (5.4526) Accu: 59.38
Epoch: [9][200/695] Time: 0.273 (0.275) lr:0.100 Loss: 3.9288 (3.4246) iLoss: 3.2923 (2.7851) TLoss: 0.6365 (0.6395) KLoss: 6.0866 (5.5165) Accu: 59.68
Epoch: [9][250/695] Time: 0.271 (0.275) lr:0.100 Loss: 2.9349 (3.4116) iLoss: 2.2829 (2.7739) TLoss: 0.6520 (0.6377) KLoss: 5.3323 (5.5485) Accu: 59.81
Epoch: [9][300/695] Time: 0.273 (0.275) lr:0.100 Loss: 3.1694 (3.3783) iLoss: 2.5291 (2.7414) TLoss: 0.6403 (0.6369) KLoss: 6.2431 (5.6114) Accu: 60.21
Epoch: [9][350/695] Time: 0.272 (0.274) lr:0.100 Loss: 3.1968 (3.3368) iLoss: 2.5118 (2.7021) TLoss: 0.6850 (0.6347) KLoss: 6.4518 (5.6803) Accu: 60.68
Epoch: [9][400/695] Time: 0.272 (0.274) lr:0.100 Loss: 3.2538 (3.3003) iLoss: 2.5923 (2.6681) TLoss: 0.6615 (0.6322) KLoss: 6.4434 (5.7386) Accu: 61.05
Epoch: [9][450/695] Time: 0.272 (0.274) lr:0.100 Loss: 2.1155 (3.2642) iLoss: 1.4993 (2.6341) TLoss: 0.6162 (0.6301) KLoss: 6.0223 (5.7934) Accu: 61.39
Epoch: [9][500/695] Time: 0.273 (0.274) lr:0.100 Loss: 3.0302 (3.2474) iLoss: 2.4420 (2.6189) TLoss: 0.5882 (0.6285) KLoss: 6.1151 (5.8352) Accu: 61.58
Epoch: [9][550/695] Time: 0.276 (0.274) lr:0.100 Loss: 2.0192 (3.2102) iLoss: 1.4496 (2.5842) TLoss: 0.5697 (0.6260) KLoss: 6.1015 (5.8877) Accu: 62.01
Epoch: [9][600/695] Time: 0.272 (0.274) lr:0.100 Loss: 2.4337 (3.1895) iLoss: 1.8641 (2.5652) TLoss: 0.5696 (0.6243) KLoss: 6.1188 (5.9480) Accu: 62.24
Epoch: [9][650/695] Time: 0.272 (0.274) lr:0.100 Loss: 2.9384 (3.1669) iLoss: 2.3256 (2.5440) TLoss: 0.6127 (0.6229) KLoss: 6.9302 (5.9854) Accu: 62.50
==> Preparing Data Loader...
10
[21001 20988 20987 ... 22193 22196 22183]
[10655 10658 10657 ... 11833 11839 11837]
Epoch: [10][0/695] Time: 0.702 (0.702) lr:0.100 Loss: 2.7923 (2.7923) iLoss: 2.2307 (2.2307) TLoss: 0.5615 (0.5615) KLoss: 6.4560 (6.4560) Accu: 64.58
Epoch: [10][50/695] Time: 0.272 (0.282) lr:0.100 Loss: 3.3721 (2.7971) iLoss: 2.7299 (2.2005) TLoss: 0.6422 (0.5966) KLoss: 6.8163 (6.7191) Accu: 66.97
Epoch: [10][100/695] Time: 0.272 (0.277) lr:0.100 Loss: 2.7034 (2.7808) iLoss: 2.1790 (2.1863) TLoss: 0.5244 (0.5945) KLoss: 6.1185 (6.6961) Accu: 67.37
Epoch: [10][150/695] Time: 0.274 (0.276) lr:0.100 Loss: 2.6545 (2.7674) iLoss: 2.0719 (2.1738) TLoss: 0.5826 (0.5936) KLoss: 6.6824 (6.6773) Accu: 67.16
Epoch: [10][200/695] Time: 0.273 (0.275) lr:0.100 Loss: 2.6527 (2.7266) iLoss: 2.0979 (2.1376) TLoss: 0.5548 (0.5889) KLoss: 6.5861 (6.7291) Accu: 67.72
Epoch: [10][250/695] Time: 0.273 (0.275) lr:0.100 Loss: 2.1370 (2.7083) iLoss: 1.6269 (2.1193) TLoss: 0.5101 (0.5890) KLoss: 6.2336 (6.7819) Accu: 68.05
Epoch: [10][300/695] Time: 0.274 (0.275) lr:0.100 Loss: 2.9291 (2.6769) iLoss: 2.3844 (2.0909) TLoss: 0.5447 (0.5860) KLoss: 6.7695 (6.8256) Accu: 68.38
Epoch: [10][350/695] Time: 0.273 (0.274) lr:0.100 Loss: 2.0951 (2.6431) iLoss: 1.5120 (2.0592) TLoss: 0.5831 (0.5840) KLoss: 7.3683 (6.8784) Accu: 68.75
Epoch: [10][400/695] Time: 0.274 (0.274) lr:0.100 Loss: 2.2369 (2.6184) iLoss: 1.6413 (2.0366) TLoss: 0.5956 (0.5818) KLoss: 7.2489 (6.9032) Accu: 68.97
Epoch: [10][450/695] Time: 0.273 (0.274) lr:0.100 Loss: 2.5862 (2.5974) iLoss: 2.0108 (2.0186) TLoss: 0.5754 (0.5788) KLoss: 7.3194 (6.9532) Accu: 69.16
Epoch: [10][500/695] Time: 0.272 (0.274) lr:0.100 Loss: 2.1384 (2.5806) iLoss: 1.6435 (2.0031) TLoss: 0.4949 (0.5775) KLoss: 6.8980 (6.9729) Accu: 69.40
Epoch: [10][550/695] Time: 0.273 (0.274) lr:0.100 Loss: 1.9845 (2.5613) iLoss: 1.4426 (1.9861) TLoss: 0.5419 (0.5753) KLoss: 7.8287 (7.0213) Accu: 69.57
Epoch: [10][600/695] Time: 0.274 (0.274) lr:0.100 Loss: 2.3351 (2.5333) iLoss: 1.7581 (1.9603) TLoss: 0.5771 (0.5731) KLoss: 7.4241 (7.0556) Accu: 69.93
Epoch: [10][650/695] Time: 0.273 (0.274) lr:0.100 Loss: 2.3799 (2.5136) iLoss: 1.8113 (1.9418) TLoss: 0.5686 (0.5718) KLoss: 7.2985 (7.0955) Accu: 70.18
Test Epoch: 10
Extracting Gallery Feature...
Extracting Time: 0.774
Extracting Query Feature...
Extracting Time: 4.337
Evaluation Time: 4.122
POOL: Rank-1: 5.15% | Rank-5: 18.22% | Rank-10: 31.03%| Rank-20: 51.04%| mAP: 8.47%| mINP: 4.90%
FC: Rank-1: 3.76% | Rank-5: 14.36% | Rank-10: 25.45%| Rank-20: 41.65%| mAP: 6.22%| mINP: 3.08%
Best Epoch [4]
==> Preparing Data Loader...
11

@cq0907
Author

cq0907 commented Nov 12, 2022

> Hi, can you upload the training log for the first 10 epochs?

mAP curve:
[plots: reproduced run vs. reported]

rank1 curve:
[plots: reproduced run vs. reported]

mINP curve:
[plots: reproduced run vs. reported]

@zesenwu23
Contributor

You can try changing line 148 of ICCV21_CAJ/loss.py
from `diff_pow = torch.clamp_max(diff_pow, max=88)`
to `diff_pow = torch.clamp_max(diff_pow, max=44)`.
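For anyone curious why the cap value matters: a minimal sketch of the numerical effect of the clamp, under the assumption that `diff_pow` is subsequently exponentiated (the function name `clamped_exp` and the toy values below are illustrative, not the repository's actual loss code). `exp(x)` overflows float32 once x exceeds roughly 88.7, so a cap of 88 sits right at the edge of representable values, while 44 leaves headroom for later sums and products.

```python
import torch

# Illustrative sketch, not ICCV21_CAJ's loss: clamp the exponent before
# torch.exp so that large distances cannot overflow float32.
def clamped_exp(diff_pow: torch.Tensor, cap: float = 44.0) -> torch.Tensor:
    diff_pow = torch.clamp_max(diff_pow, max=cap)
    return torch.exp(diff_pow)

x = torch.tensor([10.0, 50.0, 100.0])
print(clamped_exp(x, cap=88.0))   # exp(88) is finite but near the float32 max (~3.4e38)
print(clamped_exp(x, cap=44.0))
```

Note that entries hitting the cap get zero gradient through the clamp, which may be related to the triplet-loss term sticking at 88.0000 in the logs above.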

@cq0907
Author

cq0907 commented Nov 12, 2022

OK, thank you. I'll try again.

@cq0907
Author

cq0907 commented Nov 12, 2022

Let me ask you one more question. What's the difference between the following two? I'm a beginner. Please forgive me.
[screenshot]

@zesenwu23
Contributor

> Let me ask you one more question. What's the difference between the following two? I'm a beginner. Please forgive me. [screenshot]

You may refer to this issue: "FC or Pool feature?"
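For context, the POOL and FC numbers in the logs typically come from two different test features in a BNNeck-style head, which is the common pattern in re-ID codebases. A hedged sketch of that design (class name, dims, and the class count are illustrative, not code from this repository): the pooled feature is the backbone output after global average pooling, and the FC feature is the same vector after the batch-norm bottleneck that feeds the ID classifier.

```python
import torch
import torch.nn as nn

# Illustrative BNNeck head: "POOL" metrics use `pooled`, "FC" metrics
# use `feat_bn`, and `logits` drives the ID loss during training.
class BNNeckHead(nn.Module):
    def __init__(self, dim: int = 2048, num_classes: int = 395):
        super().__init__()
        self.bottleneck = nn.BatchNorm1d(dim)
        self.classifier = nn.Linear(dim, num_classes, bias=False)

    def forward(self, pooled: torch.Tensor):
        # pooled: (B, dim), the backbone feature after global average pooling
        feat_bn = self.bottleneck(pooled)     # the "FC" test feature
        logits = self.classifier(feat_bn)
        return pooled, feat_bn, logits
```

At test time either feature can be L2-normalized and used for ranking; which one performs better varies by dataset, which is why both are logged.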

@cq0907
Author

cq0907 commented Nov 12, 2022

> Let me ask you one more question. What's the difference between the following two? I'm a beginner. Please forgive me. [screenshot]
>
> You may refer to this issue: "FC or Pool feature?"

OK, thank you!

@xiaoye-hhh

I changed the code

            # diff_pow = torch.clamp_max(diff_pow, max=88)
            diff_pow = torch.clamp_max(diff_pow, max=44)

but the result is still low. What should I do?
[screenshot of the results]

@cleddy

cleddy commented Dec 14, 2022

The same thing happened to me: the results reach only 20%+ mAP at my best epoch. I found that the triplet loss drops dramatically in epoch 5, just like yours.
[screenshot of the training log]
I don't know whether this is right.

@intlabSeJun

I found the reason: the enhanced triplet loss. If you switch the triplet loss to 'agw' mode it works well, but the best reported result still won't be reproduced.

@intlabSeJun

intlabSeJun commented Jan 11, 2023

[screenshot of the code change]
Like this: use the AGW triplet loss, but keep args.method='adp'.
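For reference, the AGW triplet loss mentioned above is the weighted-regularized triplet from "Deep Learning for Person Re-Identification: A Survey and Outlook". A hedged sketch of that formulation (function name, mask construction, and the toy usage are illustrative, not this repository's code): positives and negatives are softmax-weighted by hardness rather than mined with a hard margin, and the weighted gap is passed through softplus.

```python
import torch
import torch.nn.functional as F

# Illustrative AGW-style weighted-regularized triplet loss.
def weighted_regularized_triplet(dist, is_pos, is_neg):
    # dist: (N, N) pairwise distance matrix; is_pos/is_neg: boolean masks.
    # Softmax-weight positives by distance (harder = heavier) and
    # negatives by negated distance, then penalize the weighted gap.
    w_ap = torch.softmax(dist.masked_fill(~is_pos, float("-inf")), dim=1)
    w_an = torch.softmax((-dist).masked_fill(~is_neg, float("-inf")), dim=1)
    dist_ap = (w_ap * dist).sum(dim=1)
    dist_an = (w_an * dist).sum(dim=1)
    return F.softplus(dist_ap - dist_an).mean()

# Toy usage: 4 embeddings, 2 identities (each anchor has a positive).
emb = torch.randn(4, 16)
labels = torch.tensor([0, 0, 1, 1])
dist = torch.cdist(emb, emb)
same = labels.unsqueeze(0) == labels.unsqueeze(1)
is_pos = same & ~torch.eye(4, dtype=torch.bool)
is_neg = ~same
loss = weighted_regularized_triplet(dist, is_pos, is_neg)
```

Because every pair contributes with a soft weight, this loss is less prone to the sudden collapse visible in epoch 5 of the logs than a clamped hard-mining variant.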

@ling275

ling275 commented Sep 15, 2023

I also have this problem. I made the changes according to the recommendations, but the results still fall short. Have you solved this problem?
