
validation set and loss in tensorboard graph #250

Open
joaoalves10 opened this issue Apr 20, 2022 · 4 comments


joaoalves10 commented Apr 20, 2022

Hi @AntonMu, I would like to ask where I can find the graph for the validation set. When I use TensorBoard, only the training loss appears.

Another question is about the loss graph in TensorBoard: can you explain why the loss declines so steeply between the first set of training epochs and the second?

I have one more question: why do you freeze some layers in the first stage of training and then unfreeze them all in the last stage? And how many layers do you freeze initially?

[Screenshot: TensorBoard training loss graph]
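
For reference, in a plain Keras setup I would expect a separate validation curve to show up in TensorBoard once validation data is passed to model.fit together with the TensorBoard callback. A minimal generic sketch (assumed names and a placeholder loss, not this repo's Train_YOLO.py code):

```python
import tensorflow as tf

def train_with_validation(model, x_train, y_train, x_val, y_val, log_dir="logs/fit"):
    # TensorBoard callback writes scalar summaries for each epoch.
    tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir=log_dir)

    model.compile(optimizer="adam", loss="mse")  # loss is a placeholder here
    model.fit(
        x_train, y_train,
        validation_data=(x_val, y_val),  # without this, only the training loss is logged
        epochs=10,
        callbacks=[tensorboard_cb],
    )
    # Afterwards: `tensorboard --logdir logs/fit` shows both the training and
    # the validation loss curves.
```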


AntonMu (Owner) commented Apr 24, 2022

Everything you are asking about is shown in the Colab: https://github.com/AntonMu/TrainYourOwnYOLO#google-colab-tutorial-

Hope this helps!

joaoalves10 (Author) commented

It helps a bit. However, I still cannot find the answer to why the loss declines so steeply between the first set of training epochs and the second.
And why do you freeze some layers in the first stage of training and then unfreeze them all in the last stage? How many layers do you freeze initially?


AntonMu (Owner) commented Apr 25, 2022

If you look at the log messages under !python Train_YOLO.py, you can see the exact number of frozen layers in each step.

Of course, if you freeze layers, only a small part of the model is being trained, so the loss stays higher. When you unfreeze all layers, many more parameters can be adjusted, so the loss drops further, which is why you see the steep decline at the start of the second training stage.
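
To make this concrete, here is a rough Keras-style sketch of the freeze-then-unfreeze idea. It is only an illustration with a hypothetical helper and a placeholder loss, not the exact Train_YOLO.py implementation:

```python
import tensorflow as tf

def two_stage_train(model, train_data, val_data, num_trainable_tail=3, log_dir="logs"):
    """Illustrative freeze-then-unfreeze schedule (hypothetical helper)."""
    tb = tf.keras.callbacks.TensorBoard(log_dir=log_dir)

    # Stage 1: freeze everything except the last few layers, so only a small
    # part of the model is trained and the loss plateaus at a higher level.
    for layer in model.layers[:-num_trainable_tail]:
        layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")  # placeholder loss
    model.fit(train_data, validation_data=val_data, epochs=25, callbacks=[tb])

    # Stage 2: unfreeze all layers and recompile with a lower learning rate.
    # Many more parameters can now adapt, which is why the loss typically
    # drops sharply at the start of this stage.
    for layer in model.layers:
        layer.trainable = True
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
    model.fit(train_data, validation_data=val_data,
              initial_epoch=25, epochs=50, callbacks=[tb])
```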


joaoalves10 (Author) commented Apr 25, 2022 via email
