Add SemanticSamTrainer #637
Conversation
Looks good to me, I couldn't spot any issues.
Thanks @constantinpape. I tested this on a 2d dataset, and it looks like it's doing the job as expected. This is GTG from my side now (only a few minor discussions pending in the evaluation PR).
We should revisit this with multi-class segmentation in mind, and potentially prefer the low-res masks.
Hi @constantinpape, looks like the multi-class semantic segmentation works as expected now (at least from a first look at the TensorBoard logs). I am not a big fan of the workarounds I had to apply to make this work, but maybe we can find a better way to do this in a more modular setup. Let me know if you spot something. We can discuss the details together tomorrow anyway. ADDITION: I added support for an additional loss function (cross entropy) over the logits (between the …)
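
For reference, here is a minimal sketch of what a combined Dice + cross-entropy loss over the logits could look like for multi-class semantic segmentation. This is an illustration under assumptions, not the PR's actual implementation; the class name, `num_classes`, and the loss weighting are hypothetical.

```python
import torch
import torch.nn as nn

class CombinedSemanticLoss(nn.Module):
    """Hypothetical sketch: Dice loss plus cross entropy over raw logits."""

    def __init__(self, num_classes: int, ce_weight: float = 1.0):
        super().__init__()
        self.num_classes = num_classes
        self.ce_weight = ce_weight  # illustrative weighting, an assumption
        self.ce = nn.CrossEntropyLoss()

    def dice_loss(self, probs, one_hot_target, eps=1e-7):
        # probs, one_hot_target: (B, C, H, W)
        dims = (0, 2, 3)
        intersection = (probs * one_hot_target).sum(dims)
        denom = probs.sum(dims) + one_hot_target.sum(dims)
        dice = (2 * intersection + eps) / (denom + eps)
        return 1.0 - dice.mean()

    def forward(self, logits, target):
        # logits: (B, C, H, W); target: (B, H, W) with integer class labels
        ce = self.ce(logits, target)
        probs = torch.softmax(logits, dim=1)
        one_hot = nn.functional.one_hot(
            target, self.num_classes
        ).permute(0, 3, 1, 2).float()
        return self.dice_loss(probs, one_hot) + self.ce_weight * ce
```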
Looks good, but let's remove the GT downsampling. (I think we don't need it anymore.)
I've removed the downscaling of masks. Should be GTG now. Thanks! PS: Tested it on a quick training as well.
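
As a side note on the resolution handling discussed above: instead of downscaling the ground-truth masks, the low-res logits (e.g. SAM's 256×256 mask outputs) can be upsampled to the label resolution before computing the loss. A small illustration, with assumed tensor shapes:

```python
import torch
import torch.nn.functional as F

# Assumed shapes, for illustration only.
low_res_logits = torch.randn(1, 3, 256, 256)   # (B, C, 256, 256)
labels = torch.randint(0, 3, (1, 1024, 1024))  # (B, H, W)

# Upsample the predictions rather than downsampling the ground truth,
# so the loss is computed at full label resolution.
upsampled = F.interpolate(
    low_res_logits, size=labels.shape[-2:],
    mode="bilinear", align_corners=False,
)  # (B, C, 1024, 1024)
```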
@constantinpape Here is the trainer for semantic segmentation using SAM. Let me know if this aligns with what we discussed.
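
For readers following along, a rough sketch of the shape of such a training step. The `model`, `loss_fn`, and batch layout here are assumptions for illustration, not the trainer implemented in this PR:

```python
import torch

def train_step(model, loss_fn, optimizer, images, labels, device="cuda"):
    """Hypothetical single training step for semantic segmentation with SAM."""
    model.train()
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    # The model is assumed to return per-class logits at image resolution.
    logits = model(images)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```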