
Question about conditional sampling and moco-k during training #5

Open
andsild opened this issue Dec 1, 2023 · 0 comments

andsild commented Dec 1, 2023

Hi,

Thanks for the code and paper :)

I'm trying to understand this line, where the model checks for conditional sampling:

if self.condition:
    # conditional ssl
    logits = torch.mm(q, k.T) / self.T
    labels = torch.arange(logits.shape[0], dtype=torch.long).cuda()
    return logits, labels

It seems to me that this branch skips updating self.queue, which I believe corresponds to the "dictionary" in the MoCo paper (Section 3.1).

Am I right? If so, does this mean you effectively train with args.moco-k equal to the batch size (m = 128)? And when you compared against a model without conditional sampling in your paper, did you also set moco-k to 128, or was it the default 65536?
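To make my reading concrete, here is a minimal numpy sketch of what I think the two branches compute. The shapes, the temperature value, and moco-k = 65536 are my assumptions based on the usual MoCo defaults, not values read from this repo:

```python
import numpy as np

# Assumed toy shapes: batch of 4, embedding dim 8, temperature 0.07.
rng = np.random.default_rng(0)
batch, dim, T = 4, 8, 0.07
q = rng.standard_normal((batch, dim))
k = rng.standard_normal((batch, dim))

# Conditional branch (the snippet above): logits = q @ k.T / T,
# so each query's only negatives are the other (batch - 1) keys in
# the same batch -- the queue never enters the computation.
logits = q @ k.T / T
labels = np.arange(batch)  # positive pair sits on the diagonal

# Standard MoCo instead concatenates one positive logit per query
# with negatives drawn from the queue (assumed moco-k = 65536 here):
queue = rng.standard_normal((dim, 65536))
l_pos = np.sum(q * k, axis=1, keepdims=True)  # (batch, 1)
l_neg = q @ queue                             # (batch, moco-k)
moco_logits = np.concatenate([l_pos, l_neg], axis=1) / T

print(logits.shape)       # (4, 4): negatives limited to the batch
print(moco_logits.shape)  # (4, 65537): queue negatives + 1 positive
```

If this sketch matches what the code does, then the effective number of negatives in the conditional branch is batch_size - 1 rather than moco-k, which is what motivates my question above.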

@andsild andsild closed this as completed May 6, 2024
@andsild andsild reopened this Dec 3, 2024