
Thanks for your great job. Is there a PyTorch version of the code? #1

Open
Jessejx opened this issue Oct 6, 2021 · 7 comments


Jessejx commented Oct 6, 2021

THANKS for your great job, Is there a pytorch version of the code?

@Guo-Xiaoqing (Owner)

Sorry, there is no PyTorch version.


Jessejx commented Oct 8, 2021

In the mutual loss, can Kullback–Leibler divergence be applied to compare two images directly?

@Guo-Xiaoqing (Owner)

Kullback–Leibler divergence is a measure of how one probability distribution differs from a second, reference probability distribution. Hence, it cannot be applied directly to two input images.
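
For illustration, a minimal sketch (not from this repository) of this point: the per-pixel scores are first softmax-normalized over the class dimension so that each pixel carries a probability distribution, and only then is KL divergence computed. All tensor names and shapes here are hypothetical.

```python
import torch
import torch.nn.functional as F

# Hypothetical per-pixel logits from two predictions, shape (B, C, H, W).
logits_a = torch.randn(1, 2, 64, 64)
logits_b = torch.randn(1, 2, 64, 64)

# Normalize over the class dimension so each pixel is a distribution.
log_p_a = F.log_softmax(logits_a, dim=1)  # log-probabilities over classes
p_b = F.softmax(logits_b, dim=1)          # probabilities over classes

# F.kl_div expects log-probabilities as `input` and probabilities as
# `target`, yielding KL(p_b || p_a) here.
loss = F.kl_div(log_p_a, p_b, reduction='batchmean')
```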


Jessejx commented Oct 8, 2021

I see kl_loss in the code, so I think you are using Kullback–Leibler divergence as the measure.


Guo-Xiaoqing commented Oct 8, 2021 via email


Jessejx commented Oct 8, 2021

Thanks. You mean I should convert the images to likelihood maps first?


Guo-Xiaoqing commented Oct 8, 2021 via email
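
The email reply is not rendered above, but a hedged sketch of the idea discussed in this thread is a symmetric ("mutual") KL loss between the softmaxed likelihood maps of two predictions. The function name and shapes below are illustrative, not taken from this repository's mutual loss.

```python
import torch
import torch.nn.functional as F

def mutual_kl_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL between per-pixel class distributions of two predictions.

    Each prediction is converted to a likelihood map via softmax over the
    class dimension before the two KL terms are computed and summed.
    """
    kl_ab = F.kl_div(F.log_softmax(logits_a, dim=1),
                     F.softmax(logits_b, dim=1), reduction='batchmean')
    kl_ba = F.kl_div(F.log_softmax(logits_b, dim=1),
                     F.softmax(logits_a, dim=1), reduction='batchmean')
    return kl_ab + kl_ba
```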
