Missing references #4

Open
eamid opened this issue May 30, 2021 · 0 comments

I would like to point out that our work (Amid et al. 2019a) extends the Generalized CE (GCE) loss (Zhang and Sabuncu 2018) by introducing two temperatures, t1 and t2, and recovers GCE when t1 = q and t2 = 1. Our more recent work, the bi-tempered loss (Amid et al. 2019b), extends these methods by introducing a proper (unbiased) generalization of the CE loss and has been shown to be extremely effective at reducing the effect of noisy examples. Please consider adding these two papers to your list.

Google AI blog post:
https://ai.googleblog.com/2019/08/bi-tempered-logistic-loss-for-training.html
Code:
https://github.com/google/bi-tempered-loss
Demo:
https://google.github.io/bi-tempered-loss/
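In case it helps anyone reading this thread, here is a minimal NumPy sketch of the loss. This is my own simplification for illustration (the function names, the fixed-point iteration count, and the epsilon guards are choices I made here, not the reference API); please use the official implementation in the repo above for anything serious. It shows the tempered log/exp pair, the heavy-tailed tempered softmax for t2 > 1, and the bi-tempered loss for a single one-hot example.

```python
import numpy as np

def log_t(u, t):
    """Tempered logarithm; reduces to log(u) as t -> 1."""
    if t == 1.0:
        return np.log(u)
    return (u ** (1.0 - t) - 1.0) / (1.0 - t)

def exp_t(u, t):
    """Tempered exponential; reduces to exp(u) as t -> 1."""
    if t == 1.0:
        return np.exp(u)
    return np.maximum(1.0 + (1.0 - t) * u, 0.0) ** (1.0 / (1.0 - t))

def tempered_softmax(activations, t, n_iters=20):
    """Heavy-tailed softmax for t >= 1: p_i = exp_t(a_i - lambda).
    The normalization is found by the fixed-point iteration from the
    paper (converges for t > 1; t = 1 gives the ordinary softmax)."""
    mu = np.max(activations)
    a = activations - mu
    for _ in range(n_iters):
        z = np.sum(exp_t(a, t))
        a = (activations - mu) * z ** (1.0 - t)
    return exp_t(a, t) / np.sum(exp_t(a, t))

def bi_tempered_loss(activations, labels, t1, t2, eps=1e-10):
    """Bi-tempered logistic loss for one one-hot example:
    sum_i y_i (log_t1 y_i - log_t1 p_i) - (y_i^(2-t1) - p_i^(2-t1)) / (2 - t1),
    typically with 0 <= t1 < 1 (boundedness) and 1 < t2 < 2 (heavy tails)."""
    probs = tempered_softmax(activations, t2)
    return np.sum(
        labels * (log_t(labels + eps, t1) - log_t(probs + eps, t1))
        - (labels ** (2.0 - t1) - probs ** (2.0 - t1)) / (2.0 - t1)
    )

# Example values; t1 = t2 = 1 recovers softmax cross entropy exactly.
logits = np.array([2.0, 0.5, -1.0])
one_hot = np.array([1.0, 0.0, 0.0])
print(bi_tempered_loss(logits, one_hot, t1=0.8, t2=1.2))
```

A quick sanity check on the sketch: with t1 = t2 = 1 the tempered log/exp collapse to the ordinary log/exp, the second term in the loss telescopes to zero for one-hot labels, and you get back the standard softmax cross entropy.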

(Amid et al. 2019a) Amid et al. "Two-temperature logistic regression based on the Tsallis divergence." In The 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

(Amid et al. 2019b) Amid et al. "Robust bi-tempered logistic loss based on Bregman divergences." In Advances in Neural Information Processing Systems (NeurIPS), 2019.

(Zhang and Sabuncu 2018) Zhang and Sabuncu. "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels." In Advances in Neural Information Processing Systems (NeurIPS), 2018.
