A PyTorch implementation of distance and divergence functions for Label Distribution Learning (LDL). Minimal illustrative sketches of each function follow the list.
- JS-divergence (Jensen–Shannon) $$ D_{JS}(p\|q)=\frac{1}{2}D_{KL}(p\|m)+\frac{1}{2}D_{KL}(q\|m),\quad m=\frac{p+q}{2},\quad D_{KL}(p\|q)=\sum_{i=1}^{n}p(x_i)\log\frac{p(x_i)}{q(x_i)} $$
- Wasserstein Distance (adapted as Earth Mover's Distance) $$ Loss=-\frac{1}{N}\sum_{j=1,\,z_j\sim P_z}^{N} f_{\theta}(g_{\omega}(z_j)),\qquad \omega^{*}=\arg\min_{\omega}(Loss) $$
- Chebyshev $$ Dis(D, \widehat{D})=\max_{i}\left|d_{i}-\widehat{d}_{i}\right| $$
- Clark $$ Dis(D, \widehat{D})=\sqrt{\sum_{i=1}^{c} \frac{\left(d_{i}-\widehat{d}_{i}\right)^{2}}{\left(d_{i}+\widehat{d}_{i}\right)^{2}}} $$
- Canberra $$ Dis(D, \widehat{D})=\sum_{i=1}^{c} \frac{\left|d_{i}-\widehat{d}_{i}\right|}{d_{i}+\widehat{d}_{i}} $$
- Intersection $$ Sim(D, \widehat{D})=\sum_{i=1}^{c} \min\left(d_{i}, \widehat{d}_{i}\right) $$ (a similarity measure: it equals 1 when the two distributions coincide)
- MSE of Entropy: calculates the MSE between the respective entropies of the two distributions, where $H(D)=-\sum_{i=1}^{c} d_{i}\log d_{i}$
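Below are minimal PyTorch sketches of these functions. First, the JS divergence; the function names and the clamping epsilon are illustrative choices, not necessarily what this repo uses. Inputs are tensors whose last axis holds a label distribution (batched or not).

```python
import torch

def kl_divergence(p: torch.Tensor, q: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i), reduced over the label axis.
    p = p.clamp(min=eps)
    q = q.clamp(min=eps)
    return (p * (p / q).log()).sum(dim=-1)

def js_divergence(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # D_JS(p || q) = 0.5 * D_KL(p || m) + 0.5 * D_KL(q || m), with m = (p + q) / 2.
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```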
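For label distributions over ordered classes, the Earth Mover's Distance is commonly computed as the mean absolute difference between the two cumulative distributions; the sketch below assumes that formulation rather than the critic-based estimate in the equation above, where $f_\theta$ and $g_\omega$ would be learned networks.

```python
import torch

def emd_loss(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # Assumed CDF formulation: mean_i |CDF_p(i) - CDF_q(i)| over ordered labels.
    cdf_p = torch.cumsum(p, dim=-1)
    cdf_q = torch.cumsum(q, dim=-1)
    return (cdf_p - cdf_q).abs().mean(dim=-1)
```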
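The four measures above (Chebyshev, Clark, Canberra, Intersection) translate directly; the small `EPS` guarding the divisions is an illustrative assumption. Each function reduces only over the label axis, so the result can be averaged over a batch or reported per sample.

```python
import torch

EPS = 1e-12  # guards the divisions when d_i + d_hat_i is zero

def chebyshev(d: torch.Tensor, d_hat: torch.Tensor) -> torch.Tensor:
    # max_i |d_i - d_hat_i|
    return (d - d_hat).abs().max(dim=-1).values

def clark(d: torch.Tensor, d_hat: torch.Tensor) -> torch.Tensor:
    # sqrt( sum_i (d_i - d_hat_i)^2 / (d_i + d_hat_i)^2 )
    return (((d - d_hat) ** 2) / (d + d_hat).clamp(min=EPS) ** 2).sum(dim=-1).sqrt()

def canberra(d: torch.Tensor, d_hat: torch.Tensor) -> torch.Tensor:
    # sum_i |d_i - d_hat_i| / (d_i + d_hat_i)
    return ((d - d_hat).abs() / (d + d_hat).clamp(min=EPS)).sum(dim=-1)

def intersection(d: torch.Tensor, d_hat: torch.Tensor) -> torch.Tensor:
    # sum_i min(d_i, d_hat_i); equals 1 when the distributions coincide
    return torch.minimum(d, d_hat).sum(dim=-1)
```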
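Finally, a sketch of the MSE-of-entropy loss; the batch-mean reduction is an assumption about how per-sample values are aggregated.

```python
import torch

def entropy(p: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    # H(p) = -sum_i p_i * log(p_i), reduced over the label axis
    p = p.clamp(min=eps)
    return -(p * p.log()).sum(dim=-1)

def entropy_mse(d: torch.Tensor, d_hat: torch.Tensor) -> torch.Tensor:
    # mean squared error between the per-sample entropies of the two distributions
    return torch.mean((entropy(d) - entropy(d_hat)) ** 2)
```

For example, `entropy_mse(torch.softmax(logits, dim=-1), target_dist)` yields a scalar suitable for backpropagation.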