
Label Distribution Learning

A PyTorch implementation of loss functions and evaluation metrics for Label Distribution Learning.

Loss Functions

  • JS divergence (defined via the KL divergence) $$ JS(p\|q)=\frac{1}{2}D_{KL}(p\|m)+\frac{1}{2}D_{KL}(q\|m),\quad m=\frac{p+q}{2},\quad D_{KL}(p\|q)=\sum\limits_{i=1}^{n}p(x_i)\log\frac{p(x_i)}{q(x_i)} $$

  • Wasserstein distance (adapted as the Earth Mover's Distance, in the WGAN-style form) $$ Loss=-\frac{1}{N}\sum_{j=1,\, z_j\sim P_z}^{N} f_\theta(g_\omega(z_j)) \\ \omega^*=\arg\min_\omega(Loss) $$ (both losses are sketched in code after this list)
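
A minimal PyTorch sketch of both losses, assuming predicted and target label distributions of shape `(batch, num_labels)` with rows summing to 1. The function names (`js_divergence`, `emd_loss`) and the `eps` smoothing term are illustrative, not necessarily this repo's API; the EMD variant shown is the closed form for ordered 1-D label sets (mean absolute difference of the CDFs), not the critic formulation above.

```python
import torch

def js_divergence(p, q, eps=1e-12):
    # Jensen-Shannon divergence between rows of p and q,
    # each a label distribution of shape (batch, num_labels).
    m = 0.5 * (p + q)
    kl_pm = torch.sum(p * (torch.log(p + eps) - torch.log(m + eps)), dim=-1)
    kl_qm = torch.sum(q * (torch.log(q + eps) - torch.log(m + eps)), dim=-1)
    return 0.5 * (kl_pm + kl_qm)  # shape: (batch,)

def emd_loss(p, q):
    # 1-D Earth Mover's Distance for ordered label sets:
    # mean absolute difference between the two CDFs.
    cdf_p = torch.cumsum(p, dim=-1)
    cdf_q = torch.cumsum(q, dim=-1)
    return torch.mean(torch.abs(cdf_p - cdf_q), dim=-1)
```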

Evaluation Metrics

  • Chebyshev

    $$ Dis(D, \widehat{D})=\max_{i}\left|d_{i}-\widehat{d}_{i}\right| $$

  • Clark $$ Dis(D, \widehat{D})=\sqrt{\sum_{i=1}^{c} \frac{\left(d_{i}-\widehat{d}_{i}\right)^{2}}{\left(d_{i}+\widehat{d}_{i}\right)^{2}}} $$

  • Canberra $$ Dis(D, \widehat{D})=\sum_{i=1}^{c} \frac{\left|d_{i}-\widehat{d}_{i}\right|}{d_{i}+\widehat{d}_{i}} $$

  • Intersection

$$ Sim_{1}(D, \widehat{D})=\sum_{i=1}^{c} \min \left(d_{i}, \widehat{d}_{i}\right) $$
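
A minimal PyTorch sketch of the four metrics above, assuming `d` and `d_hat` are label distributions of shape `(batch, num_labels)`; the `eps` guard against division by zero and the function names are illustrative assumptions, not this repo's exact API.

```python
import torch

def chebyshev(d, d_hat):
    # max_i |d_i - d_hat_i| per row; lower is better
    return torch.max(torch.abs(d - d_hat), dim=-1).values

def clark(d, d_hat, eps=1e-12):
    return torch.sqrt(torch.sum((d - d_hat) ** 2
                                / ((d + d_hat) ** 2 + eps), dim=-1))

def canberra(d, d_hat, eps=1e-12):
    return torch.sum(torch.abs(d - d_hat) / (d + d_hat + eps), dim=-1)

def intersection(d, d_hat):
    # similarity in [0, 1]; higher is better
    return torch.sum(torch.minimum(d, d_hat), dim=-1)
```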

Custom Loss Function

  • MSE of Entropy: computes the mean squared error between the respective entropies of the two distribution vectors (see the sketch below).
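
A minimal sketch of this idea, again with illustrative function names and an assumed `eps` term for numerical stability:

```python
import torch

def entropy(p, eps=1e-12):
    # Shannon entropy of each row of p, shape (batch, num_labels)
    return -torch.sum(p * torch.log(p + eps), dim=-1)

def entropy_mse(p, q):
    # MSE between the entropies of the two distributions,
    # averaged over the batch
    return torch.mean((entropy(p) - entropy(q)) ** 2)
```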
