[Hackathon No.9] Add a Laplace API to Paddle #190
Conversation
Quoted code under review:

```python
self.loc - self.scale * (value - 0.5).sign() * paddle.log1p(-2 * (value - 0.5).abs())
```
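The quoted expression is the standard inverse CDF (quantile function) of a Laplace distribution. A minimal pure-Python sketch of the same computation (the function name `laplace_icdf` is mine, not from the PR):

```python
import math

def laplace_icdf(loc, scale, value):
    """Inverse CDF of Laplace(loc, scale) at probability `value` in (0, 1).

    Mirrors the quoted Paddle expression:
        loc - scale * sign(value - 0.5) * log1p(-2 * |value - 0.5|)
    """
    a = value - 0.5
    # copysign(1, 0.0) is +1, but the log1p term is 0 there, so the
    # median still maps back to loc exactly.
    return loc - scale * math.copysign(1.0, a) * math.log1p(-2.0 * abs(a))

# Quantiles are symmetric around the median `loc`:
print(laplace_icdf(0.0, 1.0, 0.75))  # log(2), the upper quartile
print(laplace_icdf(0.0, 1.0, 0.25))  # -log(2), the lower quartile
```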
- `kl_divergence`: the KL divergence between two Laplace distributions (`other` is an instance of the Laplace class):
The KL computation logic is implemented centrally in https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/distribution/kl.py; no extra implementation is needed in the Laplace class.
I looked at kl.py: the Laplace distribution does indeed need to be registered there, but the KL computation itself still has to be implemented by hand, right? Some distributions (Beta, Dirichlet, ExponentialFamily) implement the KL directly in kl.py, while others implement it in the original API and have kl.py call into it (Categorical, Uniform, Normal). The updated design doc therefore keeps the KL computation in the original class and adds the registration in kl.py, so `paddle.distribution.kl_divergence` can be used to compute the KL divergence between two Laplace distributions.
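For reference, the KL divergence between two Laplace distributions has a closed form, so the per-class implementation described above can be exact. A minimal pure-Python sketch (the registration snippet in the comment follows kl.py's `register_kl` pattern; the helper name is mine):

```python
import math

def kl_laplace_laplace(loc1, scale1, loc2, scale2):
    """Closed-form KL(Laplace(loc1, scale1) || Laplace(loc2, scale2)):

        log(s2/s1) + |m1 - m2| / s2 + (s1/s2) * exp(-|m1 - m2| / s1) - 1
    """
    d = abs(loc1 - loc2)
    return (math.log(scale2 / scale1)
            + d / scale2
            + (scale1 / scale2) * math.exp(-d / scale1)
            - 1.0)

# In Paddle this would be hooked up in kl.py roughly as:
#
#   @register_kl(Laplace, Laplace)
#   def _kl_laplace_laplace(p, q):
#       return p.kl_divergence(q)
#
# so that paddle.distribution.kl_divergence(p, q) dispatches to the
# method implemented on the Laplace class.

# KL of a distribution with itself is zero, and KL is non-negative:
print(kl_laplace_laplace(0.0, 1.0, 0.0, 1.0))
print(kl_laplace_laplace(0.0, 1.0, 0.0, 2.0))
```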
LGTM