Implement Ordered distribution factory #7297

Open

ricardoV94 opened this issue May 2, 2024 · 6 comments
Labels: enhancements, hackathon

Comments

@ricardoV94 (Member) commented May 2, 2024

Description

For univariate IID components, adding transform=ordered is equivalent to sorting the raw draws (the forward pass). The logp is proportional to the density of the original draws plus the ordered-transform Jacobian; we would just need to figure out the normalization constant so that it defines a proper multivariate variable that integrates to 1.
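
As a concrete illustration of the forward pass with the existing API (a sketch; the variable names are illustrative):

```python
import numpy as np
import pymc as pm

# forward pass of the proposed factory: take iid draws from the
# base distribution and sort them along the last axis
raw = pm.draw(pm.Normal.dist(size=3), random_seed=42)
ordered_draw = np.sort(raw)
```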

With this, users would have a generative graph for ordered variables that they can also do prior_predictive sampling from.
Also, default initvals would work out of the box. Right now users always need to provide them.

This would also pretty much obviate the need for the default_transform and transform arguments that we added in #5674, and simplify the API.

The normalization constant is probably just size! (the factorial of the number of components): https://en.wikipedia.org/wiki/Order_statistic#The_joint_distribution_of_the_order_statistics_of_an_absolutely_continuous_distribution
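
A quick Monte Carlo sanity check of that constant (a sketch, not from the issue): under iid sampling every one of the n! orderings of the draws is equally likely, so the already-sorted region has probability 1/n!, and multiplying the iid density by n! renormalizes it:

```python
from math import factorial

import numpy as np

rng = np.random.default_rng(123)
n, n_samples = 3, 200_000
draws = rng.normal(size=(n_samples, n))
# fraction of iid draws that come out already sorted;
# should approach 1 / n! = 1/6 for n = 3
frac_sorted = np.mean(np.all(np.diff(draws, axis=1) >= 0, axis=1))
print(frac_sorted, 1 / factorial(n))
```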

The API could look something like:

```python
with pm.Model() as m:
    x = pm.Ordered("x", pm.Normal.dist(), shape=(3,))
```
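
Until such a factory exists, something close can be emulated with pm.CustomDist (a sketch of the idea above, not an existing PyMC feature; ordered_normal_random / ordered_normal_logp are hypothetical helper names, and MCMC sampling would still want an ordered transform on the value variable, since this logp is -inf outside the sorted region):

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt


def ordered_normal_random(mu, sigma, rng=None, size=None):
    # forward pass: sort iid normal draws along the last axis
    return np.sort(rng.normal(mu, sigma, size=size), axis=-1)


def ordered_normal_logp(value, mu, sigma):
    # joint density of the order statistics of n iid draws:
    # n! * prod_i f(value_i), restricted to the sorted region
    n = value.shape[-1]
    logp = pm.logp(pm.Normal.dist(mu, sigma), value).sum(axis=-1)
    is_sorted = pt.all(value[..., 1:] >= value[..., :-1], axis=-1)
    return pt.switch(is_sorted, logp + pt.gammaln(n + 1), -np.inf)


with pm.Model() as m:
    x = pm.CustomDist(
        "x",
        0.0,
        1.0,
        random=ordered_normal_random,
        logp=ordered_normal_logp,
        shape=(3,),
    )
```

Passing random and logp explicitly sidesteps the question of whether PyMC's logp inference can see through a sort; a real Ordered factory would presumably derive both automatically.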
@michaelosthege (Member) commented:

Just stressing (citing you) that

> Transforms have no role in forward sampling methods (prior/posterior predictive)

from #7040 (comment) because we just got startled by this (again).

@ricardoV94 (Member, Author) commented May 30, 2024

> Just stressing (citing you) that
>
> > Transforms have no role in forward sampling methods (prior/posterior predictive)
>
> from #7040 (comment) because we just got startled by this (again).

This is now emphasized in https://www.pymc.io/projects/docs/en/latest/api/distributions/transforms.html

With the distinction between transform and default_transform, we can also start emitting warnings in prior/posterior predictive sampling, like we do with Potentials, that non-default transforms will be ignored.

@AlexAndorra (Contributor) commented Jun 12, 2024

> Also, default initvals would work out of the box. Right now users always need to provide them.

What do you mean by that, @ricardoV94? I have definitely had cases where it worked without specifying initvals, IIRC.

@ricardoV94 (Member, Author) commented:

> > Also, default initvals would work out of the box. Right now users always need to provide them.
>
> What do you mean by that, @ricardoV94? I have definitely had cases where it worked without specifying initvals, IIRC.

Then you were lucky, or had a non-iid prior like mu=[-1, 0, 1].
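
For reference, the workaround being described might look like this (a sketch; with already-ordered mus the default initval, which is derived from the prior, lands inside the ordered region):

```python
import pymc as pm
from pymc.distributions.transforms import ordered

with pm.Model() as m:
    # non-iid prior with sorted mus: the default initval satisfies the
    # ordering constraint, so no manual initval is needed
    x = pm.Normal("x", mu=[-1, 0, 1], sigma=1, transform=ordered)
```

With an iid prior (e.g. the same mu for every component), the default initval would be a constant vector, which violates strict ordering, hence the need for manual initvals today.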

@AlexAndorra (Contributor) commented:

Aaaah right, makes more sense now -- I indeed always order the mus. Thanks Ricardo ;)

@ricardoV94 (Member, Author) commented:

Note that my suggestion above only works for iid components. It wouldn't accept components with different mus, for example.
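
For intuition (a standard order-statistics fact, not from the thread): with independent but non-identical densities $f_1, \dots, f_n$, the density of the sorted vector is a sum over permutations,

$$p\big(x_{(1)}, \dots, x_{(n)}\big) = \sum_{\sigma \in S_n} \prod_{i=1}^{n} f_{\sigma(i)}\big(x_{(i)}\big),$$

which collapses to the single factor $n! \prod_{i} f\big(x_{(i)}\big)$ only when all the $f_i$ are identical.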
