
Revert size kwarg behaviour and make it work with Ellipsis #4667

Closed
wants to merge 3 commits

Conversation

michaelosthege
Member

See #4662 for more information.

I will comment key changes in the diff.

@@ -274,11 +274,11 @@ def test_chain_jacob_det():


 class TestElementWiseLogp(SeededTest):
-    def build_model(self, distfam, params, *, size=None, shape=None, transform=None, testval=None):
+    def build_model(self, distfam, params, *, size=None, transform=None, testval=None):
michaelosthege (Member Author)

All changes in this file are now back to pre-#4625

@@ -427,27 +426,6 @@ def _distr_parameters_for_repr(self):
         return ["a"]


-class MultinomialRV(MultinomialRV):
-    """Aesara's `MultinomialRV` doesn't broadcast; this one does."""
michaelosthege (Member Author)

The Aesara MultinomialRV can do broadcasting since v2.0.4.

# of the Op, hence the broadcasted p must be included in `size`.
support_shape = (p.shape[-1],)
assert support_shape == (4,)
assert m.eval().shape == size + support_shape
michaelosthege (Member Author)

The parametrization sizes were changed so that the support shape (4,) no longer occurs in the size, which is supposed to exclude the support dimensions. This makes the test parametrization slightly less confusing.

The broadcasting of the MultinomialRV does not change the fact that it has ndim_supp == 1.
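The convention the test above asserts can be sketched in plain Python (this is illustrative shape arithmetic only, not PyMC or Aesara code; the helper name is hypothetical): for an RV with `ndim_supp == 1`, `size` lists only the batch dimensions, and the Op appends the support shape itself.

```python
# Hypothetical sketch of the shape convention: `size` excludes the support
# dimensions, which the Op appends on its own.
def expected_multinomial_shape(size, n_categories):
    # Multinomial has ndim_supp == 1, so its support shape is (n_categories,).
    support_shape = (n_categories,)
    return tuple(size) + support_shape

print(expected_multinomial_shape((5, 2), 4))  # -> (5, 2, 4)
```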


michaelosthege commented Apr 24, 2021

The broadcasting of multivariate RVs is very weird already at the Aesara level:

aesara.tensor.random.basic.multivariate_normal(
    mean=np.ones((3, 9)), cov=np.eye(9), size=(3,)
).eval().shape
# expected: (3, 9)
# actual:   (3, 3, 9)

The above parametrization apparently gave it a 2-dimensional support? That's a problem, because the ndim_supp attribute belongs to the Op and is still 1.
The other interpretation, keeping the support fixed at 1 dimension, is that it decided to replicate twice? That doesn't make a lot of sense to me either.
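The mismatch can be modeled with plain shape arithmetic (illustrative only, no Aesara involved; both helper names are hypothetical): under the documented convention the output shape should be `size + support_shape`, whereas the observed behavior prepends `size` to the already-broadcast parameter shape.

```python
# Documented convention: size lists the batch dims, support shape is appended.
def expected_shape(size, support_shape):
    return tuple(size) + tuple(support_shape)

# Observed behavior in the call above: size is prepended to the full
# broadcast shape of the mean parameter (batch dims included twice).
def observed_shape(size, batch_shape, support_shape):
    return tuple(size) + tuple(batch_shape) + tuple(support_shape)

print(expected_shape((3,), (9,)))        # -> (3, 9)
print(observed_shape((3,), (3,), (9,)))  # -> (3, 3, 9)
```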

I noticed this when I tried to refactor the Distribution.dist() code such that size is passed eagerly instead of creating rv_native with size=None.
I attempted this refactor because it seemed more elegant, but the resize-based implementation seems to be much more robust.

michaelosthege and others added 3 commits April 25, 2021 12:26
And don't use np.atleast_1d on things that can be tensors.
Corresponding tests were reverted, or edited to use other parametrization flavors.
The Ellipsis feature now works with all three dimensionality kwargs.
The MultinomialRV implementation was removed, because the broadcasting behavior was implemented in Aesara.
Closes pymc-devs#4662
brandonwillard (Contributor)

This can be rebased onto #4693 and, instead, re-introduce a non-breaking version of the shape keyword.


twiecki commented May 14, 2021

Closing in favor of #4696.

@twiecki twiecki closed this May 14, 2021
@michaelosthege michaelosthege deleted the fix-4662 branch August 7, 2021 11:21