Feature/param mixture #592
Conversation
@matthewdhoffman Ready for you to take another look. I made the implementation more robust, specifically around sample shape vs. batch shape vs. event shape. I wasn't sure how to implement mean/variance/stddev for arbitrary batch shapes, so I made them raise an error if the batch shape is non-scalar. I also added another unit test. For the previous unit test, we were checking component-wise means/variances based on identifying from …
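The component-wise means/variances mentioned above combine into the mixture's moments by the laws of total expectation and total variance. A minimal NumPy sketch (the weights and parameters here are made up for illustration, not taken from the PR's tests):

```python
import numpy as np

# Hypothetical two-component scalar mixture: shared Normal family,
# per-component parameters (the ParamMixture setting).
pi = np.array([0.3, 0.7])      # mixing weights
mu = np.array([0.0, 5.0])      # component means
sigma = np.array([1.0, 2.0])   # component stddevs

# Law of total expectation: E[x] = sum_k pi_k * mu_k.
mean = np.sum(pi * mu)

# Law of total variance: Var[x] = sum_k pi_k * (sigma_k^2 + mu_k^2) - E[x]^2.
var = np.sum(pi * (sigma**2 + mu**2)) - mean**2

print(mean)  # 3.5
print(var)   # 8.35
```

With a non-scalar batch shape, each of these reductions would have to run over the component axis of a parameter array while broadcasting over the batch axes, which is presumably why the PR restricts mean/variance/stddev to scalar batch shapes for now.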
I also had a question about the log prob. Is this computing the log of the full mixture density, summing over the individual component densities, or is this the density of a single chosen component? It seems like the latter?
Interesting. I find the idea of a … The most important thing I did in these commits is to restore the correspondence between …
Got it. It all makes sense. Merging.
Adds the `ParamMixture` distribution to `edward.models`. This class implements a mixture model in which all of the mixture components come from the same family but have (possibly) different parameters. (Contrast with `Mixture`, which allows the mixture components to be of different families.)
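The generative process such a distribution describes can be sketched in plain NumPy (this is a conceptual illustration, not the Edward `ParamMixture` API itself; the weights and parameters are made up): draw a component index from the mixing weights, then draw from the shared family using that component's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-component mixture: one shared family (Normal),
# per-component parameters -- the ParamMixture setting.
pi = np.array([0.3, 0.7])      # mixing weights
loc = np.array([0.0, 5.0])     # per-component means
scale = np.array([1.0, 2.0])   # per-component stddevs

def sample(n):
    # Draw a component assignment per sample, then draw from the
    # shared Normal family with that component's parameters.
    k = rng.choice(len(pi), size=n, p=pi)
    return rng.normal(loc[k], scale[k]), k

x, k = sample(10000)
```

A `Mixture`-style class would instead carry a list of heterogeneous component distributions; `ParamMixture` exploits the shared family to vectorize over a single parameter axis.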