Add 'n' parameter to RandomVariable to access sample_n() functions #323

Closed
wants to merge 60 commits into from

Conversation

matthewdhoffman
Collaborator

This is a small PR that adds a keyword argument 'n' to Edward RandomVariables, giving them access to the tf.contrib.distributions.Distribution.sample_n() function. This lets us replace (IMO) hacky syntax like ed.Normal(mu=tf.ones([10, 1])*mu_vec, sigma=1.) with ed.Normal(mu=mu_vec, sigma=1., n=10), which is (IMO) clearer, marginally more efficient, and makes doing conjugacy algebra on the graph a little easier in some cases.
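To make the shape semantics concrete, here is a minimal numpy sketch of the two routes (numpy's `Generator.normal` is a stand-in for Edward's `ed.Normal` / TF's `sample_n`; the names `mu_vec`, `samples_tiled`, and `samples_n` are illustrative, not from the PR):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_vec = np.array([0.0, 5.0, 10.0])  # a batch of 3 means

# Hacky route: tile the parameters so each of the 10 desired samples
# gets its own copy of mu, then draw one sample per entry.
mu_tiled = np.ones((10, 1)) * mu_vec           # materializes shape (10, 3)
samples_tiled = rng.normal(mu_tiled, 1.0)      # shape (10, 3)

# Proposed route: keep the (3,)-shaped distribution and request
# 10 draws up front, as n=10 / sample_n(10) would.
samples_n = rng.normal(mu_vec, 1.0, size=(10, 3))  # shape (10, 3)

assert samples_tiled.shape == samples_n.shape == (10, 3)
```

Both routes yield the same sample shape; the second avoids materializing the tiled parameter tensor, which is the "marginally more efficient" point above.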

@dustinvtran
Member

interesting proposal. i like it!

i wonder if it breaks any of the inference algorithms? my guess is no, because they all work on samples via a random variable's value() and not through the sample() method. does it affect the log_probs?

also: the argument n may have to change to a unique keyword argument that no distribution's argument uses. for example, binomial and multinomial use n for the number of trials.

@matthewdhoffman
Collaborator Author

Good catch on the n overloading. Maybe n_samples?

Re: log_prob() working, I think it should be OK, since the distributions in tf.contrib.distributions should be written so that log_prob() works with samples generated via either sample() or sample_n(). But it's worth a quick double-check.
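The reason this should hold: log_prob() evaluates the density pointwise, so extra leading sample dimensions simply broadcast against the batch-shaped parameters. A quick sanity check of that property, using scipy.stats as a stand-in for tf.contrib.distributions (the variable names here are illustrative):

```python
import numpy as np
from scipy.stats import norm

mu_vec = np.array([0.0, 5.0])  # batch of 2 means

# 10 samples of the batch, as sample_n(10) would produce: shape (10, 2).
samples = np.random.default_rng(1).normal(mu_vec, 1.0, size=(10, 2))

# logpdf broadcasts the (10, 2) samples against the (2,) parameters,
# returning one log-density per sample with the same (10, 2) shape.
lp = norm.logpdf(samples, loc=mu_vec, scale=1.0)
assert lp.shape == (10, 2)
assert np.all(np.isfinite(lp))
```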

@dustinvtran
Member

dustinvtran commented Nov 9, 2016

sure, n_samples is good. by the way, i recommend looking at #326 to deal with the issue of copying random variables.

dustinvtran and others added 23 commits March 1, 2017 23:19
…491)

* add implicit_klqp.py

* add example; let loss_obj be an argument

* fix pep8

* let user pass in custom ratio loss

* improve docstrings

* generalize ImplicitKLqp's scale arg to be a dict

* add simple unit test

* tweak example

* robustify ratio loss arg; update docstrings
* Fix get_variables for recursive ops

* Allow copy of recursive graphs

* Add tests for recursive args in copy / get_variables

* Fix pep8 error

* extend recursion safety to ancestors, children, descendants, and parents

* Also mirror existing test for get_variables
* let session runs work directly on RandomVariable

* add unit test

* remove use of .value() in examples and docs
* refactor log_*_exp to use tf.reduce_logsumexp

* replace all uses to reduce_log*exp
* tease out Laplace into new file laplace.py

* replace hessian utility fn with tf.hessians

* update laplace to work with MultivariateNormal*

* add unit test; update docs

* clean up code on pointmass vs normal

* revise docs
* hierarchical logistic regression edward

* pep8
* add tex/iclr2017.tex

* prescribe edward's development version for now

* add iclr2017.ipynb; revise snippets
* remove edward/{stats/,models/models.py}

* remove all model wrapper examples

* remove model wrapper docs

* remove model wrapper tests

* update docs/

* update tests/

* update .travis.yml,setup.py

* remove ed.placeholder,ed.MFVI

* update edward/criticisms/

* update edward/inferences/
)

* overload more operators

* update unit tests
* ppca tutorial

* revising ppca tutorial

* minor changes post code review
* Add the docker

for using every environment

* update Dockerfile and Makefile

for gpu environment

* Update Makefile

for gpu environment

* Update Makefile and README.md

for specifying the gpu environment

* Update Makefile

for specifying the gpu environment

* Update README.md

for gpu environment

* Add Dockerfile and Update Makefile and README.md

for cpu environment
* make miscellaneous revisions to doc

* force 'from * import *' statements in all code snippets, no 'import *' except getting started

* update some examples
…ature/sample_n

Rewrite to catch up with a couple months of changes.
@matthewdhoffman
Collaborator Author

@dustinvtran I've finally gotten back to this. I think it's doing what it should, but it'd be good to get another set of eyes on it.

We can now pass in a tuple sample_shape, which adds one or more additional outer dimensions beyond whatever the batch_shape and event_shape specify. This basically just wraps the sample_shape argument from tf.contrib.distributions. I had to make some small adjustments so things play nicely with the value parameter. (Possibly it shouldn't be possible to specify both.)
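The resulting tensor shape follows the usual TF distributions convention: sample_shape, then batch_shape, then event_shape. A minimal numpy sketch of that composition (shapes chosen arbitrarily for illustration; numpy stands in for the TF sampling call):

```python
import numpy as np

sample_shape = (4, 7)   # extra outer dims requested at sampling time
batch_shape = (3,)      # implied by the parameter shapes, e.g. a (3,) mu_vec
event_shape = ()        # scalar events for a univariate Normal

mu_vec = np.zeros(batch_shape)
draws = np.random.default_rng(2).normal(
    mu_vec, 1.0, size=sample_shape + batch_shape + event_shape)

# final shape = sample_shape + batch_shape + event_shape
assert draws.shape == (4, 7, 3)
```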

@dustinvtran
Member

Cool! Is there a way to see the diff? The one here says 199 files changed.

@chmp
Contributor

chmp commented Mar 15, 2017

A rebase to master should fix the number of changed files, shouldn't it?

@matthewdhoffman
Collaborator Author

matthewdhoffman commented Mar 27, 2017

I'm going to close this and open a new one based on the conjugacy PR #588.

@matthewdhoffman matthewdhoffman deleted the feature/sample_n branch March 27, 2017 16:11