Eliminated redundant computation in repeated calls to leapfrog() function #414

Merged: 2 commits into master from feature/lf_dup, Jan 21, 2017

Conversation

matthewdhoffman (Collaborator)

In the old code, each call to leapfrog() computed the gradient twice with respect to latent variables that hadn't changed. This patch eliminates that duplicated computation, resulting in a near 2x speedup.

It works by adding an n_steps parameter to leapfrog() and moving the loop into the leapfrog() function.
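
A minimal sketch of what a leapfrog() with an n_steps parameter can look like, not the merged code: it assumes z_old and r_old are dicts of tensors keyed by latent variable and that log_joint maps such a dict to a scalar, following the conventions visible in the review snippets below. The point is that the gradient at each position is evaluated once and shared between the closing half-step of one iteration and the opening half-step of the next.

import six
import tensorflow as tf

def leapfrog(z_old, r_old, step_size, log_joint, n_steps):
  # Sketch only: run n_steps leapfrog updates inside one call, evaluating
  # the gradient of log_joint at each position exactly once.
  z_new = z_old.copy()
  r_new = r_old.copy()
  grad_log_joint = tf.gradients(log_joint(z_new),
                                list(six.itervalues(z_new)))
  for _ in range(n_steps):
    # Opening half-step for the momentum, reusing the gradient computed
    # before the loop or at the end of the previous iteration.
    for i, key in enumerate(six.iterkeys(z_new)):
      r_new[key] += 0.5 * step_size * grad_log_joint[i]
    # Full step for the position.
    for key in z_new:
      z_new[key] += step_size * r_new[key]
    # One gradient evaluation at the new position; it serves the closing
    # half-step here and the opening half-step of the next iteration.
    grad_log_joint = tf.gradients(log_joint(z_new),
                                  list(six.itervalues(z_new)))
    for i, key in enumerate(six.iterkeys(z_new)):
      r_new[key] += 0.5 * step_size * grad_log_joint[i]
  return z_new, r_new

Calling a single-step leapfrog() n_steps times instead evaluates grad_log_joint twice at each unchanged position (once at the end of one call and again at the start of the next), which is the duplicated work this patch removes.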

for _ in range(self.n_steps):
  new_sample, new_r_sample = leapfrog(new_sample, new_r_sample,
                                      self.step_size, self._log_joint)
new_sample, new_r_sample = leapfrog(new_sample, new_r_sample,
Member

cool! i think we can also remove the two lines above this now as well.

grad_log_joint = tf.gradients(log_joint(z_new), list(six.itervalues(z_new)))
for i, key in enumerate(six.iterkeys(z_old)):
  r_new[key] += 0.5 * step_size * grad_log_joint[i]
for n in range(n_steps):
Member

since we're not using the looping variable, we should do for _ in ... for readability

z_new = {}
r_new = {}

for key in z_old:
Member

i think these lines can be condensed with the copy method on dicts, e.g., z_new = z_old.copy() and r_new = r_old.copy().

@dustinvtran (Member)

thanks for the catch; this looks great. only minor comments above

@dustinvtran (Member)

hi @matthewdhoffman: i made the minor changes from my comments above. let me know if you approve/have comments, and i'll merge.

@matthewdhoffman (Collaborator, Author)

LGTM, thanks for cleaning it up. (And for showing me a couple of better idioms!)

@dustinvtran merged commit 7bee63c into master on Jan 21, 2017
@dustinvtran deleted the feature/lf_dup branch on January 21, 2017 01:59