Issues with constrained parameters depending on each other #2195
Comments
This can be resolved with something like TuringLang/DynamicPPL.jl#588 + some minor changes: turning

```julia
# Extract parameter values in a simple form from the invlinked `VarInfo`.
DynamicPPL.values_as(DynamicPPL.invlink(vi, model), OrderedDict)
```

into

```julia
vals = if DynamicPPL.is_static(model)
    # Extract parameter values in a simple form from the invlinked `VarInfo`.
    DynamicPPL.values_as(DynamicPPL.invlink(vi, model), OrderedDict)
else
    # Re-evaluate the model completely to get invlinked parameters since
    # we can't trust the invlinked `VarInfo` to be up-to-date.
    extract_realizations(model, deepcopy(vi))
end
```

This then defaults to the "make sure we're doing everything correctly" approach, but allows the user to avoid all the additional model evaluations by just doing

```julia
model = DynamicPPL.mark_as_static(model)
```

before passing the model. As noted in the PR, we probably should have something a bit more general to also capture when we need to use a fully blown
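For illustration, the user-facing opt-in described above would look roughly like this (a sketch only: `mark_as_static` and `is_static` are the functions proposed in the linked PR, not existing released DynamicPPL API, and the model is made up):

```julia
using Turing, DynamicPPL

# A model whose constraints are genuinely static: no parameter's support
# depends on another parameter's value.
@model function static_demo()
    x ~ Uniform(0, 1)
    y ~ Uniform(0, 1)
end

# Hypothetical opt-in from the proposal: tell DynamicPPL the constraints never
# change, so the cheaper `invlink` + `values_as` path is safe and the extra
# model re-evaluations can be skipped.
model = DynamicPPL.mark_as_static(static_demo())
chain = sample(model, NUTS(), 1_000)
```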
Combining the aforementioned PRs + TuringLang/DynamicPPL.jl#540, I imagine putting something like the following in our code:

```julia
if DynamicPPL.has_static_constraints(model)
    model = DynamicPPL.mark_as_static(model)
end
```

and then continuing business as usual. It will be a heuristic of course, but it will work very well in practice. Could make this a keyword argument to allow it to be disabled.
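To illustrate what such a heuristic has to distinguish (a sketch; `has_static_constraints` is the function proposed above, not an existing released API, and the model names are made up):

```julia
using Turing

# Static constraints: every parameter's support is fixed up front, so the
# bijectors stored in the `VarInfo` never go stale.
@model function static_supports()
    x ~ Uniform(0, 1)
    y ~ Uniform(0, 1)
end

# Dynamic constraints: the support of `y` depends on the realization of `x`,
# so a bijector recorded at one step can be wrong at the next.
@model function dynamic_supports()
    x ~ Uniform(0, 1)
    y ~ Uniform(0, x)
end
```

Only the first kind of model could safely be marked as static by the heuristic; the second would keep the re-evaluation fallback.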
Just to clarify for my understanding: this seems to be a `VarInfo` issue -- because distributions in the metadata are evaluated and saved, and then used during invlink, which means that when using `VarInfo` we always assume the distribution type and support are consistent? Then how about with `SimpleVarInfo`? And would directing users to use `SimpleVarInfo` be an option for a solution? (Of course, we'd still need a utility to check whether the model is static.)
It's indeed a `VarInfo` issue.
A better fix is to remove
But that doesn't address the issue! It's not "a bug" per se on the side of

EDIT: Even the new
For example, a good use-case is #2099, where we need to
(#2202)

* use `values_as_in_model` to extract the parameters from a `Transition` rather than `invlink` + `values_as`
* bump BangBang compat entry
* added test from #2195 + added HypothesisTests.jl so we can compare chains properly
* deepcopy varinfo before calling `values_as_in_model` to avoid mutating the original logprob computations, etc.
* bump patch version
* fixed tests

Co-authored-by: Hong Ge <[email protected]>
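For context, a rough sketch of the kind of chain comparison such a test enables (assumed shape only, not the actual test code from the PR; the model is illustrative): with no observations the posterior equals the prior, so the NUTS and `Prior` marginals should agree, which a Kolmogorov-Smirnov test from HypothesisTests.jl can check.

```julia
using Turing, HypothesisTests

# Illustrative model with inter-dependent constraints.
@model function demo()
    x ~ Uniform(0, 1)
    y ~ Uniform(0, x)
end

chain_nuts = sample(demo(), NUTS(), 2_000)
chain_prior = sample(demo(), Prior(), 2_000)

# No likelihood terms, so both chains target the same distribution; the
# marginals of `y` should be statistically indistinguishable once the
# `invlink` bug is fixed.
ks = ApproximateTwoSampleKSTest(vec(chain_nuts[:y]), vec(chain_prior[:y]))
@assert pvalue(ks) > 0.01
```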
fixed by #2202 (comment)
Problem
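For concreteness, a minimal model of the kind the title describes, where the support of one constrained parameter depends on another, might look like the following (illustrative only, not the exact model from this report):

```julia
using Turing

# The support of `y` depends on the sampled value of `x`, so the bijector for
# `y` changes from realization to realization.
@model function dependent_constraints()
    x ~ Uniform(0, 1)
    y ~ Uniform(0, x)
end

# Sampling with Turing's NUTS: as described below, the constrained values of
# `y` that end up stored in the chain can be incorrect.
chain_nuts = sample(dependent_constraints(), NUTS(), 1_000)
```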
In contrast, if we use `Prior` to sample, we're good:

The issue is caused by the fact that we use `DynamicPPL.invlink!!(varinfo, model)` when constructing a `transition`, which is what ends up in the chain, rather than an issue with the inference itself.

For example, if we use AdvancedHMC.jl directly:
Visualizing the densities of the resulting chains, we also see that the one from `Turing.NUTS` is incorrect (the blue line), while the other two (`Prior` and `AdvancedHMC.NUTS`) coincide:

Solution?
Fixing this I think will actually be quite annoying 😕 But I do think it's worth doing.
There are a few approaches:

- Change `VarInfo` to always store both the linked and the invlinked realizations.

No matter how we do this, there is the issue that we can't support this properly for `externalsampler`, etc., which use the `LogDensityFunction`, without explicit re-evaluation of the model 😕 Though it seems it would still be worth adding proper support for this in the "internal" implementations of the samplers.

Might be worth providing an option to force re-evaluation in combination with, say, a warning if we notice that supports change between two different realizations.
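A minimal sketch of what such a support-change check could look like (the helper name is hypothetical; this is not existing Turing/DynamicPPL functionality), comparing the bijectors implied by two realizations of the same variable's distribution:

```julia
using Distributions, Bijectors

# Hypothetical helper: the supports differ iff the implied bijectors differ.
supports_changed(dist_old, dist_new) =
    Bijectors.bijector(dist_old) != Bijectors.bijector(dist_new)

# The support of `Uniform(0, x)` depends on `x`, so two realizations of `x`
# give different bijectors and the check fires:
supports_changed(Uniform(0.0, 0.3), Uniform(0.0, 0.7))  # true
supports_changed(Uniform(0.0, 0.3), Uniform(0.0, 0.3))  # false
```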
@yebai @devmotion @sunxd3