
Fixed externalsampler #2089

Merged (8 commits into master, Oct 17, 2023)

Conversation

torfjelde (Member) commented:

We recently made some improvements to the construction of the chains (#2071), and in the process seem to have broken the externalsampler (though I'm somewhat confused as to how tests were passing).

github-actions bot (Contributor) commented Oct 1, 2023:

Pull Request Test Coverage Report for Build 6454918003

  • 0 of 5 (0.0%) changed or added relevant lines in 2 files are covered.
  • 1 unchanged line in 1 file lost coverage.
  • Overall coverage remained the same at 0.0%

Changes missing coverage:

  File                      Covered  Changed/Added  %
  src/mcmc/Inference.jl     0        2              0.0%
  src/mcmc/abstractmcmc.jl  0        3              0.0%

Files with coverage reduction:

  File                      New missed lines  %
  src/mcmc/Inference.jl     1                 0.0%

Totals: change from base Build 6441878489: 0.0%; covered lines: 0; relevant lines: 1451.

💛 - Coveralls

codecov bot commented Oct 1, 2023:

Codecov Report

Attention: 5 lines in your changes are missing coverage. Please review.

Comparison is base (4ffe2cd) 0.00% compared to head (0d9fdfc) 0.00%.

Additional details and impacted files
@@          Coverage Diff           @@
##           master   #2089   +/-   ##
======================================
  Coverage    0.00%   0.00%           
======================================
  Files          21      21           
  Lines        1451    1451           
======================================
  Misses       1451    1451           
  File                      Coverage       Δ
  src/mcmc/Inference.jl     0.00% <0.00%>  (ø)
  src/mcmc/abstractmcmc.jl  0.00% <0.00%>  (ø)


devmotion (Member) commented:

According to codecov, neither the old nor the new implementation is covered.

Related to external samplers,

const ESLogDensityFunction{M<:Model,S<:Sampler{<:ExternalSampler},V<:AbstractVarInfo} =
    Turing.LogDensityFunction{V,M,<:DynamicPPL.DefaultContext}
seems wrong - the sampler should be part of the right-hand side, shouldn't it? I think it should be changed to

const ESLogDensityFunction{M<:Model,S<:Sampler{<:ExternalSampler},V<:AbstractVarInfo} =
    Turing.LogDensityFunction{V,M,<:DynamicPPL.SamplingContext{<:S}}

torfjelde (Member, Author) commented Oct 3, 2023:

According to codecov, neither the old nor the new implementation is covered.

This is quite strange though, as test/mcmc/abstractmcmc.jl is clearly included in test/runtests.jl:

@timeit_include("mcmc/abstractmcmc.jl")

torfjelde (Member, Author) commented:

seems wrong - the sampler should be part of the right-hand side, shouldn't it? I think it should be changed to

I applied your suggestion a bit too quickly here :) No, the original implementation was correct.

externalsampler only works for samplers that require nothing but the LogDensityProblems interface. The SamplingContext is not necessary here because these samplers will not be overloading any of the tilde statements; they all just need logdensity and/or logdensity_and_gradient, hence we just use DefaultContext.
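[Editorial aside: the contract described here can be sketched package-free. Everything below — the target's `logdensity`, the `rw_step` and `run_chain` helpers — is a hypothetical stand-in chosen for illustration, not the LogDensityProblems.jl or AdvancedMH.jl API itself.]

```julia
# A sampler usable via `externalsampler` only ever needs to evaluate the
# target's log density (and possibly its gradient); it never touches tilde
# statements, which is why DefaultContext suffices. All names are hypothetical.
logdensity(x::AbstractVector) = -sum(abs2, x) / 2  # standard normal, unnormalised

# Minimal random-walk Metropolis step that uses nothing but `logp`:
function rw_step(logp, x; scale=0.5)
    y = x .+ scale .* randn(length(x))
    return log(rand()) < logp(y) - logp(x) ? y : x
end

# Run a short chain from the origin and return the final state.
function run_chain(logp; n=1_000, dim=2)
    x = zeros(dim)
    for _ in 1:n
        x = rw_step(logp, x)
    end
    return x
end

x = run_chain(logdensity)
```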

devmotion (Member) commented:

I see. The line still seems wrong to me: since the (external) sampler is not used on the RHS, anything defined for ESLogDensityFunction will in fact be defined for any Turing.LogDensityFunction with a DefaultContext; everything defined for ESLogDensityFunction{M<:Model} will be defined for any such Turing.LogDensityFunction with model type M; and even ESLogDensityFunction{M<:Model,S<:Sampler{<:ExternalSampler}} will be defined for any such Turing.LogDensityFunction, regardless of the sampler.
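[Editorial aside: this point can be reproduced with a minimal, self-contained example. The types below (`LDF`, `DefaultCtx`, `ESLDF`) are hypothetical stand-ins for Turing's, used only to show that a type-alias parameter absent from the right-hand side imposes no constraint.]

```julia
# Stand-in for Turing.LogDensityFunction{V,M,C}:
struct LDF{V,M,C}
    varinfo::V
    model::M
    context::C
end
struct DefaultCtx end              # stand-in for DynamicPPL.DefaultContext
abstract type AbstractSampler end  # stand-in for the sampler hierarchy

# `S` is declared on the left but never used on the right, so it is vacuous:
const ESLDF{M,S<:AbstractSampler} = LDF{Vector{Float64},M,DefaultCtx}

f = LDF(Float64[], "not even a model", DefaultCtx())
f isa ESLDF  # true: matches although no sampler is involved anywhere
```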

torfjelde (Member, Author) commented Oct 8, 2023:

Ah great point and 100% agree 👍
It seems the only point where it is used is here:

function LogDensityProblems.logdensity(f::ESLogDensityFunction, x::NamedTuple)
    return DynamicPPL.logjoint(f.model, DynamicPPL.unflatten(f.varinfo, x))
end

# TODO: make a nicer `set_namedtuple!` and move these functions to DynamicPPL.
function DynamicPPL.unflatten(vi::TypedVarInfo, θ::NamedTuple)
    set_namedtuple!(deepcopy(vi), θ)
    return vi
end

which is an unfortunate hack we need for compatibility with AdvancedMH.jl, IIRC.

But yeah, we should definitely just remove ESLogDensityFunction at the very least.
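[Editorial aside: the NamedTuple round-trip the hack performs can be illustrated package-free. `Flat` and this `unflatten` are hypothetical stand-ins for TypedVarInfo and DynamicPPL.unflatten, not the real types.]

```julia
# Write the values of a NamedTuple into the flat container that a
# log-density evaluator expects, returning a fresh copy.
struct Flat
    names::Vector{Symbol}
    vals::Vector{Float64}
end

function unflatten(f::Flat, θ::NamedTuple)
    g = Flat(copy(f.names), copy(f.vals))  # avoid mutating the original
    for (i, name) in enumerate(g.names)
        g.vals[i] = θ[name]
    end
    return g
end

f = Flat([:m, :s], [0.0, 0.0])
g = unflatten(f, (m = 1.0, s = 2.0))
```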

Comment on lines +101 to +104:

function LogDensityProblems.logdensity(
    f::Turing.LogDensityFunction{<:AbstractVarInfo,<:Model,<:DynamicPPL.DefaultContext},
    x::NamedTuple
)
Member commented:

This is type piracy (I know it was already there 😉 - and BTW DynamicPPL.unflatten below is another instance of type piracy). In DynamicPPL, we explicitly decided to not add this overload: TuringLang/DynamicPPL.jl#501 Can we just remove it completely?

Member Author replied:

Yeaaah I'm aware of both and I don't like them 😕

It's done to be compatible with AdvancedMH.jl which allows you to specify the proposal distribution using NamedTuple 😕 So we can't just remove it without messing with this; we'd have to do some more work for that, unfortunately.

Member Author replied:

If you go to that PR, you can see that it was closed in favour of another commit that did basically the same thing 😅

Member replied:

Ohh haha I missed that, I just remembered that the issue had been closed 😅

Member Author replied:

You okay with leaving this for now, @devmotion? This PR is meant to fix existing code, so I feel like we should get this merged and open a separate issue / PR for the rest?

Member replied:

Yes, let's address this separately.

JaimeRZP (Member) commented:

How can I help to get this merged?
We need this to be able to finally merge TuringLang/docs#403

JaimeRZP (Member) left a review:

LGTM!

torfjelde merged commit 878294f into master on Oct 17, 2023 (13 checks passed) and deleted the torfjelde/turingtransition-fix branch on October 17, 2023 at 11:08.