
Prior sampler should use PriorContext, not DefaultContext #2169

Closed
torfjelde opened this issue Feb 8, 2024 · 2 comments · Fixed by #2170

Comments

@torfjelde
Member

Example from Thanasi Bakis on Slack:

using Turing
using DynamicPPL  # for DynamicPPL.leafcontext and PriorContext below

function loglik(θ, x)
    println("In loglik")
    
    return logpdf(Normal(θ, 1), x)
end

@model function test(x)
    θ ~ Normal(0, 1)

    if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
        println(DynamicPPL.leafcontext(__context__))
        Turing.@addlogprob! sum(loglik.(θ, x))
    end
end

sample(test(randn(100) * 1000), Prior(), 100)

results in DefaultContext rather than PriorContext being printed, i.e. the likelihood branch is executed even when sampling with Prior().
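
For reference, a minimal sketch of the evaluation one would expect Prior() to perform, with PriorContext as the leaf context so the @addlogprob! branch is skipped. This goes through DynamicPPL's evaluate!!/SamplingContext machinery directly and is illustrative, not necessarily what Prior() actually calls internally:

using Random

model = test(randn(100) * 1000)
vi = DynamicPPL.VarInfo(model)

# Draw fresh values from the prior with PriorContext as the leaf context;
# the likelihood branch in the model body should then never run.
ctx = DynamicPPL.SamplingContext(
    Random.default_rng(),
    DynamicPPL.SampleFromPrior(),
    DynamicPPL.PriorContext(),
)
DynamicPPL.evaluate!!(model, vi, ctx)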

@thanasibakis

Thanks!!

@torfjelde
Member Author

torfjelde commented May 18, 2024

As noted by @thanasibakis, as of v0.31.4 the above snippet once again evaluates the model using the DefaultContext, since we now re-evaluate the model to convert parameters from their internal to their user-facing representation.

I don't think this causes any bugs, since this is just a post-inference step, but it does add computational overhead. I also raised this in the PR that introduced the change, #2202, so it's related to #2215.

Regardless, we should probably run this value extraction using the PriorContext anyway, since we're only interested in hitting that part of the ~ pipeline.
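
As a rough sketch of what that could look like (a hypothetical helper built on DynamicPPL.evaluate!! and values_as, not the actual Turing internals):

using DynamicPPL, OrderedCollections

# Hypothetical: re-evaluate under PriorContext so only the tilde-pipeline runs,
# then read the user-facing values back out of the VarInfo.
function values_via_prior_context(model, vi)
    _, vi = DynamicPPL.evaluate!!(model, vi, DynamicPPL.PriorContext())
    return DynamicPPL.values_as(vi, OrderedDict)
end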

@yebai
