diff --git a/tutorials/docs-09-using-turing-advanced/advanced.jmd b/tutorials/docs-09-using-turing-advanced/advanced.jmd
index f938270da..b4c48c22e 100644
--- a/tutorials/docs-09-using-turing-advanced/advanced.jmd
+++ b/tutorials/docs-09-using-turing-advanced/advanced.jmd
@@ -121,7 +121,7 @@ prior of your model but only when computing the log likelihood and the log joint
 should [check the type of the internal variable `__context_`](https://github.com/TuringLang/DynamicPPL.jl/issues/154)
 such as
 
-```{julia; eval=false}
+```julia; eval=false
 if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
     Turing.@addlogprob! myloglikelihood(x, μ)
 end
@@ -176,6 +176,59 @@ gdemo(x) = Turing.Model(gdemo, (; x))
 model = gdemo([1.5, 2.0])
 ```
 
+### Reparametrization and generated_quantities
+
+Often, the most natural parameterization for a model is not the most computationally efficient. Consider the following
+(efficiently reparametrized) implementation of Neal's funnel [(Neal, 2003)](https://arxiv.org/abs/physics/0009028):
+
+```julia; eval=false
+@model function Neal()
+    # Raw draws
+    y_raw ~ Normal(0, 1)
+    x_raw ~ arraydist([Normal(0, 1) for i in 1:9])
+
+    # Transform:
+    y = 3 * y_raw
+    x = exp.(y ./ 2) .* x_raw
+
+    # Return:
+    return [x; y]
+end
+```
+
+In this case, the random variables exposed in the chain (`x_raw`, `y_raw`) are not in a helpful form; the quantities we are actually interested in are the deterministically transformed variables `x` and `y`.
+
+More generally, there are often quantities in our models that we might be interested in viewing, but which are not explicitly present in our chain.
+
+We can generate draws from these variables (in this case, `x` and `y`) by adding a return statement to the model and then calling `generated_quantities(model, chain)`. This function outputs, for each posterior sample, the values specified in the model's return statement.
+
+For example, in the above reparametrization, we sample from our model:
+
+```julia; eval=false
+chain = sample(Neal(), NUTS(), 1000)
+```
+
+and then call:
+
+```julia; eval=false
+generated_quantities(Neal(), chain)
+```
+
+to return an array containing `x1, x2, ..., x9, y` for each posterior sample.
+
+In this case, it might be useful to reorganize our output into a matrix for plotting:
+
+```julia; eval=false
+reparam_chain = reduce(hcat, generated_quantities(Neal(), chain))'
+```
+
+We can then recover a vector of samples for each variable as follows:
+
+```julia; eval=false
+x1_samples = reparam_chain[:, 1]
+y_samples = reparam_chain[:, 10]
+```
+
 ## Task Copying
 
 Turing [copies](https://github.com/JuliaLang/julia/issues/4085) Julia tasks to
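
The added tutorial section builds `reparam_chain` "for plotting" but stops short of an actual plot. As a supplementary sketch only (not part of the diff above), the following self-contained code pulls the pieces together and plots `x1` against `y` to show the funnel geometry; the use of `StatsPlots` and its `scatter` function is an assumption here, since the tutorial itself does not load a plotting package.

```julia
# Sketch: visualise the funnel recovered via generated_quantities.
# Assumes Turing and StatsPlots are installed; StatsPlots is not part of the tutorial.
using Turing, StatsPlots

@model function Neal()
    # Raw, well-conditioned draws.
    y_raw ~ Normal(0, 1)
    x_raw ~ arraydist([Normal(0, 1) for i in 1:9])

    # Deterministic transform back to the funnel's natural coordinates.
    y = 3 * y_raw
    x = exp.(y ./ 2) .* x_raw

    return [x; y]
end

chain = sample(Neal(), NUTS(), 1000)

# generated_quantities returns, for every posterior draw, the 10-element vector
# produced by the model's return statement; stacking and transposing gives a
# draws-by-10 matrix with x1..x9 in columns 1 to 9 and y in column 10.
reparam_chain = reduce(hcat, generated_quantities(Neal(), chain))'

# Scatter x1 against y to display the characteristic funnel shape.
scatter(reparam_chain[:, 1], reparam_chain[:, 10]; xlabel="x1", ylabel="y", legend=false)
```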