Transfer all high level docs from Turing.jl #375

Merged: 56 commits merged into master from hg/transfer-docs on Jul 7, 2023
Commits (56)
4f49a3e
Add all high level docs from Turing.jl
yebai Jan 6, 2023
912b1bc
Fix some permalinks.
yebai Jan 6, 2023
b9a8143
Converted Quickstart doc
cgbotta Jan 24, 2023
6a82b20
Converted contributing pages to jmd
cgbotta Jan 25, 2023
306f777
add project files
yebai Jan 26, 2023
892c301
Disable evaluating style examples
yebai Jan 26, 2023
712746e
Update and rename index.md to index.jmd
yebai Jan 26, 2023
1c72459
Rename index.jmd to using-turing.jmd
yebai Jan 26, 2023
8ccdb21
fix formatting for guide.md
yebai Jan 26, 2023
0ad33c2
Fixed formatting.
yebai Jan 26, 2023
2d1e31e
Update tutorials/for-developers-interface/interface.md
yebai Jan 26, 2023
ef95a28
Update tutorials/for-developers-abstractmcmc-turing/how_turing_implem…
yebai Jan 26, 2023
c70aece
Update tutorials/for-developers-abstractmcmc-turing/how_turing_implem…
yebai Jan 26, 2023
9e8c16d
update sample viz
yebai Jan 26, 2023
ff63088
add dummy julia code block to enable jmd building
yebai Jan 26, 2023
cb299b1
Bugfix for negative variance.
yebai Jan 26, 2023
bb392ad
More conversions
cgbotta Feb 2, 2023
a1879f6
Added dependencies and changed directory names
cgbotta Feb 8, 2023
341b21e
Fixed dependencies
cgbotta Apr 17, 2023
57d7894
Fixing dependencies
cgbotta Apr 17, 2023
c2efc56
Fixing dependencies
cgbotta Apr 17, 2023
af861a4
Fixing dependencies
cgbotta Apr 17, 2023
06b1f1a
Fixing dependencies
cgbotta Apr 17, 2023
f6d8ed9
Fixing dependencies
cgbotta Apr 17, 2023
98a4eb8
Fixing dependencies
cgbotta Apr 17, 2023
ee0c267
Fixing dependencies
cgbotta Apr 17, 2023
1d14387
Fixing dependencies
cgbotta Apr 17, 2023
80c455a
Fixing dependencies
cgbotta Apr 17, 2023
a4f4659
Fixing dependencies
cgbotta Apr 17, 2023
6aa8085
Fixing dependencies
cgbotta Apr 17, 2023
8f43c61
Fixing dependencies
cgbotta Apr 17, 2023
6e3433a
Fixing dependencies
cgbotta Apr 17, 2023
4cad629
Fixing dependencies
cgbotta Apr 17, 2023
051da51
Fixing dependencies
cgbotta Apr 17, 2023
2d1583f
other tutorials
cgbotta Apr 17, 2023
b47ab65
other tutorials
cgbotta Apr 17, 2023
96ecc01
other tutorials
cgbotta Apr 17, 2023
49dbdc5
Merge branch 'master' into hg/transfer-docs
yebai Jun 11, 2023
da8bbee
fix merge conflict with master
yebai Jul 6, 2023
41c8c1a
some edits to advanced
yebai Jul 6, 2023
b86ec7f
remove weave from deps
yebai Jul 6, 2023
8787423
improve compiler
yebai Jul 6, 2023
1ef0797
added julia formatting for a block which it was missing (#398)
torfjelde Jul 6, 2023
4ec62cb
AbstractMCMC fixes, rename some files, and update manifests (#399)
willtebbutt Jul 6, 2023
2f51b7b
no empty cell + hyperlink to bishop (#400)
yebai Jul 6, 2023
886d617
add numeric prefix to docs
yebai Jul 6, 2023
56f50cb
triger rebuild
yebai Jul 6, 2023
b3c814a
add Distributions to deps
yebai Jul 6, 2023
8c07963
minor fixes
yebai Jul 6, 2023
a0889fc
Minor edits
yebai Jul 6, 2023
3f0d51e
Update tutorials/docs-12-using-turing-guide/using-turing-guide.jmd
yebai Jul 6, 2023
919ba2f
Update tutorials/docs-09-using-turing-advanced/advanced.jmd
yebai Jul 6, 2023
210a472
fix var name typo
yebai Jul 7, 2023
4447143
Don't eval `distributed` and `optim` code chunks
sunxd3 Jul 7, 2023
8f8dca8
Update using-turing-guide.jmd
yebai Jul 7, 2023
faf76c0
Remove hardcoded output.
sunxd3 Jul 7, 2023
@@ -191,3 +191,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/02-logistic-regression/02_logistic-regression.jmd
@@ -244,3 +244,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
@@ -198,3 +198,8 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```{julia; eval=false}


```
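For reference, fences of the form `{julia; ...}` are Weave.jl chunk headers; the options after the semicolon control how a chunk is handled when the `.jmd` file is built. Below is a minimal illustrative chunk (not taken from this diff) using the `eval=false` option that appears above:

```{julia; eval=false}
# Rendered in the output document but not executed at build time, because of `eval=false`.
1 + 1
```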
4 changes: 4 additions & 0 deletions tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd
@@ -190,3 +190,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/05-linear-regression/05_linear-regression.jmd
@@ -231,3 +231,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
@@ -279,3 +279,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
1 change: 0 additions & 1 deletion tutorials/07-poisson-regression/07_poisson-regression.jmd
@@ -221,4 +221,3 @@ As can be seen from the numeric values and the plots above, the standard deviati
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```
@@ -8,7 +8,7 @@ weave_options:

[Multinomial logistic regression](https://en.wikipedia.org/wiki/Multinomial_logistic_regression) is an extension of logistic regression. Logistic regression is used to model problems in which there are exactly two possible discrete outcomes. Multinomial logistic regression is used to model problems in which there are two or more possible discrete outcomes.

In our example, we'll be using the iris dataset. The goal of the iris multiclass problem is to predict the species of a flower given measurements (in centimeters) of sepal length and width and petal length and width. There are three possible species: Iris setosa, Iris versicolor, and Iris virginica.
In our example, we'll be using the iris dataset. The iris multiclass problem aims to predict the species of a flower given measurements (in centimeters) of sepal length and width and petal length and width. There are three possible species: Iris setosa, Iris versicolor, and Iris virginica.
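As a brief aside (a sketch, not part of this PR's diff): the multinomial extension replaces the logistic link with a softmax over per-class scores, turning K real-valued scores into a probability vector over the K classes.

```julia
# Illustrative softmax link for multinomial logistic regression (sketch only).
# Subtracting the maximum improves numerical stability without changing the result.
softmax(v) = exp.(v .- maximum(v)) ./ sum(exp.(v .- maximum(v)))

softmax([1.0, 2.0, 3.0])  # ≈ [0.090, 0.245, 0.665]
```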

To start, let's import all the libraries we'll need.

@@ -214,4 +214,4 @@ This tutorial has demonstrated how to use Turing to perform Bayesian multinomial
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```
```
30 changes: 30 additions & 0 deletions tutorials/08-multinomial-logistic-regression/Manifest.toml
@@ -539,6 +539,12 @@ git-tree-sha1 = "129acf094d168394e80ee1dc4bc06ec835e510a3"
uuid = "2e76f6c2-a576-52d4-95c1-20adfe4de566"
version = "2.8.1+1"

[[Highlights]]
deps = ["DocStringExtensions", "InteractiveUtils", "REPL"]
git-tree-sha1 = "0341077e8a6b9fc1c2ea5edc1e93a956d2aec0c7"
uuid = "eafb193a-b7ab-5a9e-9068-77385905fa72"
version = "0.5.2"

[[HypergeometricFunctions]]
deps = ["DualNumbers", "LinearAlgebra", "OpenLibm_jll", "SpecialFunctions", "Test"]
git-tree-sha1 = "709d864e3ed6e3545230601f94e11ebc65994641"
@@ -907,6 +913,12 @@ git-tree-sha1 = "91a48569383df24f0fd2baf789df2aade3d0ad80"
uuid = "6f286f6a-111f-5878-ab1e-185364afe411"
version = "0.10.1"

[[Mustache]]
deps = ["Printf", "Tables"]
git-tree-sha1 = "821e918c170ead5298ff84bffee41dd28929a681"
uuid = "ffc61752-8dc7-55ee-8c37-f3e9cdd09e70"
version = "1.0.17"

[[NNlib]]
deps = ["Adapt", "ChainRulesCore", "LinearAlgebra", "Pkg", "Random", "Requires", "Statistics"]
git-tree-sha1 = "33ad5a19dc6730d592d8ce91c14354d758e53b0e"
@@ -1322,6 +1334,12 @@ git-tree-sha1 = "e0d5bc26226ab1b7648278169858adcfbd861780"
uuid = "f3b207a7-027a-5e70-b257-86293d7955fd"
version = "0.15.4"

[[StringEncodings]]
deps = ["Libiconv_jll"]
git-tree-sha1 = "33c0da881af3248dafefb939a21694b97cfece76"
uuid = "69024149-9ee7-55f6-a4c4-859efe599b68"
version = "0.3.6"

[[StringManipulation]]
git-tree-sha1 = "46da2434b41f41ac3594ee9816ce5541c6096123"
uuid = "892a3eda-7b42-436c-8928-eab12a02cf0e"
@@ -1472,6 +1490,12 @@ git-tree-sha1 = "b1be2855ed9ed8eac54e5caff2afcdb442d52c23"
uuid = "ea10d353-3f73-51f8-a26c-33c1cb351aa5"
version = "1.4.2"

[[Weave]]
deps = ["Base64", "Dates", "Highlights", "JSON", "Markdown", "Mustache", "Pkg", "Printf", "REPL", "RelocatableFolders", "Requires", "Serialization", "YAML"]
git-tree-sha1 = "092217eb5443926d200ae9325f103906efbb68b1"
uuid = "44d3d7a6-8a23-5bf8-98c5-b353f8df5ec9"
version = "0.10.12"

[[Widgets]]
deps = ["Colors", "Dates", "Observables", "OrderedCollections"]
git-tree-sha1 = "fcdae142c1cfc7d89de2d11e08721d0f2f86c98a"
@@ -1627,6 +1651,12 @@ git-tree-sha1 = "79c31e7844f6ecf779705fbc12146eb190b7d845"
uuid = "c5fb5394-a638-5e4d-96e5-b29de1b5cf10"
version = "1.4.0+3"

[[YAML]]
deps = ["Base64", "Dates", "Printf", "StringEncodings"]
git-tree-sha1 = "e6330e4b731a6af7959673621e91645eb1356884"
uuid = "ddb6d928-2868-570f-bddf-ab3f9cf99eb6"
version = "0.4.9"

[[Zlib_jll]]
deps = ["Libdl"]
uuid = "83775a58-1f1d-513f-b197-d71354ab007a"
1 change: 1 addition & 0 deletions tutorials/08-multinomial-logistic-regression/Project.toml
@@ -7,3 +7,4 @@ RDatasets = "ce6b1742-4840-55fa-b093-852dadbb1d8b"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
StatsPlots = "f3b207a7-027a-5e70-b257-86293d7955fd"
Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
Weave = "44d3d7a6-8a23-5bf8-98c5-b353f8df5ec9"
@@ -868,3 +868,8 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```{julia; eval=false}


```
@@ -447,3 +447,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/11-probabilistic-pca/11_probabilistic-pca.jmd
@@ -375,3 +375,7 @@ It can also thought as a matrix factorisation method, in which $\mathbf{X}=(\mat
[^3]: Gareth M. James, Daniela Witten, Trevor Hastie, Robert Tibshirani, *An Introduction to Statistical Learning*, Springer, 2013.
[^4]: David Wipf, Srikantan Nagarajan, *A New View of Automatic Relevance Determination*, NIPS 2007.
[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.
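For context on the matrix-factorisation view mentioned above, probabilistic PCA (as presented in Bishop [^5]) is usually written as the generative model

$$
\mathbf{x}_n = \mathbf{W}\mathbf{z}_n + \boldsymbol{\mu} + \boldsymbol{\epsilon}_n,
\qquad
\mathbf{z}_n \sim \mathcal{N}(\mathbf{0}, \mathbf{I}),
\quad
\boldsymbol{\epsilon}_n \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I}),
$$

so that, stacked over observations, $\mathbf{X} \approx \mathbf{W}\mathbf{Z}$ up to noise.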

```julia

```
32 changes: 18 additions & 14 deletions tutorials/12-gaussian-process/12_gaussian-process.jmd
@@ -18,7 +18,7 @@ Importantly, it provides the advantage that the linear mappings from the embedde

Let's start by loading some dependencies.

```julia
```{julia; eval=false}
using Turing
using AbstractGPs
using FillArrays
@@ -41,7 +41,7 @@ although this is an active area of research.
We will briefly touch on some ways to speed things up at the end of this tutorial.
We transform the original data with non-linear operations in order to demonstrate the power of GPs to work on non-linear relationships, while keeping the problem reasonably small.

```julia
```{julia; eval=false}
data = dataset("datasets", "iris")
species = data[!, "Species"]
index = shuffle(1:150)
@@ -66,7 +66,7 @@ Indeed, pPCA is basically equivalent to running the GPLVM model with an automati

First, we re-introduce the pPCA model (see the tutorial on pPCA for details)

```julia
```{julia; eval=false}
@model function pPCA(x)
# Dimensionality of the problem.
N, D = size(x)
@@ -85,15 +85,15 @@ end;
We define two different kernels, a simple linear kernel with an Automatic Relevance Determination transform and a
squared exponential kernel.

```julia
```{julia; eval=false}
linear_kernel(α) = LinearKernel() ∘ ARDTransform(α)
sekernel(α, σ) = σ * SqExponentialKernel() ∘ ARDTransform(α);
```
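Assuming the usual KernelFunctions.jl conventions (a squared exponential kernel $\exp(-\tfrac{1}{2}\lVert x - x'\rVert^2)$, an ARD transform that rescales coordinate $i$ by $\alpha_i$, and `σ *` scaling the kernel value), the two kernels defined above correspond roughly to

$$
k_{\mathrm{lin}}(x, x') = \sum_i \alpha_i^2\, x_i x'_i,
\qquad
k_{\mathrm{SE}}(x, x') = \sigma \exp\!\Big(-\tfrac{1}{2} \sum_i \alpha_i^2 (x_i - x'_i)^2\Big).
$$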

And here is the GPLVM model.
We create separate models for the two types of kernel.

```julia
```{julia; eval=false}
@model function GPLVM_linear(Y, K)
# Dimensionality of the problem.
N, D = size(Y)
@@ -134,7 +134,7 @@ end;
end;
```

```julia
```{julia; eval=false}
# Standard GPs don't scale very well in n, so we use a small subsample for the purpose of this tutorial
n_data = 40
# number of features to use from dataset
@@ -143,12 +143,12 @@ n_features = 4
ndim = 4;
```

```julia
```{julia; eval=false}
ppca = pPCA(dat[1:n_data, 1:n_features])
chain_ppca = sample(ppca, NUTS{Turing.ReverseDiffAD{true}}(), 1000);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_ppca, :z))[:, 2], (n_features, n_data))
scatter(z_mean[1, :], z_mean[2, :]; group=labels[1:n_data], xlabel=L"z_1", ylabel=L"z_2")
@@ -161,12 +161,12 @@ using pPCA (see the pPCA tutorial).

Let's try the same with our linear kernel GPLVM model.

```julia
```{julia; eval=false}
gplvm_linear = GPLVM_linear(dat[1:n_data, 1:n_features], ndim)
chain_linear = sample(gplvm_linear, NUTS{Turing.ReverseDiffAD{true}}(), 500);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_linear, :Z))[:, 2], (n_features, n_data))
alpha_mean = mean(group(chain_linear, :α))[:, 2]
@@ -186,12 +186,12 @@ We can see that similar to the pPCA case, the linear kernel GPLVM fails to disti

Finally, we demonstrate that by changing the kernel to a non-linear function, we are able to separate the data again.

```julia
```{julia; eval=false}
gplvm = GPLVM(dat[1:n_data, 1:n_features], ndim)
chain_gplvm = sample(gplvm, NUTS{Turing.ReverseDiffAD{true}}(), 500);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_gplvm, :Z))[:, 2], (ndim, n_data))
alpha_mean = mean(group(chain_gplvm, :α))[:, 2]
@@ -206,7 +206,7 @@ scatter(
)
```

```julia; echo=false
```{julia; eval=false}
let
@assert abs(
mean(z_mean[alpha1, labels[1:n_data] .== "setosa"]) -
@@ -217,8 +217,12 @@ end

Now, the split between the two groups is visible again.

```julia, echo=false, skip="notebook", tangle=false
```{julia; eval=false}
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/13-seasonal-time-series/13_seasonal_time_series.jmd
@@ -334,3 +334,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/14-minituring/14_minituring.jmd
@@ -286,3 +286,7 @@ sample(turing_m(3.0), MH(ScalMat(2, 1.0)), 1_000_000)
```

As you can see, with our simple probabilistic programming language and custom samplers we get similar results as Turing.

```julia

```