Transfer all high level docs from Turing.jl (#375)
* Add all high level docs from Turing.jl

* Fix some permalinks.

* Converted Quickstart doc

* Converted contributing pages to jmd

* add project files

* Disable evaluating style examples

* Update and rename index.md to index.jmd

* Rename index.jmd to using-turing.jmd

* fix formatting for guide.md

* Fixed formatting.

* Update tutorials/for-developers-interface/interface.md

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

* Update tutorials/for-developers-abstractmcmc-turing/how_turing_implements_abstractmcmc.md

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

* Update tutorials/for-developers-abstractmcmc-turing/how_turing_implements_abstractmcmc.md

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

* update sample viz

* add dummy julia code block to enable jmd building

* Bugfix for negative variance.

* More conversions

* Added dependencies and changed directory names

* Fixed dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* Fixing dependencies

* other tutorials

* other tutorials

* other tutorials

* fix merge conflict with master

* some edits to advanced

* remove weave from deps

* improve compiler

* added julia formatting for a block which it was missing (#398)

Co-authored-by: Hong Ge <[email protected]>

* AbstractMCMC fixes, rename some files, and update manifests (#399)

* abstract-mcmc

* Rename files

* no empty cell + hyperlink to bishop (#400)

* add numeric prefix to docs

* triger rebuild

* add Distributions to deps

* minor fixes

* Minor edits

* Update tutorials/docs-12-using-turing-guide/using-turing-guide.jmd

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

* Update tutorials/docs-09-using-turing-advanced/advanced.jmd

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

* fix var name typo

* Don't eval `distributed` and `optim` code chunks

* Update using-turing-guide.jmd

* Remove hardcoded output.

---------

Co-authored-by: Colton Botta <[email protected]>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Tor Erlend Fjelde <[email protected]>
Co-authored-by: Will Tebbutt <[email protected]>
Co-authored-by: Xianda Sun <[email protected]>
6 people authored Jul 7, 2023
1 parent 0a8c7cb commit 0294372
Showing 62 changed files with 26,399 additions and 17 deletions.
Original file line number Diff line number Diff line change
@@ -191,3 +191,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/02-logistic-regression/02_logistic-regression.jmd
@@ -244,3 +244,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
@@ -198,3 +198,8 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```{julia; eval=false}


```
4 changes: 4 additions & 0 deletions tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd
@@ -190,3 +190,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/05-linear-regression/05_linear-regression.jmd
@@ -231,3 +231,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
@@ -279,3 +279,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
1 change: 0 additions & 1 deletion tutorials/07-poisson-regression/07_poisson-regression.jmd
@@ -221,4 +221,3 @@ As can be seen from the numeric values and the plots above, the standard deviati
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```
@@ -8,7 +8,7 @@ weave_options:

[Multinomial logistic regression](https://en.wikipedia.org/wiki/Multinomial_logistic_regression) is an extension of logistic regression. Logistic regression is used to model problems in which there are exactly two possible discrete outcomes. Multinomial logistic regression is used to model problems in which there are two or more possible discrete outcomes.

In our example, we'll be using the iris dataset. The goal of the iris multiclass problem is to predict the species of a flower given measurements (in centimeters) of sepal length and width and petal length and width. There are three possible species: Iris setosa, Iris versicolor, and Iris virginica.
In our example, we'll be using the iris dataset. The iris multiclass problem aims to predict the species of a flower given measurements (in centimeters) of sepal length and width and petal length and width. There are three possible species: Iris setosa, Iris versicolor, and Iris virginica.
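Concretely, multinomial logistic regression maps each feature vector to one score per class and normalizes the scores with softmax. A minimal sketch of that idea (the coefficient values below are made up for illustration, not the tutorial's fitted model):

```julia
# Softmax turns K real-valued class scores into probabilities summing to 1.
softmax(v) = exp.(v) ./ sum(exp.(v))

# Hypothetical coefficients: 3 classes x 2 features, plus per-class intercepts.
B = [0.5 -0.2; -0.1 0.3; 0.0 0.1]
b = [0.1, 0.0, -0.1]

x = [5.1, 3.5]            # e.g. sepal length and width in cm
p = softmax(B * x + b)    # probability of each of the three species
```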

To start, let's import all the libraries we'll need.

@@ -214,4 +214,4 @@ This tutorial has demonstrated how to use Turing to perform Bayesian multinomial
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```
```
30 changes: 30 additions & 0 deletions tutorials/08-multinomial-logistic-regression/Manifest.toml
@@ -555,6 +555,12 @@ git-tree-sha1 = "129acf094d168394e80ee1dc4bc06ec835e510a3"
uuid = "2e76f6c2-a576-52d4-95c1-20adfe4de566"
version = "2.8.1+1"

[[Highlights]]
deps = ["DocStringExtensions", "InteractiveUtils", "REPL"]
git-tree-sha1 = "0341077e8a6b9fc1c2ea5edc1e93a956d2aec0c7"
uuid = "eafb193a-b7ab-5a9e-9068-77385905fa72"
version = "0.5.2"

[[HypergeometricFunctions]]
deps = ["DualNumbers", "LinearAlgebra", "OpenLibm_jll", "SpecialFunctions"]
git-tree-sha1 = "0ec02c648befc2f94156eaef13b0f38106212f3f"
@@ -942,6 +948,12 @@ git-tree-sha1 = "68bf5103e002c44adfd71fea6bd770b3f0586843"
uuid = "6f286f6a-111f-5878-ab1e-185364afe411"
version = "0.10.2"

[[Mustache]]
deps = ["Printf", "Tables"]
git-tree-sha1 = "821e918c170ead5298ff84bffee41dd28929a681"
uuid = "ffc61752-8dc7-55ee-8c37-f3e9cdd09e70"
version = "1.0.17"

[[NNlib]]
deps = ["Adapt", "Atomix", "ChainRulesCore", "GPUArraysCore", "KernelAbstractions", "LinearAlgebra", "Pkg", "Random", "Requires", "Statistics"]
git-tree-sha1 = "72240e3f5ca031937bd536182cb2c031da5f46dd"
@@ -1363,6 +1375,12 @@ git-tree-sha1 = "14ef622cf28b05e38f8af1de57bc9142b03fbfe3"
uuid = "f3b207a7-027a-5e70-b257-86293d7955fd"
version = "0.15.5"

[[StringEncodings]]
deps = ["Libiconv_jll"]
git-tree-sha1 = "33c0da881af3248dafefb939a21694b97cfece76"
uuid = "69024149-9ee7-55f6-a4c4-859efe599b68"
version = "0.3.6"

[[StringManipulation]]
git-tree-sha1 = "46da2434b41f41ac3594ee9816ce5541c6096123"
uuid = "892a3eda-7b42-436c-8928-eab12a02cf0e"
@@ -1536,6 +1554,12 @@ git-tree-sha1 = "b1be2855ed9ed8eac54e5caff2afcdb442d52c23"
uuid = "ea10d353-3f73-51f8-a26c-33c1cb351aa5"
version = "1.4.2"

[[Weave]]
deps = ["Base64", "Dates", "Highlights", "JSON", "Markdown", "Mustache", "Pkg", "Printf", "REPL", "RelocatableFolders", "Requires", "Serialization", "YAML"]
git-tree-sha1 = "092217eb5443926d200ae9325f103906efbb68b1"
uuid = "44d3d7a6-8a23-5bf8-98c5-b353f8df5ec9"
version = "0.10.12"

[[Widgets]]
deps = ["Colors", "Dates", "Observables", "OrderedCollections"]
git-tree-sha1 = "fcdae142c1cfc7d89de2d11e08721d0f2f86c98a"
@@ -1691,6 +1715,12 @@ git-tree-sha1 = "79c31e7844f6ecf779705fbc12146eb190b7d845"
uuid = "c5fb5394-a638-5e4d-96e5-b29de1b5cf10"
version = "1.4.0+3"

[[YAML]]
deps = ["Base64", "Dates", "Printf", "StringEncodings"]
git-tree-sha1 = "e6330e4b731a6af7959673621e91645eb1356884"
uuid = "ddb6d928-2868-570f-bddf-ab3f9cf99eb6"
version = "0.4.9"

[[Zlib_jll]]
deps = ["Libdl"]
uuid = "83775a58-1f1d-513f-b197-d71354ab007a"
1 change: 1 addition & 0 deletions tutorials/08-multinomial-logistic-regression/Project.toml
@@ -7,3 +7,4 @@ RDatasets = "ce6b1742-4840-55fa-b093-852dadbb1d8b"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
StatsPlots = "f3b207a7-027a-5e70-b257-86293d7955fd"
Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"
Weave = "44d3d7a6-8a23-5bf8-98c5-b353f8df5ec9"
@@ -868,3 +868,8 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```{julia; eval=false}


```
@@ -447,3 +447,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/11-probabilistic-pca/11_probabilistic-pca.jmd
@@ -375,3 +375,7 @@ It can also be thought of as a matrix factorisation method, in which $\mathbf{X}=(\mat
[^3]: Gareth M. James, Daniela Witten, Trevor Hastie, Robert Tibshirani, *An Introduction to Statistical Learning*, Springer, 2013.
[^4]: David Wipf, Srikantan Nagarajan, *A New View of Automatic Relevance Determination*, NIPS 2007.
[^5]: Christopher Bishop, *Pattern Recognition and Machine Learning*, Springer, 2006.

```julia

```
32 changes: 18 additions & 14 deletions tutorials/12-gaussian-process/12_gaussian-process.jmd
@@ -18,7 +18,7 @@ Importantly, it provides the advantage that the linear mappings from the embedde

Let's start by loading some dependencies.

```julia
```{julia; eval=false}
using Turing
using AbstractGPs
using FillArrays
@@ -41,7 +41,7 @@ although this is an active area of research.
We will briefly touch on some ways to speed things up at the end of this tutorial.
We transform the original data with non-linear operations in order to demonstrate the power of GPs to work on non-linear relationships, while keeping the problem reasonably small.

```julia
```{julia; eval=false}
data = dataset("datasets", "iris")
species = data[!, "Species"]
index = shuffle(1:150)
@@ -66,7 +66,7 @@ Indeed, pPCA is basically equivalent to running the GPLVM model with an automati

First, we re-introduce the pPCA model (see the tutorial on pPCA for details)

```julia
```{julia; eval=false}
@model function pPCA(x)
# Dimensionality of the problem.
N, D = size(x)
@@ -85,15 +85,15 @@ end;
We define two different kernels, a simple linear kernel with an Automatic Relevance Determination transform and a
squared exponential kernel.

```julia
```{julia; eval=false}
linear_kernel(α) = LinearKernel() ∘ ARDTransform(α)
sekernel(α, σ) = σ * SqExponentialKernel() ∘ ARDTransform(α);
```
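As a quick sanity check, these composed kernels can be evaluated directly on pairs of input vectors. A sketch, assuming KernelFunctions.jl (which AbstractGPs re-exports) and made-up hyperparameter values:

```julia
using KernelFunctions  # provides LinearKernel, SqExponentialKernel, ARDTransform

linear_kernel(α) = LinearKernel() ∘ ARDTransform(α)
sekernel(α, σ) = σ * SqExponentialKernel() ∘ ARDTransform(α)

x, y = [1.0, 2.0], [0.5, 1.5]
k_lin = linear_kernel([0.3, 0.7])   # per-dimension ARD length scales
k_se = sekernel([0.3, 0.7], 2.0)    # ARD plus an output variance σ

k_lin(x, y)  # kernel value after ARD scaling of the inputs
k_se(x, y)
```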

And here is the GPLVM model.
We create separate models for the two types of kernel.

```julia
```{julia; eval=false}
@model function GPLVM_linear(Y, K)
# Dimensionality of the problem.
N, D = size(Y)
@@ -134,7 +134,7 @@ end;
end;
```

```julia
```{julia; eval=false}
# Standard GPs don't scale very well in n, so we use a small subsample for the purpose of this tutorial
n_data = 40
# number of features to use from dataset
@@ -143,12 +143,12 @@ n_features = 4
ndim = 4;
```

```julia
```{julia; eval=false}
ppca = pPCA(dat[1:n_data, 1:n_features])
chain_ppca = sample(ppca, NUTS{Turing.ReverseDiffAD{true}}(), 1000);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_ppca, :z))[:, 2], (n_features, n_data))
scatter(z_mean[1, :], z_mean[2, :]; group=labels[1:n_data], xlabel=L"z_1", ylabel=L"z_2")
@@ -161,12 +161,12 @@ using pPCA (see the pPCA tutorial).

Let's try the same with our linear kernel GPLVM model.

```julia
```{julia; eval=false}
gplvm_linear = GPLVM_linear(dat[1:n_data, 1:n_features], ndim)
chain_linear = sample(gplvm_linear, NUTS{Turing.ReverseDiffAD{true}}(), 500);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_linear, :Z))[:, 2], (n_features, n_data))
alpha_mean = mean(group(chain_linear, :α))[:, 2]
@@ -186,12 +186,12 @@ We can see that similar to the pPCA case, the linear kernel GPLVM fails to disti

Finally, we demonstrate that by changing the kernel to a non-linear function, we are able to separate the data again.

```julia
```{julia; eval=false}
gplvm = GPLVM(dat[1:n_data, 1:n_features], ndim)
chain_gplvm = sample(gplvm, NUTS{Turing.ReverseDiffAD{true}}(), 500);
```

```julia
```{julia; eval=false}
# we extract the posterior mean estimates of the parameters from the chain
z_mean = reshape(mean(group(chain_gplvm, :Z))[:, 2], (ndim, n_data))
alpha_mean = mean(group(chain_gplvm, :α))[:, 2]
@@ -206,7 +206,7 @@ scatter(
)
```

```julia; echo=false
```{julia; eval=false}
let
@assert abs(
mean(z_mean[alpha1, labels[1:n_data] .== "setosa"]) -
@@ -217,8 +217,12 @@ end

Now, the split between the two groups is visible again.

```julia, echo=false, skip="notebook", tangle=false
```{julia; eval=false}
if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/13-seasonal-time-series/13_seasonal_time_series.jmd
@@ -334,3 +334,7 @@ if isdefined(Main, :TuringTutorials)
Main.TuringTutorials.tutorial_footer(WEAVE_ARGS[:folder], WEAVE_ARGS[:file])
end
```

```julia

```
4 changes: 4 additions & 0 deletions tutorials/14-minituring/14_minituring.jmd
@@ -286,3 +286,7 @@ sample(turing_m(3.0), MH(ScalMat(2, 1.0)), 1_000_000)
```

As you can see, with our simple probabilistic programming language and custom samplers we get similar results as Turing.
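The core of such a custom sampler can be sketched as a random-walk Metropolis-Hastings step. The target `logp` and proposal scale below are illustrative assumptions, not minituring's actual implementation:

```julia
# One random-walk MH step: propose with symmetric Gaussian noise and
# accept with probability min(1, exp(logp(θ′) - logp(θ))).
function mh_step(logp, θ, σ)
    θ′ = θ .+ σ .* randn(length(θ))
    log(rand()) < logp(θ′) - logp(θ) ? θ′ : θ
end

# Usage: draw from a 1-D standard normal target.
logp(θ) = -sum(abs2, θ) / 2
θ = [0.0]
samples = [θ = mh_step(logp, θ, 0.5) for _ in 1:10_000]
```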

```julia

```