From dd7768f4715a0f777a149ae9e0c7aeebc3e6eeb7 Mon Sep 17 00:00:00 2001
From: Markus Hauru
Date: Wed, 29 May 2024 10:32:06 +0100
Subject: [PATCH] Fix some typos and broken links

---
 tutorials/docs-00-getting-started/index.qmd    |  2 +-
 tutorials/docs-01-contributing-guide/index.qmd |  2 +-
 .../index.qmd                                  |  4 ++--
 tutorials/docs-12-using-turing-guide/index.qmd | 10 ++++++----
 4 files changed, 10 insertions(+), 8 deletions(-)

diff --git a/tutorials/docs-00-getting-started/index.qmd b/tutorials/docs-00-getting-started/index.qmd
index 930419cef..fbc99e95c 100644
--- a/tutorials/docs-00-getting-started/index.qmd
+++ b/tutorials/docs-00-getting-started/index.qmd
@@ -95,7 +95,7 @@ function sample_posterior(alpha, beta, mean, lambda, iterations)
     for i in 1:iterations
         sample_variance = rand(InverseGamma(alpha, beta), 1)
         sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
-        sanples = append!(samples, sample_x)
+        samples = append!(samples, sample_x)
     end
     return samples
 end
diff --git a/tutorials/docs-01-contributing-guide/index.qmd b/tutorials/docs-01-contributing-guide/index.qmd
index 56c64e85d..0821202d9 100755
--- a/tutorials/docs-01-contributing-guide/index.qmd
+++ b/tutorials/docs-01-contributing-guide/index.qmd
@@ -4,7 +4,7 @@ title: Contributing
 
 Turing is an open-source project. If you feel that you have relevant skills and are interested in contributing, please get in touch. You can contribute by opening issues on GitHub, implementing things yourself, and making a pull request. We would also appreciate example models written using Turing.
 
-Turing has a [style guide]({{< meta site-url >}}docs/tutorials/docs-02-contributing-style-guide/). Reviewing it before making a pull request is not strictly necessary, but you may be asked to change portions of your code to conform with the style guide before it is merged.
+Turing has a [style guide](#style-guide). Reviewing it before making a pull request is not strictly necessary, but you may be asked to change portions of your code to conform with the style guide before it is merged.
 
 ### What Can I Do?
 
diff --git a/tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd b/tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd
index 1613fa7ef..fa90d8f27 100755
--- a/tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd
+++ b/tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd
@@ -33,9 +33,9 @@ n_samples = 1000
 chn = sample(mod, alg, n_samples, progress=false)
 ```
 
-The function `sample` is part of the AbstractMCMC interface. As explained in the [interface guide](https://turinglang.org/dev/docs/for-developers/interface), building a a sampling method that can be used by `sample` consists in overloading the structs and functions in `AbstractMCMC`. The interface guide also gives a standalone example of their implementation, [`AdvancedMH.jl`]().
+The function `sample` is part of the AbstractMCMC interface. As explained in the [interface guide](https://turinglang.org/dev/docs/for-developers/interface), building a sampling method that can be used by `sample` consists in overloading the structs and functions in `AbstractMCMC`. The interface guide also gives a standalone example of their implementation, [`AdvancedMH.jl`]().
 
-Turing sampling methods (most of which are written [here](https://github.com/TuringLang/Turing.jl/tree/master/src/inference)) also implement `AbstractMCMC`. &#13;Turing defines a particular architecture for `AbstractMCMC` implementations, that enables working with models defined by the `@model` macro, and uses DynamicPPL as a backend. The goal of this page is to describe this architecture, and how you would go about implementing your own sampling method in Turing, using Importance Sampling as an example. I don't go into all the details: for instance, I don't address selectors or parallelism.
+Turing sampling methods (most of which are written [here](https://github.com/TuringLang/Turing.jl/tree/master/src/mcmc)) also implement `AbstractMCMC`. Turing defines a particular architecture for `AbstractMCMC` implementations, that enables working with models defined by the `@model` macro, and uses DynamicPPL as a backend. The goal of this page is to describe this architecture, and how you would go about implementing your own sampling method in Turing, using Importance Sampling as an example. I don't go into all the details: for instance, I don't address selectors or parallelism.
 
 First, we explain how Importance Sampling works in the abstract. Consider the model defined in the first code block. Mathematically, it can be written:
 
diff --git a/tutorials/docs-12-using-turing-guide/index.qmd b/tutorials/docs-12-using-turing-guide/index.qmd
index b23e5b7be..43c363628 100755
--- a/tutorials/docs-12-using-turing-guide/index.qmd
+++ b/tutorials/docs-12-using-turing-guide/index.qmd
@@ -340,15 +340,17 @@ For example, let `c` be a `Chain`:
 
 #### Variable Types and Type Parameters
 
-The element type of a vector (or matrix) of random variables should match the `eltype` of the its prior distribution, `<: Integer` for discrete distributions and `<: AbstractFloat` for continuous distributions. Moreover, if the continuous random variable is to be sampled using a Hamiltonian sampler, the vector's element type needs to either be:
+The element type of a vector (or matrix) of random variables should match the `eltype` of its prior distribution, `<: Integer` for discrete distributions and `<: AbstractFloat` for continuous distributions. Moreover, if the continuous random variable is to be sampled using a Hamiltonian sampler, the vector's element type needs to either be:
 
 1. `Real` to enable auto-differentiation through the model which uses special number types that are sub-types of `Real`, or
 
-2. Some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Float64) where {T}`. Similarly, when using a particle sampler, the Julia variable used should either be:
+2. Some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Float64) where {T}`.
 
-3. An `Array`, or
+Similarly, when using a particle sampler, the Julia variable used should either be:
 
-4. An instance of some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Vector{Float64}) where {T}`.
+1. An `Array`, or
+
+2. An instance of some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Vector{Float64}) where {T}`.
 
 ### Querying Probabilities from Model or Chain
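
As a supplementary illustration of the `::Type{T}` idiom touched by the last hunk, here is a minimal sketch in Julia. It is not part of the patch: the model name `gdemo_typed` and its data are made up for demonstration, but the pattern follows the guide text above, taking the element type of the random-variable container from a type parameter so that a Hamiltonian sampler's autodiff number types can flow through it.

```julia
using Turing

# Hypothetical model following the guide's type-parameter idiom: `p` holds
# continuous random variables, so its element type must accommodate the
# special number types used by autodiff-based Hamiltonian samplers.
# `T` defaults to Float64 for samplers that do not need autodiff.
@model function gdemo_typed(x, ::Type{T}=Float64) where {T}
    p = Vector{T}(undef, 2)
    p[1] ~ InverseGamma(2, 3)
    p[2] ~ InverseGamma(2, 3)
    s = p[1] + p[2]
    m ~ Normal(0, sqrt(s))
    for i in eachindex(x)
        x[i] ~ Normal(m, sqrt(s))
    end
end

# Because the container's element type is parametric, NUTS can differentiate
# through the model without element-type conversion errors.
chain = sample(gdemo_typed([1.5, 2.0]), NUTS(), 1000)
```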