Fix some typos and broken links
mhauru committed May 29, 2024
1 parent 5871974 commit dd7768f
Showing 4 changed files with 10 additions and 8 deletions.
2 changes: 1 addition & 1 deletion tutorials/docs-00-getting-started/index.qmd
@@ -95,7 +95,7 @@ function sample_posterior(alpha, beta, mean, lambda, iterations)
    for i in 1:iterations
        sample_variance = rand(InverseGamma(alpha, beta), 1)
        sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
-       sanples = append!(samples, sample_x)
+       samples = append!(samples, sample_x)
    end
    return samples
end
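For illustration, a minimal self-contained sketch of the function around this hunk, assuming the lines not shown initialize `samples` as an empty vector (the hunk itself only covers the loop):

```julia
using Distributions

# Sketch of the full function; the `samples = Float64[]` initialization is an
# assumption, since it lies outside the lines shown in the hunk above.
function sample_posterior(alpha, beta, mean, lambda, iterations)
    samples = Float64[]
    for i in 1:iterations
        # Draw a variance from the InverseGamma prior, then draw x given that variance.
        sample_variance = rand(InverseGamma(alpha, beta), 1)
        sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
        samples = append!(samples, sample_x)
    end
    return samples
end

# Hypothetical call with made-up parameter values:
# posterior_draws = sample_posterior(2.0, 3.0, 0.0, 1.0, 1000)
```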
2 changes: 1 addition & 1 deletion tutorials/docs-01-contributing-guide/index.qmd
@@ -4,7 +4,7 @@ title: Contributing

Turing is an open-source project. If you feel that you have relevant skills and are interested in contributing, please get in touch. You can contribute by opening issues on GitHub, implementing things yourself, and making a pull request. We would also appreciate example models written using Turing.

-Turing has a [style guide]({{< meta site-url >}}docs/tutorials/docs-02-contributing-style-guide/). Reviewing it before making a pull request is not strictly necessary, but you may be asked to change portions of your code to conform with the style guide before it is merged.
+Turing has a [style guide](#style-guide). Reviewing it before making a pull request is not strictly necessary, but you may be asked to change portions of your code to conform with the style guide before it is merged.

### What Can I Do?

@@ -33,9 +33,9 @@ n_samples = 1000
chn = sample(mod, alg, n_samples, progress=false)
```

-The function `sample` is part of the AbstractMCMC interface. As explained in the [interface guide](https://turinglang.org/dev/docs/for-developers/interface), building a a sampling method that can be used by `sample` consists in overloading the structs and functions in `AbstractMCMC`. The interface guide also gives a standalone example of their implementation, [`AdvancedMH.jl`]().
+The function `sample` is part of the AbstractMCMC interface. As explained in the [interface guide](https://turinglang.org/dev/docs/for-developers/interface), building a sampling method that can be used by `sample` consists in overloading the structs and functions in `AbstractMCMC`. The interface guide also gives a standalone example of their implementation, [`AdvancedMH.jl`]().

-Turing sampling methods (most of which are written [here](https://github.com/TuringLang/Turing.jl/tree/master/src/inference)) also implement `AbstractMCMC`. Turing defines a particular architecture for `AbstractMCMC` implementations, that enables working with models defined by the `@model` macro, and uses DynamicPPL as a backend. The goal of this page is to describe this architecture, and how you would go about implementing your own sampling method in Turing, using Importance Sampling as an example. I don't go into all the details: for instance, I don't address selectors or parallelism.
+Turing sampling methods (most of which are written [here](https://github.com/TuringLang/Turing.jl/tree/master/src/mcmc)) also implement `AbstractMCMC`. Turing defines a particular architecture for `AbstractMCMC` implementations, that enables working with models defined by the `@model` macro, and uses DynamicPPL as a backend. The goal of this page is to describe this architecture, and how you would go about implementing your own sampling method in Turing, using Importance Sampling as an example. I don't go into all the details: for instance, I don't address selectors or parallelism.

First, we explain how Importance Sampling works in the abstract. Consider the model defined in the first code block. Mathematically, it can be written:

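For illustration, a minimal sketch of what overloading the `AbstractMCMC` structs and functions can look like; `MyModel`, `MySampler`, and the random-walk state are made up, and only the two `step` overloads are part of the documented interface:

```julia
using AbstractMCMC, Random

# Illustrative placeholder types; the names are assumptions, not part of AbstractMCMC.
struct MyModel <: AbstractMCMC.AbstractModel end
struct MySampler <: AbstractMCMC.AbstractSampler end

# First step: no previous state exists yet, so return an initial (sample, state) pair.
function AbstractMCMC.step(rng::Random.AbstractRNG, model::MyModel, sampler::MySampler; kwargs...)
    x = randn(rng)
    return x, x
end

# Subsequent steps: take the previous state and return the next (sample, state) pair.
function AbstractMCMC.step(rng::Random.AbstractRNG, model::MyModel, sampler::MySampler, state; kwargs...)
    x = state + 0.1 * randn(rng)
    return x, x
end

# With these two methods defined, the generic sampling loop already works, e.g.
# draws = AbstractMCMC.sample(MyModel(), MySampler(), 1000; progress=false)
```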
10 changes: 6 additions & 4 deletions tutorials/docs-12-using-turing-guide/index.qmd
@@ -340,15 +340,17 @@ For example, let `c` be a `Chain`:

#### Variable Types and Type Parameters

-The element type of a vector (or matrix) of random variables should match the `eltype` of the its prior distribution, `<: Integer` for discrete distributions and `<: AbstractFloat` for continuous distributions. Moreover, if the continuous random variable is to be sampled using a Hamiltonian sampler, the vector's element type needs to either be:
+The element type of a vector (or matrix) of random variables should match the `eltype` of its prior distribution, `<: Integer` for discrete distributions and `<: AbstractFloat` for continuous distributions. Moreover, if the continuous random variable is to be sampled using a Hamiltonian sampler, the vector's element type needs to either be:

1. `Real` to enable auto-differentiation through the model which uses special number types that are sub-types of `Real`, or

-2. Some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Float64) where {T}`. Similarly, when using a particle sampler, the Julia variable used should either be:
+2. Some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Float64) where {T}`.

-3. An `Array`, or
+Similarly, when using a particle sampler, the Julia variable used should either be:

-4. An instance of some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Vector{Float64}) where {T}`.
+1. An `Array`, or

+2. An instance of some type parameter `T` defined in the model header using the type parameter syntax, e.g. `function gdemo(x, ::Type{T} = Vector{Float64}) where {T}`.

### Querying Probabilities from Model or Chain

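For illustration, a made-up model using the type-parameter pattern described above, so that a Hamiltonian sampler's automatic-differentiation number types can propagate through the vector; the model and parameter values are assumptions, not taken from the guide:

```julia
using Turing

# `::Type{T}=Float64` lets the element type of `x` follow the sampler:
# plain Float64 in ordinary evaluation, AD-compatible number types under HMC/NUTS.
@model function gdemo(N, ::Type{T}=Float64) where {T}
    x = Vector{T}(undef, N)
    s ~ InverseGamma(2, 3)
    for i in 1:N
        x[i] ~ Normal(0, sqrt(s))
    end
end

# Hypothetical usage: NUTS differentiates through the model, so `x` must not be
# hard-coded to Vector{Float64}.
# chain = sample(gdemo(10), NUTS(), 100)
```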
