diff --git a/articles/web_only/basic-intro.html b/articles/web_only/basic-intro.html
index 3deea88f..ac617e1a 100644
--- a/articles/web_only/basic-intro.html
+++ b/articles/web_only/basic-intro.html
@@ -409,8 +409,8 @@

Model diagnostics
 #>
 #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1).
 #> Chain 1:
-#> Chain 1: Gradient evaluation took 0.01007 seconds
-#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 100.7 seconds.
+#> Chain 1: Gradient evaluation took 0.010102 seconds
+#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 101.02 seconds.
 #> Chain 1: Adjust your expectations accordingly!
 #> Chain 1:
 #> Chain 1:
@@ -435,9 +435,9 @@

Model diagnostics
 #> Chain 1: Iteration: 100 / 101 [ 99%] (Warmup)
 #> Chain 1: Iteration: 101 / 101 [100%] (Sampling)
 #> Chain 1:
-#> Chain 1: Elapsed Time: 26.615 seconds (Warm-up)
-#> Chain 1: 0.197 seconds (Sampling)
-#> Chain 1: 26.812 seconds (Total)
+#> Chain 1: Elapsed Time: 26.933 seconds (Warm-up)
+#> Chain 1: 0.198 seconds (Sampling)
+#> Chain 1: 27.131 seconds (Total)
 #> Chain 1:
 AIC(m4)
-#> [1] 12534.29
+#> [1] 12534.33

To plot these, we make a data frame that contains all combinations of the time-varying covariate and time. This is easily created using expand.grid() or tidyr::expand_grid().
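For example, a minimal sketch of that step (this mirrors the nd construction used later in this diff for the time-varying depth effect; nothing here is new to the vignette):

# all combinations of scaled depth and survey year for prediction
nd <- expand.grid(
  depth_scaled = seq(min(pcod$depth_scaled) + 0.2,
    max(pcod$depth_scaled) - 0.2,
    length.out = 50
  ),
  year = unique(pcod$year) # all years
)
nd$depth_scaled2 <- nd$depth_scaled^2 # the model also includes the squared term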

@@ -710,9 +710,7 @@

Time-varying effects
 nd$depth_scaled2 <- nd$depth_scaled^2
 p <- predict(m4, newdata = nd, se_fit = TRUE, re_form = NA)
-#> Warning in sqrt(diag(cov)): NaNs produced
-
-
+
 ggplot(p, aes(depth_scaled, exp(est),
   ymin = exp(est - 1.96 * est_se),
   ymax = exp(est + 1.96 * est_se),
diff --git a/articles/web_only/basic-intro_files/figure-html/plot-cv-1.png b/articles/web_only/basic-intro_files/figure-html/plot-cv-1.png
index 91afc737..0f6345c0 100644
Binary files a/articles/web_only/basic-intro_files/figure-html/plot-cv-1.png and b/articles/web_only/basic-intro_files/figure-html/plot-cv-1.png differ
diff --git a/articles/web_only/basic-intro_files/figure-html/residuals-mcmc-1.png b/articles/web_only/basic-intro_files/figure-html/residuals-mcmc-1.png
index e494d007..84372761 100644
Binary files a/articles/web_only/basic-intro_files/figure-html/residuals-mcmc-1.png and b/articles/web_only/basic-intro_files/figure-html/residuals-mcmc-1.png differ
diff --git a/articles/web_only/basic-intro_files/figure-html/tv-depth-eff-1.png b/articles/web_only/basic-intro_files/figure-html/tv-depth-eff-1.png
index 9f240044..6e199431 100644
Binary files a/articles/web_only/basic-intro_files/figure-html/tv-depth-eff-1.png and b/articles/web_only/basic-intro_files/figure-html/tv-depth-eff-1.png differ
diff --git a/articles/web_only/bayesian.html b/articles/web_only/bayesian.html
index b6def578..ce7f586e 100644
--- a/articles/web_only/bayesian.html
+++ b/articles/web_only/bayesian.html
@@ -396,7 +396,7 @@ 

Passing the model to tmbstan
 #> omega_s[85] -0.29 0.01 0.21 -0.68 -0.44 -0.30 -0.15 0.11 1111 1.00
 #> lp__ 136.06 0.93 9.26 118.11 129.87 136.38 142.46 152.88 100 1.02
 #>
-#> Samples were drawn using NUTS(diag_e) at Sun May 26 03:41:09 2024.
+#> Samples were drawn using NUTS(diag_e) at Sun May 26 04:17:07 2024.
 #> For each parameter, n_eff is a crude measure of effective sample size,
 #> and Rhat is the potential scale reduction factor on split chains (at
 #> convergence, Rhat=1).

diff --git a/articles/web_only/bayesian_files/figure-html/unnamed-chunk-9-1.png b/articles/web_only/bayesian_files/figure-html/unnamed-chunk-9-1.png
index e8b031ef..97ec2ec9 100644
Binary files a/articles/web_only/bayesian_files/figure-html/unnamed-chunk-9-1.png and b/articles/web_only/bayesian_files/figure-html/unnamed-chunk-9-1.png differ
diff --git a/articles/web_only/delta-models.html b/articles/web_only/delta-models.html
index 86f4739f..c5203362 100644
--- a/articles/web_only/delta-models.html
+++ b/articles/web_only/delta-models.html
@@ -521,8 +521,8 @@

 #>
 #> Dispersion parameter: 0.94
 #> Matérn range: 0.01
-#> Spatial SD: 727.28
-#> Spatiotemporal IID SD: 2064.88
+#> Spatial SD: 727.31
+#> Spatiotemporal IID SD: 2064.97
 #> ML criterion at convergence: 5102.136
 #>
 #> See ?tidy.sdmTMB to extract these values as a data frame.
diff --git a/articles/web_only/delta-models_files/figure-html/cv-1.png b/articles/web_only/delta-models_files/figure-html/cv-1.png
index cb931227..0fbc16af 100644
Binary files a/articles/web_only/delta-models_files/figure-html/cv-1.png and b/articles/web_only/delta-models_files/figure-html/cv-1.png differ
diff --git a/articles/web_only/delta-models_files/figure-html/unnamed-chunk-19-1.png b/articles/web_only/delta-models_files/figure-html/unnamed-chunk-19-1.png
index 11833a24..1d08c9ec 100644
Binary files a/articles/web_only/delta-models_files/figure-html/unnamed-chunk-19-1.png and b/articles/web_only/delta-models_files/figure-html/unnamed-chunk-19-1.png differ
diff --git a/articles/web_only/delta-models_files/figure-html/unnamed-chunk-20-1.png b/articles/web_only/delta-models_files/figure-html/unnamed-chunk-20-1.png
index 21f3ee27..40ee8ed5 100644
Binary files a/articles/web_only/delta-models_files/figure-html/unnamed-chunk-20-1.png and b/articles/web_only/delta-models_files/figure-html/unnamed-chunk-20-1.png differ
diff --git a/articles/web_only/residual-checking.html b/articles/web_only/residual-checking.html
index a29b7575..ba359df3 100644
--- a/articles/web_only/residual-checking.html
+++ b/articles/web_only/residual-checking.html
@@ -231,8 +231,8 @@

MCMC-based randomized-quantile
 #>
 #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1).
 #> Chain 1:
-#> Chain 1: Gradient evaluation took 0.000965 seconds
-#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 9.65 seconds.
+#> Chain 1: Gradient evaluation took 0.000971 seconds
+#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 9.71 seconds.
 #> Chain 1: Adjust your expectations accordingly!
 #> Chain 1:
 #> Chain 1:
@@ -249,9 +249,9 @@

MCMC-based randomized-quantile
 #> Chain 1: Iteration: 720 / 800 [ 90%] (Sampling)
 #> Chain 1: Iteration: 800 / 800 [100%] (Sampling)
 #> Chain 1:
-#> Chain 1: Elapsed Time: 6.699 seconds (Warm-up)
-#> Chain 1: 4.761 seconds (Sampling)
-#> Chain 1: 11.46 seconds (Total)
+#> Chain 1: Elapsed Time: 6.755 seconds (Warm-up)
+#> Chain 1: 4.818 seconds (Sampling)
+#> Chain 1: 11.573 seconds (Total)
 #> Chain 1:
 mcmc_res <- residuals(fit_nb2, type = "mle-mcmc", mcmc_samples = samps)
@@ -507,8 +507,8 @@ 

The need for one-sample residuals
 #>
 #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1).
 #> Chain 1:
-#> Chain 1: Gradient evaluation took 0.002334 seconds
-#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 23.34 seconds.
+#> Chain 1: Gradient evaluation took 0.002067 seconds
+#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 20.67 seconds.
 #> Chain 1: Adjust your expectations accordingly!
 #> Chain 1:
 #> Chain 1:
@@ -525,9 +525,9 @@

The need for one-sample residuals
 #> Chain 1: Iteration: 360 / 400 [ 90%] (Sampling)
 #> Chain 1: Iteration: 400 / 400 [100%] (Sampling)
 #> Chain 1:
-#> Chain 1: Elapsed Time: 30.393 seconds (Warm-up)
-#> Chain 1: 29.823 seconds (Sampling)
-#> Chain 1: 60.216 seconds (Total)
+#> Chain 1: Elapsed Time: 30.753 seconds (Warm-up)
+#> Chain 1: 30.123 seconds (Sampling)
+#> Chain 1: 60.876 seconds (Total)
 #> Chain 1:
 #> Warning: The largest R-hat is 1.08, indicating chains have not mixed.
 #> Running the chains for more iterations may help. See
diff --git a/pkgdown.yml b/pkgdown.yml
index fcdd5a29..44b2eb36 100644
--- a/pkgdown.yml
+++ b/pkgdown.yml
@@ -15,7 +15,7 @@ articles:
   spatial-trend-models: web_only/spatial-trend-models.html
   threshold-models: web_only/threshold-models.html
   visreg: web_only/visreg.html
-last_built: 2024-05-26T03:36Z
+last_built: 2024-05-26T04:12Z
 urls:
   reference: https://pbs-assess.github.io/sdmTMB/reference
   article: https://pbs-assess.github.io/sdmTMB/articles
diff --git a/reference/get_index_sims-1.png b/reference/get_index_sims-1.png
index efc476c4..bd7be7ae 100644
Binary files a/reference/get_index_sims-1.png and b/reference/get_index_sims-1.png differ
diff --git a/reference/sdmTMB.html b/reference/sdmTMB.html
index 71ad9351..d2f3fa46 100644
--- a/reference/sdmTMB.html
+++ b/reference/sdmTMB.html
@@ -1059,8 +1059,8 @@

Examples#> Dispersion parameter: 12.80 #> Tweedie p: 1.56 #> Matérn range: 0.01 -#> Spatial SD: 2573.00 -#> Spatiotemporal IID SD: 2087.99 +#> Spatial SD: 2571.71 +#> Spatiotemporal IID SD: 2086.94 #> ML criterion at convergence: 2911.371 #> #> See ?tidy.sdmTMB to extract these values as a data frame. diff --git a/search.json b/search.json index 9496bb2e..d9516da5 100644 --- a/search.json +++ b/search.json @@ -1 +1 @@ -[{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"other-vignettes-available","dir":"Articles","previous_headings":"","what":"Other vignettes available","title":"sdmTMB model description","text":"vignette viewed CRAN, note many vignettes describing use sdmTMB available documentation site Articles.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"notation-conventions","dir":"Articles","previous_headings":"","what":"Notation conventions","title":"sdmTMB model description","text":"appendix uses following notation conventions, generally follow guidance Edwards & Auger-Méthé (2019): Greek symbols parameters, Latin/Roman alphabet data (except \\(\\boldsymbol{Q}\\) \\(\\boldsymbol{H}\\), used convention), bold symbols vectors matrices (e.g., \\(\\boldsymbol{\\omega}\\) vector \\(\\omega_{\\boldsymbol{s}}\\) value \\(\\boldsymbol{\\omega}\\) point space \\(\\boldsymbol{s}\\)), \\(\\phi\\) distribution dispersion parameters consistency code, \\(\\mathbb{E}[y]\\) define expected value (mean) variable \\(y\\), \\(\\mathrm{Var}[y]\\) define expected variance variable \\(y\\), \\(^*\\) superscript represents interpolated projected values opposed values knot locations (e.g., \\(\\boldsymbol{\\omega}\\) vs. \\(\\boldsymbol{\\omega}^*\\)), possible, notation chosen match VAST (Thorson 2019) maintain consistency (e.g., \\(\\boldsymbol{\\omega}\\) spatial fields \\(\\boldsymbol{\\epsilon}_t\\) spatiotemporal fields). include tables major indices (Table 1) symbols (Table 2). 
Table 1: Subscript notation Table 2: Symbol notation, code representation (model output model template code), descriptions.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"sdmtmb-model-structure","dir":"Articles","previous_headings":"","what":"sdmTMB model structure","title":"sdmTMB model description","text":"complete sdmTMB model can written \\[ \\begin{aligned} \\mathbb{E}[y_{\\boldsymbol{s},t}] &= \\mu_{\\boldsymbol{s},t},\\\\ \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta} + O_{\\boldsymbol{s},t} + \\alpha_g + \\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t} \\boldsymbol{\\gamma_t} + \\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s},t} \\zeta_{\\boldsymbol{s}} + \\omega_{\\boldsymbol{s}} + \\epsilon_{\\boldsymbol{s},t} \\right), \\end{aligned} \\] \\(y_{\\boldsymbol{s},t}\\) represents response data point \\(\\boldsymbol{s}\\) time \\(t\\); \\(\\mu\\) represents mean; \\(f\\) represents link function (e.g., log logit) \\(f^{-1}\\) represents inverse; \\(\\boldsymbol{X}^{\\mathrm{main}}\\), \\(\\boldsymbol{X}^{\\mathrm{tvc}}\\), \\(\\boldsymbol{X}^{\\mathrm{svc}}\\) represent design matrices (superscript identifiers ‘main’ = main effects, ‘tvc’ = time varying coefficients, ‘svc’ = spatially varying coefficients); \\(\\boldsymbol{\\beta}\\) represents vector fixed-effect coefficients; \\(O_{\\boldsymbol{s},t}\\) represents offset: covariate (usually log transformed) coefficient fixed one; \\(\\alpha_{g}\\) represents random intercepts group \\(g\\), \\(\\alpha_{g}\\sim \\mathrm{N}(0,\\sigma^2_\\alpha)\\); \\(\\gamma_{t}\\) represents time-varying coefficients (random walk), \\(\\gamma_{t} \\sim \\mathrm{N}(\\gamma_{t-1},\\sigma^2_\\gamma)\\); \\(\\zeta_{\\boldsymbol{s}}\\) represents spatially varying coefficients (random field), \\(\\zeta_{\\boldsymbol{s}} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_\\zeta)\\); \\(\\omega_{\\boldsymbol{s}}\\) represents spatial component (random field), \\(\\omega_{\\boldsymbol{s}} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_\\omega)\\); \\(\\epsilon_{\\boldsymbol{s},t}\\) represents spatiotemporal component (random field), \\(\\epsilon_{\\boldsymbol{s},t} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_{\\epsilon})\\). single sdmTMB model rarely, ever, contain components. Next, split model describe various parts detail using ‘\\(\\ldots\\)’ represent optional components.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"main-effects","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Main effects","title":"sdmTMB model description","text":"\\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta} \\ldots \\right) \\end{aligned} \\] Within sdmTMB(), \\(\\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta}\\) defined formula argument represents main-effect model matrix corresponding vector coefficients. 
main effect formula can contain optional penalized smoothers non-linear functions defined .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"smoothers","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Smoothers","title":"sdmTMB model description","text":"Smoothers sdmTMB implemented formula syntax familiar mgcv (Wood 2017) users fitting GAMs (generalized additive models). Smooths implemented formula using + s(x), implements smooth mgcv::s(). Within smooths, syntax commonly used mgcv::s() can applied, e.g. 2-dimensional smooths may constructed + s(x, y); smooths can specific various factor levels, + s(x, = group); smooths can vary according continuous variable, + s(x, = x2); basis function dimensions may specified, e.g. + s(x, k = 4) (see ?mgcv::choose.k); various types splines may constructed cyclic splines model seasonality, e.g. + s(month, bs = \"cc\", k = 12). mgcv can fit unpenalized (e.g., B-splines) penalized splines (P-splines), sdmTMB implements penalized splines. penalized splines constructed sdmTMB using function mgcv::smooth2random(), transforms splines random effects (associated design matrices) estimable mixed-effects modelling framework. approach implemented R packages gamm4 (Wood & Scheipl 2020) brms (Bürkner 2017).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"linear-break-point-threshold-models","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Linear break-point threshold models","title":"sdmTMB model description","text":"linear break-point “hockey stick” model can used describe threshold asymptotic responses. function consists two pieces, \\(x < b_{1}\\), \\(s(x) = x \\cdot b_{0}\\), \\(x > b_{1}\\), \\(s(x) = b_{1} \\cdot b_{0}\\). cases, \\(b_{0}\\) represents slope function threshold, product \\(b_{1} \\cdot b_{0}\\) represents value asymptote. constraints placed parameters \\(b_{0}\\) \\(b_{1}\\). models can fit including + breakpt(x) model formula, x covariate. formula can contain single break-point covariate.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"logistic-threshold-models","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Logistic threshold models","title":"sdmTMB model description","text":"Models logistic threshold relationships predictor response can fit form \\[ s(x)=\\tau + \\psi\\ { \\left[ 1+{ e }^{ -\\ln \\left(19\\right) \\cdot \\left( x-s50 \\right) / \\left(s95 - s50 \\right) } \\right] }^{-1}, \\] \\(s\\) represents logistic function, \\(\\psi\\) scaling parameter (controlling height y-axis response; unconstrained), \\(\\tau\\) intercept, \\(s50\\) parameter controlling point function reaches 50% maximum (\\(\\psi\\)), \\(s95\\) parameter controlling point function reaches 95% maximum. parameter \\(s50\\) unconstrained \\(s95\\) constrained larger \\(s50\\). models can fit including + logistic(x) model formula, x covariate. formula can contain single logistic covariate.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatial-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatial random fields","title":"sdmTMB model description","text":"Spatial random fields, \\(\\omega_{\\boldsymbol{s}}\\), included spatial = '' (TRUE) omitted spatial = '' (FALSE). 
\\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\omega_{\\boldsymbol{s}} + \\ldots \\right),\\\\ \\boldsymbol{\\omega} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_\\omega \\right),\\\\ \\end{aligned} \\] marginal standard deviation \\(\\boldsymbol{\\omega}\\) indicated Spatial SD printed model output sigma_O output sdmTMB::tidy(fit, \"ran_pars\"). ‘O’ ‘omega’ (\\(\\omega\\)). Internally, random fields follow Gaussian Markov random field (GMRF) \\[ \\boldsymbol{\\omega} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\omega^2 \\boldsymbol{Q}^{-1}_\\omega\\right), \\] \\(\\boldsymbol{Q}_\\omega\\) sparse precision matrix \\(\\sigma_\\omega^2\\) marginal variance.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatiotemporal random fields","title":"sdmTMB model description","text":"Spatiotemporal random fields included default multiple time elements (time argument NULL) can set IID (independent identically distributed, 'iid'; default), AR(1) ('ar1'), random walk ('rw'), ('') via spatiotemporal argument. text values case insensitive. Spatiotemporal random fields represented \\(\\boldsymbol{\\epsilon}_t\\) within sdmTMB. chosen match representation VAST (Thorson 2019). marginal standard deviation \\(\\boldsymbol{\\epsilon}_t\\) indicated Spatiotemporal SD printed model output sigma_E output sdmTMB::tidy(fit, \"ran_pars\"). ‘E’ ‘epsilon’ (\\(\\epsilon\\)).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"iid-spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"IID spatiotemporal random fields","title":"sdmTMB model description","text":"IID spatiotemporal random fields (spatiotemporal = 'iid') can represented \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\epsilon_{\\boldsymbol{s},t} + \\ldots \\right),\\\\ \\boldsymbol{\\epsilon_{t}} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right). \\end{aligned} \\] \\(\\epsilon_{\\boldsymbol{s},t}\\) represent random field deviations point \\(\\boldsymbol{s}\\) time \\(t\\). random fields assumed independent across time steps. Similarly spatial random fields, spatiotemporal random fields (including versions described ) parameterized internally sparse precision matrix (\\(\\boldsymbol{Q}_\\epsilon\\)) \\[ \\boldsymbol{\\epsilon_{t}} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\epsilon^2 \\boldsymbol{Q}^{-1}_\\epsilon\\right). \\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"ar1-spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"AR(1) spatiotemporal random fields","title":"sdmTMB model description","text":"First-order auto regressive, AR(1), spatiotemporal random fields (spatiotemporal = 'ar1') add parameter defining correlation random field deviations one time step next. 
defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\delta_{\\boldsymbol{s},t} \\ldots \\right),\\\\ \\boldsymbol{\\delta}_{t=1} &\\sim \\operatorname{MVNormal} (\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon}),\\\\ \\boldsymbol{\\delta}_{t>1} &= \\rho \\boldsymbol{\\delta}_{t-1} + \\sqrt{1 - \\rho^2} \\boldsymbol{\\epsilon_{t}}, \\: \\boldsymbol{\\epsilon_{t}} \\sim \\operatorname{MVNormal} \\left(\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right), \\end{aligned} \\] \\(\\rho\\) correlation subsequent spatiotemporal random fields. \\(\\rho \\boldsymbol{\\delta}_{t-1} + \\sqrt{1 - \\rho^2}\\) term scales spatiotemporal variance correlation represents steady-state marginal variance. correlation \\(\\rho\\) allows mean-reverting spatiotemporal fields, constrained \\(-1 < \\rho < 1\\). Internally, parameter estimated ar1_phi, unconstrained. parameter ar1_phi transformed \\(\\rho\\) \\(\\rho = 2 \\left( \\mathrm{logit}^{-1}(\\texttt{ar1\\_phi}) - 1 \\right)\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"random-walk-spatiotemporal-random-fields-rw","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"Random walk spatiotemporal random fields (RW)","title":"sdmTMB model description","text":"Random walk spatiotemporal random fields (spatiotemporal = 'rw') represent model difference spatiotemporal deviations one time step next IID. defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\delta_{\\boldsymbol{s},t} + \\ldots \\right),\\\\ \\boldsymbol{\\delta}_{t=1} &\\sim \\operatorname{MVNormal} (\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon}),\\\\ \\boldsymbol{\\delta}_{t>1} &= \\boldsymbol{\\delta}_{t-1} + \\boldsymbol{\\epsilon_{t-1}}, \\: \\boldsymbol{\\epsilon_{t-1}} \\sim \\operatorname{MVNormal} \\left(\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right), \\end{aligned} \\] distribution spatiotemporal field initial time step AR(1) model, absence \\(\\rho\\) parameter allows spatiotemporal field non-stationary time. Note , contrast AR(1) parametrization, variance longer steady-state marginal variance.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"time-varying-regression-parameters","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Time-varying regression parameters","title":"sdmTMB model description","text":"Parameters can modelled time-varying according random walk first-order autoregressive, AR(1), process. time-series model defined time_varying_type. types: \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t} \\boldsymbol{\\gamma_{t}} + \\ldots \\right), \\end{aligned} \\] \\(\\boldsymbol{\\gamma_t}\\) optional vector time-varying regression parameters \\(\\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t}\\) corresponding model matrix covariate values. defined via time_varying argument, assuming time argument also supplied column name. time_varying takes one-sided formula. ~ 1 implies time-varying intercept. time_varying_type = 'rw', first time step estimated independently: \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Uniform} \\left(-\\infty, \\infty \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\gamma_{t-1}, \\sigma^2_{\\gamma} \\right). \\end{aligned} \\] case, first time-step value given implicit uniform prior. 
.e., variable appear fixed effect formula since initial value estimated part time-varying formula. formula time_varying = ~ 1 implicitly represents time-varying intercept (assuming time argument supplied) , case, intercept omitted main effects (formula ~ + 0 + ... formula ~ -1 + ...). time_varying_type = 'rw0', first time step estimated mean-zero prior: \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Normal} \\left(0, \\sigma^2_{\\gamma} \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\gamma_{t-1}, \\sigma^2_{\\gamma} \\right). \\end{aligned} \\] case, time-varying variable (including intercept) included main effects. suggest using formulation, leave 'rw' option legacy code works. time_varying_type = 'ar1': \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Normal} \\left(0, \\sigma^2_{\\gamma} \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\rho_\\gamma\\gamma_{t-1}, \\sqrt{1 - \\rho_\\gamma^2} \\sigma^2_{\\gamma} \\right), \\end{aligned} \\] \\(\\rho_{\\gamma}\\) correlation subsequent time steps. first time step given mean-zero prior.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatially-varying-coefficients-svc","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatially varying coefficients (SVC)","title":"sdmTMB model description","text":"Spatially varying coefficient models defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s}, t} \\zeta_{\\boldsymbol{s}} + \\ldots \\right),\\\\ \\boldsymbol{\\zeta} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_\\zeta \\right), \\end{aligned} \\] \\(\\boldsymbol{\\zeta}\\) random field representing spatially varying coefficient. Usually, \\(\\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s}, t}\\) represent prediction matrix constant spatially given time \\(t\\) defined one-sided formula supplied spatial_varying. example spatial_varying = ~ 0 + x, 0 omits intercept. random fields parameterized internally sparse precision matrix (\\(\\boldsymbol{Q}_\\zeta\\)) \\[ \\boldsymbol{\\zeta} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\zeta^2 \\boldsymbol{Q}^{-1}_\\zeta\\right). \\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"iid-random-or-multi-level-intercepts","dir":"Articles","previous_headings":"sdmTMB model structure","what":"IID random or multi-level intercepts","title":"sdmTMB model description","text":"Multilevel/hierarchical intercepts defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\alpha_{g} + \\ldots \\right),\\\\ \\alpha_g &\\sim \\operatorname{Normal} \\left(0, \\sigma_\\alpha^2 \\right),\\\\ \\end{aligned} \\] \\(\\alpha_g\\) example optional “random” intercept—intercept mean zero varies level \\(g\\) constrained \\(\\sigma_\\alpha\\). defined formula argument via (1 | g) syntax lme4 glmmTMB. can multiple random intercepts, despite showing one . E.g., (1 | g1) + (1 | g2), case assumed independent uncorrelated .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"offset-terms","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Offset terms","title":"sdmTMB model description","text":"Offset terms can included offset argument sdmTMB(). 
included linear predictor \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + O_{\\boldsymbol{s},t} + \\ldots \\right), \\end{aligned} \\] \\(O_{\\boldsymbol{s},t}\\) offset term—log transformed variable without coefficient (assuming log link). offset included prediction. Therefore, offset represents measure effort, example, prediction one unit effort (log(1) = 0).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"observation-model-families","dir":"Articles","previous_headings":"","what":"Observation model families","title":"sdmTMB model description","text":"describe main observation families available sdmTMB comment parametrization, statistical properties, utility, code representation sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"binomial","dir":"Articles","previous_headings":"Observation model families","what":"Binomial","title":"sdmTMB model description","text":"\\[ \\operatorname{Binomial} \\left(N, \\mu \\right) \\] \\(N\\) size number trials, \\(\\mu\\) probability success trial. \\(N = 1\\), distribution becomes Bernoulli distribution. Internally, distribution parameterized robust version TMB, numerically stable probabilities approach 0 1. Following structure stats::glm(), lme4, glmmTMB, binomial family can specified one 4 ways: response may factor (model classifies first level versus others) response may binomial (0/1) response can matrix form cbind(success, failure), response may observed proportions, weights argument used specify Binomial size (\\(N\\)) parameter (probabilty ~ ..., weights = N). Code defined within TMB. Example: family = binomial(link = \"logit\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"beta","dir":"Articles","previous_headings":"Observation model families","what":"Beta","title":"sdmTMB model description","text":"\\[ \\operatorname{Beta} \\left(\\mu \\phi, (1 - \\mu) \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) precision parameter. parametrization follows Ferrari & Cribari-Neto (2004) betareg R package (Cribari-Neto & Zeileis 2010). variance \\(\\mu (1 - \\mu) / (\\phi + 1)\\). Code defined within TMB. Example: family = Beta(link = \"logit\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gamma","dir":"Articles","previous_headings":"Observation model families","what":"Gamma","title":"sdmTMB model description","text":"\\[ \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu}{\\phi} \\right) \\] \\(\\phi\\) represents Gamma shape \\(\\mu / \\phi\\) represents scale. mean \\(\\mu\\) variance \\(\\mu \\cdot \\phi^2\\). Code defined within TMB. Example: family = Gamma(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gaussian","dir":"Articles","previous_headings":"Observation model families","what":"Gaussian","title":"sdmTMB model description","text":"\\[ \\operatorname{Normal} \\left( \\mu, \\phi^2 \\right) \\] \\(\\mu\\) mean \\(\\phi\\) standard deviation. variance \\(\\phi^2\\). 
Example: family = Gaussian(link = \"identity\") Code defined within TMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"lognormal","dir":"Articles","previous_headings":"Observation model families","what":"Lognormal","title":"sdmTMB model description","text":"sdmTMB uses bias-corrected lognormal distribution \\(\\phi\\) represents standard deviation log-space: \\[ \\operatorname{Lognormal} \\left( \\log \\mu - \\frac{\\phi^2}{2}, \\phi^2 \\right). \\] bias correction, \\(\\mathbb{E}[y] = \\mu\\) \\(\\mathrm{Var}[\\log y] = \\phi^2\\). Code defined within sdmTMB based TMB dnorm() normal density. Example: family = lognormal(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-1-nb1","dir":"Articles","previous_headings":"Observation model families","what":"Negative Binomial 1 (NB1)","title":"sdmTMB model description","text":"\\[ \\operatorname{NB1} \\left( \\mu, \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) dispersion parameter. variance scales linearly mean \\(\\mathrm{Var}[y] = \\mu + \\mu / \\phi\\) (Hilbe 2011). Internally, distribution parameterized robust version TMB. Code defined within sdmTMB based NB2 borrowed glmmTMB. Example: family = nbinom1(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-2-nb2","dir":"Articles","previous_headings":"Observation model families","what":"Negative Binomial 2 (NB2)","title":"sdmTMB model description","text":"\\[ \\operatorname{NB2} \\left( \\mu, \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) dispersion parameter. variance scales quadratically mean \\(\\mathrm{Var}[y] = \\mu + \\mu^2 / \\phi\\) (Hilbe 2011). NB2 parametrization commonly seen ecology NB1. Internally, distribution parameterized robust version TMB. Code defined within TMB. Example: family = nbinom2(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"poisson","dir":"Articles","previous_headings":"Observation model families","what":"Poisson","title":"sdmTMB model description","text":"\\[ \\operatorname{Poisson} \\left( \\mu \\right) \\] \\(\\mu\\) represents mean \\(\\mathrm{Var}[y] = \\mu\\). Code defined within TMB. Example: family = poisson(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"student-t","dir":"Articles","previous_headings":"Observation model families","what":"Student-t","title":"sdmTMB model description","text":"\\[ \\operatorname{Student-t} \\left( \\mu, \\phi, \\nu \\right) \\] \\(\\nu\\), degrees freedom (df), user-supplied fixed parameter. Lower values \\(\\nu\\) result heavier tails compared Gaussian distribution. approximately df = 20, distribution becomes similar Gaussian. Student-t distribution low degrees freedom (e.g., \\(\\nu \\le 7\\)) can helpful modelling data otherwise suitable Gaussian needs approach robust outliers (e.g., Anderson et al. 2017). Code defined within sdmTMB based dt() distribution TMB. 
Example: family = student(link = \"log\", df = 7)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"tweedie","dir":"Articles","previous_headings":"Observation model families","what":"Tweedie","title":"sdmTMB model description","text":"\\[ \\operatorname{Tweedie} \\left(\\mu, p, \\phi \\right), \\: 1 < p < 2 \\] \\(\\mu\\) mean, \\(p\\) power parameter constrained 1 2, \\(\\phi\\) dispersion parameter. Tweedie distribution can helpful modelling data positive continuous also contain zeros. Internally, \\(p\\) transformed \\(\\mathrm{logit}^{-1} (\\texttt{thetaf}) + 1\\) constrain 1 2 estimated unconstrained variable. source code implemented cplm package (Zhang 2013) based Dunn & Smyth (2005). TMB version defined . Example: family = tweedie(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gamma-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Gamma mixture","title":"sdmTMB model description","text":"2 component mixture extends Gamma distribution, \\[ (1 - p) \\cdot \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu_{1}}{\\phi} \\right) + p \\cdot \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu_{2}}{\\phi} \\right), \\] \\(\\phi\\) represents Gamma shape, \\(\\mu_{1} / \\phi\\) represents scale first (smaller component) distribution, \\(\\mu_{2} / \\phi\\) represents scale second (larger component) distribution, \\(p\\) controls contribution component mixture (also interpreted probability larger events). mean \\((1-p) \\cdot \\mu_{1} + p \\cdot \\mu_{2}\\) variance \\((1-p) ^ 2 \\cdot \\mu_{1} \\cdot \\phi^2 + (p) ^ 2 \\cdot \\mu_{2} \\cdot \\phi^2\\). , mixture distributions, probability larger mean can obtained plogis(fit$model$par[[\"logit_p_mix\"]]) ratio larger mean smaller mean can obtained 1 + exp(fit$model$par[[\"log_ratio_mix\"]]). standard errors available TMB sdreport: fit$sd_report. wish fix probability large (.e., extreme) mean, can hard estimate, can use map list. E.g.: Example: family = gamma_mix(link = \"log\"). See also family = delta_gamma_mix() extension incorporating distribution delta models.","code":"sdmTMB(..., control = sdmTMBcontrol( start = list(logit_p_mix = qlogis(0.05)), # 5% probability of 'mu2' map = list(logit_p_mix = factor(NA)) # don't estimate ) )"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"lognormal-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Lognormal mixture","title":"sdmTMB model description","text":"2 component mixture extends lognormal distribution, \\[ (1 - p) \\cdot \\operatorname{Lognormal} \\left( \\log \\mu_{1} - \\frac{\\phi^2}{2}, \\phi^2 \\right) + p \\cdot \\operatorname{Lognormal} \\left( \\log \\mu_{2} - \\frac{\\phi^2}{2}, \\phi^2 \\right). \\] bias correction, \\(\\mathbb{E}[y] = (1-p) \\cdot \\mu_{1} + p \\cdot \\mu_{2}\\) \\(\\mathrm{Var}[\\log y] = (1-p)^2 \\cdot \\phi^2 + p^2 \\cdot \\phi^2\\). Gamma mixture, \\(p\\) controls contribution component mixture (also interpreted probability larger events). Example: family = lognormal_mix(link = \"log\"). 
See also family = delta_lognormal_mix() extension incorporating distribution delta models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-2-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Negative binomial 2 mixture","title":"sdmTMB model description","text":"2 component mixture extends NB2 distribution, \\[ (1 - p) \\cdot \\operatorname{NB2} \\left( \\mu_1, \\phi \\right) + p \\cdot \\operatorname{NB2} \\left( \\mu_2, \\phi \\right) \\] \\(\\mu_1\\) mean first (smaller component) distribution, \\(\\mu_2\\) mean larger component, \\(p\\) controls contribution component mixture. Example: family = nbinom2_mix(link = \"log\")","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"matérn-parameterization","dir":"Articles","previous_headings":"Gaussian random fields","what":"Matérn parameterization","title":"sdmTMB model description","text":"Matérn defines covariance \\(\\Phi \\left( s_j, s_k \\right)\\) spatial locations \\(s_j\\) \\(s_k\\) \\[ \\Phi\\left( s_j,s_k \\right) = \\tau^2/\\Gamma(\\nu)2^{\\nu - 1} (\\kappa d_{jk})^\\nu K_\\nu \\left( \\kappa d_{jk} \\right), \\] \\(\\tau^2\\) controls spatial variance, \\(\\nu\\) controls smoothness, \\(\\Gamma\\) represents Gamma function, \\(d_{jk}\\) represents distance locations \\(s_j\\) \\(s_k\\), \\(K_\\nu\\) represents modified Bessel function second kind, \\(\\kappa\\) represents decorrelation rate. parameter \\(\\nu\\) set 1 take advantage Stochastic Partial Differential Equation (SPDE) approximation GRF greatly increase computational efficiency (Lindgren et al. 2011). Internally, parameters \\(\\kappa\\) \\(\\tau\\) converted range marginal standard deviation \\(\\sigma\\) \\(\\textrm{range} = \\sqrt{8} / \\kappa\\) \\(\\sigma = 1 / \\sqrt{4 \\pi \\exp \\left(2 \\log(\\tau) + 2 \\log(\\kappa) \\right) }\\). case spatiotemporal model spatial spatiotemporal fields, share_range = TRUE sdmTMB() (default), single \\(\\kappa\\) range estimated separate \\(\\sigma_\\omega\\) \\(\\sigma_\\epsilon\\). often makes sense since data often weakly informative \\(\\kappa\\). share_range = FALSE, separate \\(\\kappa_\\omega\\) \\(\\kappa_\\epsilon\\) estimated. spatially varying coefficient field always shares \\(\\kappa\\) spatial random field.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"projection-boldsymbola-matrix","dir":"Articles","previous_headings":"Gaussian random fields","what":"Projection \\(\\boldsymbol{A}\\) matrix","title":"sdmTMB model description","text":"values spatial variables knots multiplied projection matrix \\(\\boldsymbol{}\\) bilinearly interpolates knot locations values locations observed predicted data (Lindgren & Rue 2015) \\[ \\boldsymbol{\\omega}^* = \\boldsymbol{} \\boldsymbol{\\omega}, \\] \\(\\boldsymbol{\\omega}^*\\) represents values spatial random fields observed locations predicted data locations. matrix \\(\\boldsymbol{}\\) row data point prediction point column knot. Three non-zero elements row define weight neighbouring 3 knot locations location \\(\\boldsymbol{s}\\). bilinear interpolation happens spatiotemporal random fields \\[ \\boldsymbol{\\epsilon}_t^* = \\boldsymbol{} \\boldsymbol{\\epsilon}_t. 
\\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"anisotropy","dir":"Articles","previous_headings":"Gaussian random fields","what":"Anisotropy","title":"sdmTMB model description","text":"TMB allows anisotropy, spatial covariance may asymmetric respect latitude longitude (full details). Anisotropy can turned logical anisotropy argument sdmTMB(). number ways implement anisotropic covariance (Fuglstad et al. 2015), adopt 2-parameter rotation matrix \\(\\textbf{H}\\). elements \\(\\textbf{H}\\) defined parameter vector \\(\\boldsymbol{x}\\) \\(H_{1,1} = x_{1}\\), \\(H_{1,2} = H_{2,1} = x_{2}\\) \\(H_{2,2} = (1 + x_{2}^2) / x_{1}\\). model fitted sdmTMB(), anisotropy relationships may plotted using plot_anisotropy() function, takes fitted object argument. barrier mesh used, anisotropy disabled.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"incorporating-physical-barriers-into-the-spde","dir":"Articles","previous_headings":"Gaussian random fields","what":"Incorporating physical barriers into the SPDE","title":"sdmTMB model description","text":"cases spatial domain interest may complex bounded barrier land water (e.g., coastlines, islands, lakes). SPDE models allow physical barriers incorporated modelling (Bakka et al. 2019). sdmTMB() models, mesh construction occurs two steps: user (1) constructs mesh call sdmTMB::make_mesh(), (2) passes mesh sdmTMB::add_barrier_mesh(). barriers must constructed sf objects (Pebesma 2018) polygons defining barriers. See ?sdmTMB::add_barrier_mesh example. barrier implementation requires user select fraction value (range_fraction argument) defines fraction usual spatial range crossing barrier (Bakka et al. 2019). example, range estimated 10 km, range_fraction = 0.2 assume range 2 km across barrier. let spatial correlation decay 5 times faster distance. experimentation, values around 0.1 0.2 seem work well values much lower 0.1 can result convergence issues. website Francesco Serafini Haakon Bakka provides illustration INLA. implementation within TMB borrowed code written Olav Nikolai Breivik Hans Skaug TMB Case Studies Github site.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"optimization-details","dir":"Articles","previous_headings":"Optimization","what":"Optimization details","title":"sdmTMB model description","text":"sdmTMB model fit maximum marginal likelihood. Internally, TMB (Kristensen et al. 2016) model template calculates marginal log likelihood gradient, negative log likelihood minimized via non-linear optimization routine stats::nlminb() R (Gay 1990; R Core Team 2021). Random effects estimated values maximize log likelihood conditional estimated fixed effects integrated via Laplace approximation (Kristensen et al. 2016). Like AD Model Builder (Fournier et al. 2012), TMB allows parameters fit phases include multiphase argument sdmTMB::sdmTMBcontrol() allow . high-dimensional models (many fixed random effects), phased estimation may faster result stable convergence. sdmTMB, phased estimation proceeds first estimating fixed-effect parameters contributing likelihood (holding random effects constant initial values). second phase, random-effect parameters (variances) also estimated. Fixed-effect parameters also estimated second phase initialized estimates first phase. 
cases, single call stats::nlminb() may result convergence (e.g., maximum gradient marginal likelihood respect fixed-effect parameters small enough yet), algorithm may need run multiple times. sdmTMB::sdmTMBcontrol() function, include argument nlminb_loops restart optimization previous best values. number nlminb_loops generally small (e.g., 2 3 initially), defaults 1. sdmTMB models, Hessian may also unstable need re-evaluated. optionally stats::optimHess() routine call stats::nlminb(). stats::optimHess() function implements Newton optimization routine find Hessian, include argument newton_loops sdmTMB::sdmTMBcontrol() allow multiple function evaluations (starting previous best value). default, included newton_loops set 0. model already fit, function sdmTMB::run_extra_optimization() can run additional optimization loops either routine reduce maximum gradient.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"assessing-convergence","dir":"Articles","previous_headings":"Optimization","what":"Assessing convergence","title":"sdmTMB model description","text":"Much guidance around diagnostics glmmTMB also applies sdmTMB, e.g. glmmTMB vignette troubleshooting. Optimization stats::nlminb() involves specifying number iterations evaluations (eval.max iter.max) tolerances (abs.tol, rel.tol, x.tol, xf.tol)—greater number iterations smaller tolerance thresholds increase chance optimal solution found, evaluations translates longer computation time. Warnings non-positive-definite Hessian matrices (accompanied parameters NAs standard errors) often mean models improperly specified given data. Standard errors can observed output print.sdmTMB() checking fit$sd_report. maximum gradient marginal likelihood respect fixed-effect parameters can checked inspecting (fit$gradients). Guidance varies, maximum gradient likely least \\(< 0.001\\) assuming fitting routine consistent convergence. maximum gradients already relatively small, can often reduced additional optimization calls beginning previous best parameter vector described sdmTMB::run_extra_optimization().","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"parameter-estimates","dir":"Articles > Web_only","previous_headings":"","what":"Parameter estimates","title":"Introduction to modelling with sdmTMB","text":"can view confidence intervals fixed effects using tidy function: similarly random effect variance parameters: Note standard errors reported coefficients log space, confidence intervals reported. parameters defined follows: range: derived parameter defines distance 2 points effectively independent (actually 13% correlated). share_range argument changed FALSE spatial spatiotemporal ranges unique, otherwise default share range. phi: Observation error scale parameter (e.g., SD Gaussian). sigma_O: SD spatial process (“Omega”). sigma_E: SD spatiotemporal process (“Epsilon”). tweedie_p: Tweedie p (power) parameter; 1 2. model used AR1 spatiotemporal fields : rho: Spatiotemporal correlation years; -1 1. model includes spatial_varying predictor : sigma_Z: SD spatially varying coefficient field (“Zeta”).","code":"tidy(m3, conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 1.86 0.208 1.45 2.26 #> 2 poly(log(depth), 2)1 -65.1 6.32 -77.5 -52.8 #> 3 poly(log(depth), 2)2 -96.5 5.98 -108. 
-84.8 tidy(m3, \"ran_pars\", conf.int = TRUE) #> # A tibble: 5 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 19.8 3.03 14.6 26.7 #> 2 phi 11.0 0.377 10.3 11.8 #> 3 sigma_O 1.40 0.162 1.12 1.76 #> 4 sigma_E 1.55 0.129 1.32 1.83 #> 5 tweedie_p 1.50 0.0119 1.48 1.52"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"model-diagnostics","dir":"Articles > Web_only","previous_headings":"","what":"Model diagnostics","title":"Introduction to modelling with sdmTMB","text":"can inspect randomized quantile residuals: fast calculate can look ‘’ even model consistent data. MCMC-based residuals reliable slow. can calculate help sdmTMBextra package. https://github.com/pbs-assess/sdmTMBextra. practice like want mcmc_iter mcmc_warmup. Total samples mcmc_iter - mcmc_warmup. also just use spatial model vignette builds quickly. See ?residuals.sdmTMB().","code":"pcod$resids <- residuals(m3) # randomized quantile residuals #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. qqnorm(pcod$resids) qqline(pcod$resids) ggplot(pcod, aes(X, Y, col = resids)) + scale_colour_gradient2() + geom_point() + facet_wrap(~year) + coord_fixed() set.seed(123) samps <- sdmTMBextra::predict_mle_mcmc(m3, mcmc_warmup = 100, mcmc_iter = 101) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.01007 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 100.7 seconds. #> Chain 1: Adjust your expectations accordingly! #> Chain 1: #> Chain 1: #> Chain 1: WARNING: There aren't enough warmup iterations to fit the #> Chain 1: three stages of adaptation as currently configured. #> Chain 1: Reducing each adaptation stage to 15%/75%/10% of #> Chain 1: the given number of warmup iterations: #> Chain 1: init_buffer = 15 #> Chain 1: adapt_window = 75 #> Chain 1: term_buffer = 10 #> Chain 1: #> Chain 1: Iteration: 1 / 101 [ 0%] (Warmup) #> Chain 1: Iteration: 10 / 101 [ 9%] (Warmup) #> Chain 1: Iteration: 20 / 101 [ 19%] (Warmup) #> Chain 1: Iteration: 30 / 101 [ 29%] (Warmup) #> Chain 1: Iteration: 40 / 101 [ 39%] (Warmup) #> Chain 1: Iteration: 50 / 101 [ 49%] (Warmup) #> Chain 1: Iteration: 60 / 101 [ 59%] (Warmup) #> Chain 1: Iteration: 70 / 101 [ 69%] (Warmup) #> Chain 1: Iteration: 80 / 101 [ 79%] (Warmup) #> Chain 1: Iteration: 90 / 101 [ 89%] (Warmup) #> Chain 1: Iteration: 100 / 101 [ 99%] (Warmup) #> Chain 1: Iteration: 101 / 101 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 26.615 seconds (Warm-up) #> Chain 1: 0.197 seconds (Sampling) #> Chain 1: 26.812 seconds (Total) #> Chain 1: r <- residuals(m3, \"mle-mcmc\", mcmc_samples = samps) qqnorm(r) qqline(r)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"spatial-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Spatial predictions","title":"Introduction to modelling with sdmTMB","text":"Now, purposes example (e.g., visualization), want predict fine-scale grid entire survey domain. grid built package Queen Charlotte Sound named qcs_grid. prediction grid also needs covariates used model . can replicate grid across necessary years: Now make predictions new data: Let’s make small function help make maps. four kinds predictions get model. 
First, show predictions incorporate fixed effects random effects: can also look just fixed effects, quadratic effect depth: can look spatial random effects represent consistent deviations space time accounted fixed effects. words, deviations represent consistent biotic abiotic factors affecting biomass density accounted model. finally can look spatiotemporal random effects represent deviation fixed effect predictions spatial random effect deviations. represent biotic abiotic factors changing time accounted model. can also estimate uncertainty spatiotemporal density predictions using simulations joint precision matrix setting nsim > 0 predict function. generate 100 estimates use apply() calculate upper lower confidence intervals, standard deviation, coefficient variation (CV). Plot CV estimates:","code":"glimpse(qcs_grid) #> Rows: 7,314 #> Columns: 5 #> $ X 456, 458, 460, 462, 464, 466, 468, 470, 472, 474, 476, 4… #> $ Y 5636, 5636, 5636, 5636, 5636, 5636, 5636, 5636, 5636, 56… #> $ depth 347.08345, 223.33479, 203.74085, 183.29868, 182.99983, 1… #> $ depth_scaled 1.56081222, 0.56976988, 0.36336929, 0.12570465, 0.122036… #> $ depth_scaled2 2.436134794, 0.324637712, 0.132037240, 0.015801659, 0.01… grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) predictions <- predict(m3, newdata = grid_yrs) plot_map <- function(dat, column) { ggplot(dat, aes(X, Y, fill = {{ column }})) + geom_raster() + coord_fixed() } plot_map(predictions, exp(est)) + scale_fill_viridis_c( trans = \"sqrt\", # trim extreme high values to make spatial variation more visible na.value = \"yellow\", limits = c(0, quantile(exp(predictions$est), 0.995)) ) + facet_wrap(~year) + ggtitle(\"Prediction (fixed effects + all random effects)\", subtitle = paste(\"maximum estimated biomass density =\", round(max(exp(predictions$est)))) ) plot_map(predictions, exp(est_non_rf)) + scale_fill_viridis_c(trans = \"sqrt\") + ggtitle(\"Prediction (fixed effects only)\") plot_map(predictions, omega_s) + scale_fill_gradient2() + ggtitle(\"Spatial random effects only\") plot_map(predictions, epsilon_st) + scale_fill_gradient2() + facet_wrap(~year) + ggtitle(\"Spatiotemporal random effects only\") sim <- predict(m3, newdata = grid_yrs, nsim = 100) sim_last <- sim[grid_yrs$year == max(grid_yrs$year), ] # just plot last year pred_last <- predictions[predictions$year == max(grid_yrs$year), ] pred_last$lwr <- apply(exp(sim_last), 1, quantile, probs = 0.025) pred_last$upr <- apply(exp(sim_last), 1, quantile, probs = 0.975) pred_last$sd <- round(apply(exp(sim_last), 1, function(x) sd(x)), 2) pred_last$cv <- round(apply(exp(sim_last), 1, function(x) sd(x) / mean(x)), 2) ggplot(pred_last, aes(X, Y, fill = cv)) + geom_raster() + scale_fill_viridis_c()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"conditional-effects","dir":"Articles > Web_only","previous_headings":"","what":"Conditional effects","title":"Introduction to modelling with sdmTMB","text":"can visualize conditional effect covariates feeding simplified data frames predict function fix covariate values want fixed (e.g., means) vary parameters want visualize (across range values): also visreg package. version link space residuals partial randomized quantile residuals. See scale argument visreg response scale plots. ggeffects package marginal effects plot. 
also faster since relies already estimated coefficients variance-covariance matrix.","code":"nd <- data.frame( depth = seq(min(pcod$depth), max(pcod$depth), length.out = 100 ), year = 2015L # a chosen year ) p <- predict(m3, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se) )) + geom_line() + geom_ribbon(alpha = 0.4) + scale_x_continuous() + coord_cartesian(expand = F) + labs(x = \"Depth (m)\", y = \"Biomass density (kg/km2)\") visreg::visreg(m3, \"depth\") ggeffects::ggeffect(m3, \"depth [0:500 by=1]\") %>% plot() #> Warning: Removed 1 row containing missing values or values outside the scale range #> (`geom_line()`)."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"time-varying-effects","dir":"Articles > Web_only","previous_headings":"Conditional effects","what":"Time-varying effects","title":"Introduction to modelling with sdmTMB","text":"also let effect depth vary time. example, helps give year separate mean effect (.factor(year)). ~ 0 part formula omits intercept. models like take longer fit, might want set silent = FALSE can monitor progress. plot , make data frame contains combinations time-varying covariate time. easily created using expand.grid() tidyr::expand_grid().","code":"m4 <- sdmTMB( density ~ 0 + as.factor(year), data = pcod, time_varying = ~ 0 + depth_scaled + depth_scaled2, mesh = mesh, family = tweedie(link = \"log\"), spatial = \"on\", time = \"year\", spatiotemporal = \"IID\" ) #> Detected irregular time spacing with an AR(1) or random walk process. #> Consider filling in the missing time slices with `extra_time`. #> `extra_time = c(2006, 2008, 2010, 2012, 2014, 2016)` #> Warning in sqrt(diag(cov)): NaNs produced #> Warning: The model may not have converged: non-positive-definite Hessian #> matrix. 
m4 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod #> Family: tweedie(link = 'log') #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> #> coef.est coef.se #> as.factor(year)2003 3.36 0.30 #> as.factor(year)2004 4.04 0.28 #> as.factor(year)2005 3.77 0.28 #> as.factor(year)2007 2.64 0.30 #> as.factor(year)2009 2.65 0.29 #> as.factor(year)2011 3.72 0.29 #> as.factor(year)2013 3.51 0.27 #> as.factor(year)2015 3.76 0.28 #> as.factor(year)2017 3.21 0.30 #> #> Time-varying parameters: #> coef.est coef.se #> depth_scaled-2003 -0.96 0.08 #> depth_scaled-2004 -0.96 0.08 #> depth_scaled-2005 -0.96 0.08 #> depth_scaled-2007 -0.96 0.08 #> depth_scaled-2009 -0.96 0.08 #> depth_scaled-2011 -0.96 0.08 #> depth_scaled-2013 -0.96 0.08 #> depth_scaled-2015 -0.96 0.08 #> depth_scaled-2017 -0.96 0.08 #> depth_scaled2-2003 -1.50 0.23 #> depth_scaled2-2004 -1.70 0.17 #> depth_scaled2-2005 -1.68 0.22 #> depth_scaled2-2007 -1.82 0.28 #> depth_scaled2-2009 -0.99 0.17 #> depth_scaled2-2011 -2.02 0.25 #> depth_scaled2-2013 -1.09 0.12 #> depth_scaled2-2015 -1.80 0.22 #> depth_scaled2-2017 -2.10 0.26 #> #> Dispersion parameter: 10.86 #> Tweedie p: 1.50 #> Matérn range: 13.58 #> Spatial SD: 1.63 #> Spatiotemporal IID SD: 1.66 #> ML criterion at convergence: 6251.146 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! 
Check output of sanity().** AIC(m4) #> [1] 12534.29 nd <- expand.grid( depth_scaled = seq(min(pcod$depth_scaled) + 0.2, max(pcod$depth_scaled) - 0.2, length.out = 50 ), year = unique(pcod$year) # all years ) nd$depth_scaled2 <- nd$depth_scaled^2 p <- predict(m4, newdata = nd, se_fit = TRUE, re_form = NA) #> Warning in sqrt(diag(cov)): NaNs produced ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se), group = as.factor(year) )) + geom_line(aes(colour = year), lwd = 1) + geom_ribbon(aes(fill = year), alpha = 0.1) + scale_colour_viridis_c() + scale_fill_viridis_c() + scale_x_continuous(labels = function(x) round(exp(x * pcod$depth_sd[1] + pcod$depth_mean[1]))) + coord_cartesian(expand = F) + labs(x = \"Depth (m)\", y = \"Biomass density (kg/km2)\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"simulating-data","dir":"Articles > Web_only","previous_headings":"","what":"Simulating data","title":"Bayesian estimation with sdmTMB","text":"demonstrate using simulated dataset. Visualize simulated data:","code":"set.seed(123) predictor_dat <- data.frame( X = runif(500), Y = runif(500), a1 = rnorm(500) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) # plot(mesh) # mesh$mesh$n sim_dat <- sdmTMB_simulate( formula = ~a1, data = predictor_dat, mesh = mesh, family = gaussian(), range = 0.3, phi = 0.2, sigma_O = 0.2, seed = 123, B = c(0.8, -0.4) # B0 = intercept, B1 = a1 slope ) ggplot(sim_dat, aes(X, Y, colour = observed)) + geom_point() + scale_color_viridis_c()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"fitting-the-model-with-marginal-likelihood","dir":"Articles > Web_only","previous_headings":"","what":"Fitting the model with marginal likelihood","title":"Bayesian estimation with sdmTMB","text":"First, fit spatial random field GLMM maximum likelihood:","code":"fit <- sdmTMB( observed ~ a1, data = sim_dat, mesh = mesh, family = gaussian(), spatial = \"on\" ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.8 0.06 #> a1 -0.4 0.01 #> #> Dispersion parameter: 0.20 #> Matérn range: 0.32 #> Spatial SD: 0.17 #> ML criterion at convergence: -65.939 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"adding-priors","dir":"Articles > Web_only","previous_headings":"","what":"Adding priors","title":"Bayesian estimation with sdmTMB","text":"first model fit use priors. case, priors implied uniform internal parameter space. However, sdmTMB provides option applying priors. show example applying Normal(0, 5) (mean, SD) prior intercept Normal(0, 1) prior slope parameter. guess model matrix structure based formula, can verify looking internal model matrix previous fit (using do_fit = FALSE save time didn’t want fit first time). 
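As an aside, the parenthetical above refers to building the model object without fitting it. A minimal sketch, assuming the same sim_dat and mesh objects as above (the object name fit_skeleton is purely illustrative):
fit_skeleton <- sdmTMB(
  observed ~ a1,
  data = sim_dat,
  mesh = mesh,
  family = gaussian(),
  spatial = "on",
  do_fit = FALSE # construct the model objects but skip optimization
)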
column corresponds order b priors:","code":"head(fit$tmb_data$X_ij[[1]]) #> (Intercept) a1 #> 1 1 -0.60189285 #> 2 1 -0.99369859 #> 3 1 1.02678506 #> 4 1 0.75106130 #> 5 1 -1.50916654 #> 6 1 -0.09514745 fit <- sdmTMB( observed ~ a1, data = sim_dat, mesh = mesh, family = gaussian(), spatial = \"on\", priors = sdmTMBpriors( # location = vector of means; scale = vector of standard deviations: b = normal(location = c(0, 0), scale = c(5, 2)), ) ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.8 0.06 #> a1 -0.4 0.01 #> #> Dispersion parameter: 0.20 #> Matérn range: 0.32 #> Spatial SD: 0.17 #> ML criterion at convergence: -62.846 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"fixing-a-spatial-correlation-parameter-to-improve-convergence","dir":"Articles > Web_only","previous_headings":"","what":"Fixing a spatial correlation parameter to improve convergence","title":"Bayesian estimation with sdmTMB","text":"Sometimes spatial correlation parameters can challenging estimate Stan. One option apply penalized complexity (PC) priors sdmTMBpriors() Matérn parameters. Another option, can also used conjunction priors, fix one parameters maximum likelihood estimate (MLE) values. Frequently, fixing parameter ln_kappa can help convergence (e.g., Monnahan et al. 2021). estimated parameter transformed range estimate, controls rate spatial correlation decay. Now rebuild fitted object fixed (‘mapped’) ln_kappa parameters using update() function. ’ll use do_fit = FALSE avoid actually fitting updated model since ’s necessary.","code":"# grab the internal parameter list at estimated values: pars <- sdmTMB::get_pars(fit) # create a 'map' vector for TMB # factor NA values cause TMB to fix or map the parameter at the starting value: kappa_map <- factor(rep(NA, length(pars$ln_kappa))) # rebuild model updating some elements: fit_mle <- update( fit, control = sdmTMBcontrol( start = list( ln_kappa = pars$ln_kappa #< ), map = list( ln_kappa = kappa_map #< ) ), do_fit = FALSE #< ) #> ℹ Initiating `ln_kappa` at specified starting value(s) of: #> 2.173, 2.173 #> ℹ Fixing or mirroring `ln_kappa`"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"passing-the-model-to-tmbstan","dir":"Articles > Web_only","previous_headings":"","what":"Passing the model to tmbstan","title":"Bayesian estimation with sdmTMB","text":"Now can pass $tmb_obj element model tmbstan::tmbstan(). using 1000 iterations 2 chains vignette builds quickly. practice, likely want use (e.g., 2000 iterations, 4 chains). Sometimes may need adjust sampler settings : See Details section ?rstan::stan. can also ‘thin’ samples via thin argument working model predictions becomes cumbersome given large number required samples. can look model: Rhat values look reasonable (< 1.05). n_eff (number effective samples) values mostly look reasonable (> 100) inference mean parameters except intercept (b_j[1]). Furthermore, can see correlation MCMC samples b_j[1]. try running iterations chains /placing priors parameters described (highly recommended). 
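For example, a penalized complexity (PC) prior on the Matérn parameters (mentioned in the previous section) could be added before sampling. A minimal sketch, where the range_gt and sigma_lt thresholds are illustrative assumptions rather than values from this vignette, and fit_pc is a hypothetical object name:
fit_pc <- update(
  fit,
  priors = sdmTMBpriors(
    b = normal(location = c(0, 0), scale = c(5, 2)),
    matern_s = pc_matern(range_gt = 0.2, sigma_lt = 1) # PC prior on spatial range and SD
  )
)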
Now can use various functions visualize posterior:","code":"fit_stan <- tmbstan::tmbstan( fit_mle$tmb_obj, iter = 1000, chains = 2, seed = 8217 # ensures repeatability ) tmbstan::tmbstan( ..., control = list(adapt_delta = 0.9, max_treedepth = 12) ) fit_stan #> Inference for Stan model: sdmTMB. #> 2 chains, each with iter=1000; warmup=500; thin=1; #> post-warmup draws per chain=500, total post-warmup draws=1000. #> #> mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat #> b_j[1] 0.80 0.01 0.06 0.68 0.76 0.80 0.84 0.92 82 1.02 #> b_j[2] -0.40 0.00 0.01 -0.41 -0.40 -0.40 -0.39 -0.38 1747 1.00 #> ln_tau_O -1.67 0.01 0.14 -1.93 -1.76 -1.67 -1.57 -1.42 169 1.01 #> ln_phi -1.63 0.00 0.03 -1.69 -1.65 -1.63 -1.61 -1.56 1360 1.00 #> omega_s[1] -0.09 0.01 0.09 -0.26 -0.14 -0.09 -0.03 0.09 162 1.01 #> omega_s[2] -0.06 0.01 0.09 -0.25 -0.12 -0.06 0.01 0.11 196 1.01 #> omega_s[3] 0.01 0.01 0.10 -0.17 -0.05 0.01 0.08 0.20 196 1.01 #> omega_s[4] -0.21 0.01 0.09 -0.40 -0.27 -0.21 -0.15 -0.02 206 1.01 #> omega_s[5] -0.34 0.01 0.10 -0.53 -0.41 -0.33 -0.27 -0.13 247 1.00 #> omega_s[6] -0.08 0.01 0.10 -0.27 -0.15 -0.08 0.00 0.13 268 1.01 #> omega_s[7] -0.02 0.01 0.09 -0.19 -0.08 -0.02 0.04 0.15 172 1.01 #> omega_s[8] -0.23 0.01 0.10 -0.42 -0.29 -0.23 -0.16 -0.05 219 1.01 #> omega_s[9] -0.32 0.01 0.09 -0.50 -0.38 -0.32 -0.26 -0.13 166 1.02 #> omega_s[10] 0.29 0.01 0.09 0.11 0.23 0.28 0.35 0.46 147 1.01 #> omega_s[11] -0.16 0.01 0.09 -0.34 -0.22 -0.16 -0.09 0.02 208 1.01 #> omega_s[12] 0.00 0.01 0.10 -0.19 -0.07 0.01 0.07 0.21 205 1.01 #> omega_s[13] 0.19 0.01 0.09 0.02 0.13 0.19 0.25 0.36 166 1.01 #> omega_s[14] -0.09 0.01 0.10 -0.29 -0.16 -0.09 -0.03 0.11 253 1.01 #> omega_s[15] 0.22 0.01 0.09 0.06 0.16 0.22 0.28 0.40 152 1.01 #> omega_s[16] -0.02 0.01 0.10 -0.21 -0.08 -0.02 0.05 0.16 198 1.01 #> omega_s[17] -0.15 0.01 0.09 -0.34 -0.21 -0.14 -0.08 0.04 198 1.01 #> omega_s[18] -0.28 0.01 0.11 -0.50 -0.35 -0.28 -0.21 -0.09 270 1.00 #> omega_s[19] 0.00 0.01 0.10 -0.18 -0.06 0.00 0.07 0.19 211 1.01 #> omega_s[20] 0.03 0.01 0.08 -0.14 -0.03 0.02 0.09 0.19 180 1.01 #> omega_s[21] 0.08 0.01 0.10 -0.11 0.02 0.08 0.15 0.26 191 1.01 #> omega_s[22] -0.01 0.01 0.10 -0.21 -0.08 -0.01 0.05 0.17 212 1.00 #> omega_s[23] 0.12 0.01 0.09 -0.04 0.06 0.12 0.18 0.30 156 1.01 #> omega_s[24] 0.20 0.01 0.10 0.00 0.13 0.19 0.26 0.40 256 1.00 #> omega_s[25] 0.08 0.01 0.09 -0.09 0.01 0.07 0.14 0.26 167 1.01 #> omega_s[26] -0.01 0.01 0.10 -0.21 -0.08 -0.01 0.06 0.20 215 1.01 #> omega_s[27] -0.11 0.01 0.09 -0.30 -0.18 -0.10 -0.05 0.06 216 1.00 #> omega_s[28] 0.12 0.01 0.10 -0.08 0.05 0.11 0.18 0.32 229 1.00 #> omega_s[29] 0.30 0.01 0.09 0.12 0.23 0.30 0.36 0.47 212 1.01 #> omega_s[30] -0.04 0.01 0.09 -0.22 -0.10 -0.04 0.02 0.14 193 1.01 #> omega_s[31] 0.09 0.01 0.09 -0.08 0.04 0.09 0.15 0.26 154 1.01 #> omega_s[32] 0.05 0.01 0.12 -0.18 -0.02 0.05 0.14 0.28 244 1.01 #> omega_s[33] 0.08 0.01 0.09 -0.10 0.01 0.07 0.14 0.26 219 1.01 #> omega_s[34] 0.04 0.01 0.09 -0.13 -0.03 0.04 0.10 0.20 163 1.01 #> omega_s[35] 0.07 0.01 0.10 -0.10 0.00 0.07 0.14 0.26 230 1.01 #> omega_s[36] 0.14 0.01 0.09 -0.04 0.08 0.15 0.20 0.33 197 1.01 #> omega_s[37] 0.16 0.01 0.11 -0.06 0.09 0.16 0.24 0.39 275 1.00 #> omega_s[38] 0.12 0.01 0.10 -0.09 0.05 0.11 0.19 0.31 217 1.01 #> omega_s[39] -0.22 0.01 0.10 -0.40 -0.28 -0.22 -0.15 -0.04 186 1.01 #> omega_s[40] -0.03 0.01 0.09 -0.21 -0.09 -0.03 0.03 0.17 197 1.01 #> omega_s[41] 0.19 0.01 0.08 0.03 0.13 0.19 0.24 0.35 143 1.01 #> omega_s[42] 0.21 0.01 0.09 0.01 0.14 0.21 0.27 0.38 173 1.01 #> omega_s[43] 0.15 0.01 0.10 
-0.04 0.07 0.15 0.21 0.35 221 1.01 #> omega_s[44] 0.14 0.01 0.10 -0.07 0.08 0.14 0.20 0.32 194 1.01 #> omega_s[45] 0.10 0.01 0.10 -0.10 0.02 0.10 0.16 0.30 248 1.00 #> omega_s[46] 0.06 0.01 0.10 -0.14 -0.01 0.06 0.12 0.26 187 1.01 #> omega_s[47] 0.31 0.01 0.09 0.13 0.24 0.31 0.37 0.50 166 1.01 #> omega_s[48] -0.24 0.01 0.10 -0.44 -0.31 -0.24 -0.18 -0.06 196 1.01 #> omega_s[49] 0.10 0.01 0.10 -0.10 0.04 0.10 0.17 0.32 255 1.01 #> omega_s[50] -0.09 0.01 0.09 -0.26 -0.14 -0.09 -0.03 0.08 178 1.01 #> omega_s[51] 0.25 0.01 0.11 0.04 0.17 0.25 0.32 0.48 247 1.01 #> omega_s[52] -0.21 0.01 0.11 -0.43 -0.29 -0.21 -0.14 -0.01 251 1.01 #> omega_s[53] 0.04 0.01 0.10 -0.16 -0.03 0.04 0.10 0.22 213 1.00 #> omega_s[54] 0.03 0.01 0.10 -0.16 -0.04 0.03 0.10 0.22 214 1.00 #> omega_s[55] -0.09 0.01 0.11 -0.30 -0.16 -0.09 -0.01 0.15 311 1.01 #> omega_s[56] -0.42 0.01 0.10 -0.62 -0.48 -0.41 -0.35 -0.24 229 1.01 #> omega_s[57] 0.01 0.01 0.11 -0.20 -0.07 0.00 0.08 0.24 274 1.00 #> omega_s[58] -0.21 0.01 0.10 -0.40 -0.28 -0.21 -0.14 -0.01 201 1.01 #> omega_s[59] 0.04 0.01 0.24 -0.43 -0.11 0.05 0.20 0.47 992 1.00 #> omega_s[60] -0.23 0.01 0.28 -0.78 -0.41 -0.23 -0.03 0.27 1087 1.00 #> omega_s[61] -0.26 0.01 0.24 -0.69 -0.41 -0.27 -0.10 0.20 662 1.00 #> omega_s[62] -0.26 0.01 0.23 -0.72 -0.42 -0.26 -0.11 0.19 693 1.00 #> omega_s[63] -0.27 0.01 0.23 -0.72 -0.43 -0.28 -0.12 0.18 807 1.00 #> omega_s[64] 0.08 0.01 0.22 -0.37 -0.07 0.09 0.23 0.51 866 1.00 #> omega_s[65] 0.17 0.01 0.22 -0.29 0.02 0.17 0.32 0.57 693 1.00 #> omega_s[66] 0.16 0.01 0.22 -0.27 0.02 0.16 0.31 0.60 525 1.00 #> omega_s[67] -0.03 0.01 0.22 -0.49 -0.17 -0.01 0.13 0.42 1154 1.00 #> omega_s[68] -0.02 0.01 0.25 -0.50 -0.17 -0.02 0.15 0.47 1152 1.00 #> omega_s[69] 0.00 0.01 0.22 -0.46 -0.15 0.00 0.15 0.43 458 1.01 #> omega_s[70] 0.01 0.01 0.21 -0.42 -0.13 0.01 0.15 0.40 431 1.00 #> omega_s[71] 0.17 0.01 0.24 -0.30 0.01 0.17 0.33 0.67 884 1.00 #> omega_s[72] -0.12 0.01 0.24 -0.62 -0.29 -0.11 0.04 0.35 1064 1.00 #> omega_s[73] -0.14 0.01 0.22 -0.58 -0.28 -0.15 0.01 0.30 600 1.00 #> omega_s[74] -0.14 0.01 0.22 -0.56 -0.29 -0.14 0.01 0.26 624 1.00 #> omega_s[75] -0.38 0.01 0.20 -0.76 -0.51 -0.38 -0.25 0.00 654 1.00 #> omega_s[76] 0.10 0.01 0.21 -0.31 -0.04 0.10 0.23 0.51 848 1.00 #> omega_s[77] 0.09 0.01 0.22 -0.34 -0.04 0.09 0.23 0.52 1053 1.00 #> omega_s[78] -0.06 0.01 0.21 -0.47 -0.19 -0.06 0.08 0.35 1693 1.00 #> omega_s[79] 0.06 0.01 0.16 -0.27 -0.05 0.06 0.17 0.37 622 1.00 #> omega_s[80] -0.18 0.01 0.23 -0.65 -0.33 -0.18 -0.02 0.27 632 1.00 #> omega_s[81] -0.07 0.01 0.20 -0.45 -0.20 -0.07 0.05 0.33 1165 1.00 #> omega_s[82] -0.14 0.01 0.21 -0.55 -0.28 -0.14 0.00 0.29 810 1.00 #> omega_s[83] -0.02 0.01 0.23 -0.47 -0.18 -0.01 0.14 0.44 1096 1.00 #> omega_s[84] 0.12 0.01 0.20 -0.26 -0.02 0.12 0.25 0.53 952 1.00 #> omega_s[85] -0.29 0.01 0.21 -0.68 -0.44 -0.30 -0.15 0.11 1111 1.00 #> lp__ 136.06 0.93 9.26 118.11 129.87 136.38 142.46 152.88 100 1.02 #> #> Samples were drawn using NUTS(diag_e) at Sun May 26 03:41:09 2024. #> For each parameter, n_eff is a crude measure of effective sample size, #> and Rhat is the potential scale reduction factor on split chains (at #> convergence, Rhat=1). plot(fit_stan) #> 'pars' not specified. Showing first 10 parameters by default. 
#> ci_level: 0.8 (80% intervals) #> outer_level: 0.95 (95% intervals) pars_plot <- c(\"b_j[1]\", \"b_j[2]\", \"ln_tau_O\", \"omega_s[1]\") bayesplot::mcmc_trace(fit_stan, pars = pars_plot) bayesplot::mcmc_pairs(fit_stan, pars = pars_plot)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"posterior-predictive-checks","dir":"Articles > Web_only","previous_headings":"","what":"Posterior predictive checks","title":"Bayesian estimation with sdmTMB","text":"can perform posterior predictive checks assess whether model can generate predictive data consistent observations. , can make use simulate.sdmTMB() passing Stan model. simulate.sdmTMB() take draws joint parameter posterior add observation error. need ensure nsim less equal total number post-warmup samples. See ?bayesplot::pp_check. solid line represents density observed data light blue lines represent density 50 posterior predictive simulations. case, simulated data seem consistent observed data.","code":"set.seed(19292) samps <- sdmTMBextra::extract_mcmc(fit_stan) s <- simulate(fit_mle, mcmc_samples = samps, nsim = 50) bayesplot::pp_check( sim_dat$observed, yrep = t(s), fun = bayesplot::ppc_dens_overlay )"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"plotting-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Plotting predictions","title":"Bayesian estimation with sdmTMB","text":"can make predictions Bayesian model supplying posterior samples mcmc_samples argument predict.sdmTMB(). output matrix row corresponds row predicted data column corresponds sample. can summarize draws various ways visualize : predict grid given value a1:","code":"pred <- predict(fit_mle, mcmc_samples = samps) dim(pred) #> [1] 500 1000 sim_dat$post_mean <- apply(pred, 1, mean) sim_dat$post_sd <- apply(pred, 1, sd) ggplot(sim_dat, aes(X, Y, colour = post_mean)) + geom_point() + scale_color_viridis_c() ggplot(sim_dat, aes(X, Y, colour = post_sd)) + geom_point() + scale_color_viridis_c() nd <- expand.grid( X = seq(0, 1, length.out = 70), Y = seq(0, 1, length.out = 70), a1 = 0 ) pred <- predict(fit_mle, newdata = nd, mcmc_samples = samps) nd$post_mean <- apply(pred, 1, mean) nd$post_sd <- apply(pred, 1, sd) ggplot(nd, aes(X, Y, fill = post_mean)) + geom_raster() + scale_fill_viridis_c() + coord_fixed() ggplot(nd, aes(X, Y, fill = post_sd)) + geom_raster() + scale_fill_viridis_c() + coord_fixed()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"extracting-parameter-posterior-samples","dir":"Articles > Web_only","previous_headings":"","what":"Extracting parameter posterior samples","title":"Bayesian estimation with sdmTMB","text":"can extract posterior samples rstan::extract(), result list element corresponds parameter set parameters: example calculating derived parameter, calculate marginal spatial random field standard deviation:","code":"post <- rstan::extract(fit_stan) names(post) #> [1] \"b_j\" \"ln_tau_O\" \"ln_phi\" \"omega_s\" \"lp__\" hist(post$b_j[, 1]) ln_kappa <- get_pars(fit_mle)$ln_kappa[1] # 2 elements since 2nd would be for spatiotemporal ln_tau_O <- post$ln_tau_O sigma_O <- 1 / sqrt(4 * pi * exp(2 * ln_tau_O + 2 * ln_kappa)) hist(sigma_O)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"extracting-the-posterior-of-other-predicted-elements","dir":"Articles > Web_only","previous_headings":"","what":"Extracting the posterior of other predicted elements","title":"Bayesian estimation with 
sdmTMB","text":"default predict.sdmTMB() returns overall prediction link space tmbstan model passed . instead want element might find usual data frame returned predict.sdmTMB() applied regular sdmTMB model, can specify sims_var argument. example, let’s extract spatial random field values \"omega_s\". options documented ?predict.sdmTMB().","code":"fit_pred <- predict( fit_mle, newdata = nd, mcmc_samples = samps, sims_var = \"omega_s\" #< ) nd$spatial_rf_mean <- apply(fit_pred, 1, mean) nd$spatial_rf_sd <- apply(fit_pred, 1, sd) ggplot(nd, aes(X, Y, fill = spatial_rf_mean)) + geom_raster() + scale_fill_gradient2() + coord_fixed() ggplot(nd, aes(X, Y, fill = spatial_rf_sd)) + geom_raster() + scale_fill_viridis_c() + coord_fixed()"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"overview","dir":"Articles > Web_only","previous_headings":"","what":"Overview","title":"Cross-validation for model evaluation and comparison","text":"Cross-validation one best approaches can used quantify model performance compare sdmTMB models different structures (unlike AIC, approach also factor uncertainty random effects). Arguably challenging decision implementing cross-validation specify folds (fold representing subset data turn held used test set). Folds may vary number data partitioned, likely slightly different application. goals sdmTMB applications may focused spatial prediction; include making prediction new spatial regions (e.g. unsampled areas areas sampled every year). types models recommend exploring folds using blockCV spatialsample packages (Valavi et al. 2019; Silge 2021). general, spatial sampling approaches assign observations spatially autocorrelated fold. Accounting spatial correlation can lead better estimates covariate effects, well prediction errors. Alternatively, goals analysis sdmTMB may evaluate predictive accuracy model time (e.g. missing survey year, prediction future years). retrospective analyses, points within year may assigned fold (groups years fold). contrast, models forward looking use Leave Future Cross-Validation (LFOCV). LFOCV, data year \\(t\\) used predict observations \\(t+1\\), etc.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"cross-validation-in-sdmtmb","dir":"Articles > Web_only","previous_headings":"","what":"Cross validation in sdmTMB","title":"Cross-validation for model evaluation and comparison","text":"Cross validation sdmTMB implemented using sdmTMB_cv() function, k_folds argument specifying number folds (defaults 8). function uses parallelization default future::plan() set, can turned parallel argument. example, folds assigned randomly—can modified specific spatial temporal applications. Without getting complexities blockCV spatialsample packages, simply use kmeans generate spatial clusters, e.g. similarly, clusters assigned time—, year unique fold. Note year included factor spatiotemporal fields turned estimated missing years.","code":"data(pcod) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) pcod$fyear <- as.factor(pcod$year) # Set parallel processing if desired: library(future) plan(multisession, workers = 2) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 4 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. 
clust <- kmeans(pcod[, c(\"X\", \"Y\")], 20)$cluster m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. clust <- as.numeric(as.factor(pcod$year)) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled), data = pcod, mesh = mesh, fold_ids = clust, spatiotemporal = \"off\", family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"measuring-model-performance","dir":"Articles > Web_only","previous_headings":"","what":"Measuring model performance","title":"Cross-validation for model evaluation and comparison","text":"Lots measures predictive accuracy can used evaluate model performance. default, sdmTMB_cv() returns list contains sum log likelihoods left-fold total summed across left-folds. roughly equivalent expected log predictive density (ELPD) Bayesian literature can interpreted predictive ability model new observations. can accessed , inspecting quantities across folds may help elucidate whether particular folds difficult predict.","code":"m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 4 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. m_cv$fold_loglik # fold log-likelihood #> [1] -1720.122 -1756.138 -1502.946 -1661.257 m_cv$sum_loglik # total log-likelihood #> [1] -6640.463"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"single-splits","dir":"Articles > Web_only","previous_headings":"","what":"Single splits","title":"Cross-validation for model evaluation and comparison","text":"cases single test set evaluated (e.g., 10% data), using sdmTMB_cv() function may overkill two sdmTMB() models fit, using function may worthwhile reduce coding errors (log-likelihood calculations). example, assign two folds, randomly holding 10% observations test set (test set given ID = 1, training set given ID = 2). can ignore total log-likelihood, just focus first element list list:","code":"clust <- sample(1:2, size = nrow(pcod), replace = TRUE, prob = c(0.1, 0.9)) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\"), k_folds = length(unique(clust)) ) m_cv$fold_loglik[[1]] #> [1] -608.5838"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"comparing-two-or-more-models","dir":"Articles > Web_only","previous_headings":"","what":"Comparing two or more models","title":"Cross-validation for model evaluation and comparison","text":"can use output sdmTMB_cv() compare two models. example, wanted evaluate support depth effect , 10-fold cross validation (’s important folds across two models). example, using either predictive log-likelihood ELPD lead one conclude including depth improves predictive accuracy model.","code":"clust <- sample(seq_len(10), size = nrow(pcod), replace = TRUE) m1 <- sdmTMB_cv( density ~ 0 + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. 
m2 <- sdmTMB_cv( density ~ 0 + fyear + s(depth_scaled), data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Compare log-likelihoods -- higher is better! m1$sum_loglik #> [1] -6748.929 m2$sum_loglik #> [1] -6596.481"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"model-ensembling","dir":"Articles > Web_only","previous_headings":"","what":"Model ensembling","title":"Cross-validation for model evaluation and comparison","text":"Finally, instead identifying single “best” models, may interested model averaging. sdmTMB package, ’ve implemented model stacking procedure described (Yao et al. 2018) sdmTMB_stacking() function. procedure uses optimization find normalized weights maximize total log-likelihood across models (metrics may also used). Inputs function list models (fictitious model_list), list element output call sdmTMB_cv(): default calculation uses data fold. instead, split data 10/90 split (example ), wouldn’t want use 2nd model fit generate weights. just wanted use predictions first fold onto 10% test set, specify using include_folds argument.","code":"weights <- sdmTMB_stacking(model_list) weights <- sdmTMB_stacking(model_list, include_folds = 1)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/delta-models.html","id":"example-with-built-in-delta-model","dir":"Articles > Web_only","previous_headings":"","what":"Example with built-in delta model","title":"Fitting delta (hurdle) models with sdmTMB","text":", show example fitting using built-delta functionality, well build model component separately combine. built-approach convenient, allows parameters shared across components, allows calculation derived quantities standardized indexes (get_index()) internally calculated standard errors. use dataset built sdmTMB package: trawl survey data Pacific Cod Queen Charlotte Sound, British Columbia, Canada. density units kg/km2. , X Y coordinates UTM zone 9. first create mesh use models. can fit model Pacific cod density using delta-gamma model, including smoothed effect depth. default built-delta models formula, spatial spatiotemporal structure, anisotropy shared two model components. However, elements (formula, spatial, spatiotemporal, share_range) can also specified independently model using list format within function argument (see examples ). first element list binomial component second element positive component (e.g., Gamma). elements must shared now (e.g., smoothers, spatially varying coefficients, time-varying coefficients). specify settings spatial spatiotemporal effects model component, create list settings within spatial spatiotemporal arguments. example, spatial = list(\"\", \"\"), spatiotemporal = list(\"\", \"rw\"). similarly specify different formula component model, using list(y ~ x1, y ~ x2) instance, include effect depth positive model, remove presence-absence model. However, currently limitations specifying separate formulas model component. two formulas : smoothers threshold effects random intercepts now, must specified single formula shared across two models. model component can similarly separate settings share_range, determines whether shared spatial spatiotemporal range parameter (TRUE) independent range parameters (FALSE), using list. Lastly, whether anisotropy included model determined logical argument anisotropy (.e., TRUE FALSE), separately specified model. 
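Returning to the model ensembling example above: the model_list object is fictitious, so as a minimal sketch (assuming m1 and m2 are the sdmTMB_cv() outputs fit with the same folds in the previous section):
model_list <- list(m1, m2)
stack_weights <- sdmTMB_stacking(model_list)
stack_weights # normalized weights; a larger weight indicates more predictive support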
anisotropy included, default shared across two model components. However can made unique model component using sdmTMBcontrol(map = ...) adding argument control fitting model. ‘maps’ anisotropy parameters unique across model components. fit delta model, can evaluate plot output, diagnostics, predictions similar sdmTMB models. printed model output show estimates standard errors parameters model separately. Using tidy() function turn sdmTMB model output data frame, argument model=1 model=2 specify model component extract dataframe. See tidy.sdmTMB() additional arguments options. built-delta models, default function return estimated response parameters grid cell model separately, notated 1 (presence/absence model) 2 (positive catch model) column name. See predict.sdmTMB() description values data frame. can use predictions built-delta model (making sure return_tmb_object=TRUE) get index values using get_index() function. can used predictions include first second models (.e., using default specifying model argument) predictions generated using model=NA. get_index() function automatically combine predictions first second model calculating index values. modelling purposes creating index see vignette Index standardization sdmTMB. can plot conditional effects covariates (depth example model) using package visreg specifying model component model=1 presence-absence model model=2 positive catch model. Currently, plotting effects built-delta models ggeffects supported. See vignette using visreg sdmTMB information. built-delta models can also evaluated residuals() functions sdmTMB. Similarly generating predictions, can specify model components want return residuals using model argument specifying =1 =2. See residuals.sdmTMB() additional options evaluating residuals sdmTMB models. can also simulate new observations fitted delta model. functions, can specify model simulate using argument model=1 presence/absence, model=2 positive catches, model=NA combined predictions. See simulate.sdmTMB() details simulation options.","code":"pcod_mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 15) fit_dg <- sdmTMB(density ~ 1 + s(depth), data = pcod, mesh = pcod_mesh, time = \"year\", family = delta_gamma() ) print(fit_dg) #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 1 + s(depth) #> Mesh: pcod_mesh (isotropic covariance) #> Time column: year #> Data: pcod #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.34 0.61 #> sdepth 1.28 2.91 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 14.38 #> #> Matérn range: 61.41 #> Spatial SD: 1.71 #> Spatiotemporal IID SD: 0.81 #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 3.67 0.12 #> sdepth 0.31 1.29 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 5.41 #> #> Dispersion parameter: 1.03 #> Matérn range: 14.80 #> Spatial SD: 0.69 #> Spatiotemporal IID SD: 1.45 #> #> ML criterion at convergence: 6126.400 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
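# The per-component list syntax described in the text above is not shown in the
# vignette code; a hedged sketch with illustrative (not vignette-supplied) settings,
# where component 1 is the binomial presence-absence part and component 2 is the
# positive catch part:
fit_dg_lists <- sdmTMB(
  list(density ~ 1, density ~ 1 + depth_scaled + depth_scaled2), # separate formulas
  data = pcod,
  mesh = pcod_mesh,
  time = "year",
  spatial = list("on", "off"),
  spatiotemporal = list("iid", "off"),
  share_range = list(TRUE, FALSE),
  family = delta_gamma()
)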
tidy(fit_dg) # model = 1 is default #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.343 0.614 tidy(fit_dg, model = 1) #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.343 0.614 tidy(fit_dg, model = 1, \"ran_pars\", conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 61.4 14.1 39.2 96.3 #> 2 sigma_O 1.71 0.265 1.26 2.32 #> 3 sigma_E 0.806 0.142 0.570 1.14 tidy(fit_dg, model = 2) #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) 3.67 0.120 tidy(fit_dg, model = 2, \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 14.8 5.02 7.62 28.8 #> 2 phi 1.03 0.0502 0.939 1.14 #> 3 sigma_O 0.691 0.228 0.361 1.32 #> 4 sigma_E 1.45 0.336 0.919 2.28 grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) p <- predict(fit_dg, newdata = grid_yrs) str(p) #> 'data.frame': 65826 obs. of 16 variables: #> $ X : num 456 458 460 462 464 466 468 470 472 474 ... #> $ Y : num 5636 5636 5636 5636 5636 ... #> $ depth : num 347 223 204 183 183 ... #> $ depth_scaled : num 1.561 0.57 0.363 0.126 0.122 ... #> $ depth_scaled2: num 2.4361 0.3246 0.132 0.0158 0.0149 ... #> $ year : int 2003 2003 2003 2003 2003 2003 2003 2003 2003 2003 ... #> $ est1 : num -5.804 -0.796 0.021 0.87 0.98 ... #> $ est2 : num 3.2 3.71 4.02 4.39 4.43 ... #> $ est_non_rf1 : num -5.508 -0.598 0.12 0.871 0.882 ... #> $ est_non_rf2 : num 2.83 3.31 3.59 3.94 3.94 ... #> $ est_rf1 : num -0.296138 -0.197484 -0.09883 -0.000177 0.098477 ... #> $ est_rf2 : num 0.368 0.398 0.428 0.459 0.489 ... #> $ omega_s1 : num -0.00543 0.09317 0.19178 0.29039 0.38899 ... #> $ omega_s2 : num 0.108 0.119 0.129 0.139 0.15 ... #> $ epsilon_st1 : num -0.291 -0.291 -0.291 -0.291 -0.291 ... #> $ epsilon_st2 : num 0.26 0.28 0.299 0.319 0.339 ... p2 <- predict(fit_dg, newdata = grid_yrs, return_tmb_object = TRUE) ind_dg <- get_index(p2, bias_correct = FALSE) visreg_delta(fit_dg, xvar = \"depth\", model = 1, gg = TRUE) #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. visreg_delta(fit_dg, xvar = \"depth\", model = 2, gg = TRUE) simulations <- simulate(fit_dg, nsim = 5, seed = 5090, model = NA)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/delta-models.html","id":"delta-models-by-fitting-two-components-separately-and-combining-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Delta models by fitting two components separately and combining predictions","title":"Fitting delta (hurdle) models with sdmTMB","text":"Next, show example implement delta-gamma model sdmTMB, component fit separately combined. approach gives maximum flexibility model lets develop time. limitations calculating index abundance want share parameters. necessary use mesh models, one can updating first mesh match reduced data frame shown : delta-gamma model similar Tweedie model Intro modelling sdmTMB vignette, except use s() depth effect. One can use different covariates model, case just let depth effect wiggly specifying k = 3. Next, need way combining predictions across two models. need point predictions, can just multiply predictions two models applying inverse link: measure uncertainty required, can simulate joint parameter precision matrix using predict() function number simulations selected (e.g., sims = 500). predictions come simulated draws parameter covariance matrix, predictions become consistent larger number draws. 
However, greater number draws takes longer calculate use memory (larger matrix), fewer draws (~100) may fine experimentation. larger number (say ~1000) may appropriate final model runs. p_combined_sim matrix row row data predicted width nsim. can process matrix however like. can save median predictions upper lower 95% confidence intervals: can calculate spatial uncertainty:","code":"glimpse(pcod) #> Rows: 2,143 #> Columns: 12 #> $ year 2003, 2003, 2003, 2003, 2003, 2003, 2003, 2003, 2003, 20… #> $ X 446.4752, 446.4594, 448.5987, 436.9157, 420.6101, 417.71… #> $ Y 5793.426, 5800.136, 5801.687, 5802.305, 5771.055, 5772.2… #> $ depth 201, 212, 220, 197, 256, 293, 410, 387, 285, 270, 381, 1… #> $ density 113.138476, 41.704922, 0.000000, 15.706138, 0.000000, 0.… #> $ present 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0,… #> $ lat 52.28858, 52.34890, 52.36305, 52.36738, 52.08437, 52.094… #> $ lon -129.7847, -129.7860, -129.7549, -129.9265, -130.1586, -… #> $ depth_mean 5.155194, 5.155194, 5.155194, 5.155194, 5.155194, 5.1551… #> $ depth_sd 0.4448783, 0.4448783, 0.4448783, 0.4448783, 0.4448783, 0… #> $ depth_scaled 0.3329252, 0.4526914, 0.5359529, 0.2877417, 0.8766077, 1… #> $ depth_scaled2 0.11083919, 0.20492947, 0.28724555, 0.08279527, 0.768440… mesh1 <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 20) # coarse for vignette speed dat2 <- subset(pcod, density > 0) mesh2 <- make_mesh(dat2, xy_cols = c(\"X\", \"Y\"), mesh = mesh1$mesh ) m1 <- sdmTMB( formula = present ~ 0 + as.factor(year) + s(depth, k = 3), data = pcod, mesh = mesh1, time = \"year\", family = binomial(link = \"logit\"), spatiotemporal = \"iid\", spatial = \"on\" ) m1 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: present ~ 0 + as.factor(year) + s(depth, k = 3) #> Mesh: mesh1 (isotropic covariance) #> Time column: year #> Data: pcod #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> as.factor(year)2003 -0.76 0.42 #> as.factor(year)2004 -0.40 0.42 #> as.factor(year)2005 -0.42 0.42 #> as.factor(year)2007 -1.40 0.42 #> as.factor(year)2009 -1.16 0.42 #> as.factor(year)2011 -1.56 0.42 #> as.factor(year)2013 -0.38 0.42 #> as.factor(year)2015 -0.65 0.42 #> as.factor(year)2017 -1.56 0.42 #> sdepth -5.66 0.50 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 85.9 #> #> Matérn range: 28.70 #> Spatial SD: 1.89 #> Spatiotemporal IID SD: 0.90 #> ML criterion at convergence: 1054.414 #> #> See ?tidy.sdmTMB to extract these values as a data frame. m2 <- sdmTMB( formula = density ~ 0 + as.factor(year) + s(depth), data = dat2, mesh = mesh2, time = \"year\", family = Gamma(link = \"log\"), spatiotemporal = \"iid\", spatial = \"on\" ) m2 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) + s(depth) #> Mesh: mesh2 (isotropic covariance) #> Time column: year #> Data: dat2 #> Family: Gamma(link = 'log') #> #> coef.est coef.se #> as.factor(year)2003 4.06 0.20 #> as.factor(year)2004 4.22 0.20 #> as.factor(year)2005 4.19 0.20 #> as.factor(year)2007 3.38 0.20 #> as.factor(year)2009 3.71 0.21 #> as.factor(year)2011 4.51 0.21 #> as.factor(year)2013 4.02 0.19 #> as.factor(year)2015 4.13 0.20 #> as.factor(year)2017 3.84 0.22 #> sdepth -0.28 0.36 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 1.89 #> #> Dispersion parameter: 0.94 #> Matérn range: 0.01 #> Spatial SD: 727.28 #> Spatiotemporal IID SD: 2064.88 #> ML criterion at convergence: 5102.136 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! 
Check output of sanity().** pred <- grid_yrs # use the grid as template for saving our predictions p_bin <- predict(m1, newdata = grid_yrs) p_pos <- predict(m2, newdata = grid_yrs) p_bin_prob <- m1$family$linkinv(p_bin$est) p_pos_exp <- m2$family$linkinv(p_pos$est) pred$est_exp <- p_bin_prob * p_pos_exp set.seed(28239) p_bin_sim <- predict(m1, newdata = grid_yrs, nsim = 100) p_pos_sim <- predict(m2, newdata = grid_yrs, nsim = 100) p_bin_prob_sim <- m1$family$linkinv(p_bin_sim) p_pos_exp_sim <- m2$family$linkinv(p_pos_sim) p_combined_sim <- p_bin_prob_sim * p_pos_exp_sim pred$median <- apply(p_combined_sim, 1, median) plot(pred$est_exp, pred$median) ggplot(subset(pred, year == 2017), aes(X, Y, fill = median)) + geom_raster() + coord_fixed() + scale_fill_viridis_c(trans = \"sqrt\") pred$cv <- apply(p_combined_sim, 1, function(x) sd(x) / mean(x)) ggplot(subset(pred, year == 2017), aes(X, Y, fill = cv)) + # 2017 as an example geom_raster() + coord_fixed() + scale_fill_viridis_c(trans = \"log10\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/ggeffects.html","id":"example-with-pacific-cod-presence","dir":"Articles > Web_only","previous_headings":"","what":"Example with Pacific cod presence","title":"Visualizing marginal effects in sdmTMB models with ggeffects","text":"start, use Pacific cod example data. fit model fish presence/absence covariates depth fixed effect year using Tweedie distribution. can use ggeffects::ggeffect() see effect depth probability Pacific cod present. can control range interval depths predicted within function (e.g. [0:500 =1]). can also plot effects year. can add data points can also use ggeffect plot multiple variables listing terms = c(), first term listed indicating variable plotted x-axis, remaining listed terms (four total) indicating groups. Adding facet = TRUE show year separate plot, instead overlain one plot. Adding facet = TRUE show year separate plot, instead overlain one plot. can also use make ggplot plot calling ggeffects object dat data frame. Plotting using continuous response (density) rather presence-similar. instance: can fit model interaction two continuous variables: plotting two continuous variables, ggeffect() make non-target (2nd) variable discrete selecting different levels. specify levels rather letting ggeffect() choose , use brackets selected values within term list, instance","code":"pcod$fyear <- as.factor(pcod$year) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 20) fit <- sdmTMB(present ~ poly(depth, 2) + fyear, data = pcod, mesh = mesh, spatial = \"on\", family = binomial() ) g <- ggeffect(fit, \"depth [0:500 by=1]\") plot(g) g2 <- ggeffect(fit, \"fyear\") plot(g2) plot(g, add.data = TRUE) #> Warning: Argument `add.data` is deprecated and will be removed in the future. #> Please use `show_data` instead. #> Data points may overlap. Use the `jitter` argument to add some amount of #> random variation to the location of data points and avoid overplotting. 
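# The object fit3 used further below is not defined in this excerpt; a plausible
# sketch of a model with an interaction between two continuous predictors
# (the numeric_year construction is an assumption, not taken from the vignette):
pcod$numeric_year <- pcod$year - min(pcod$year) + 1
fit3 <- sdmTMB(
  present ~ depth_scaled * numeric_year,
  data = pcod,
  mesh = mesh,
  spatial = "on",
  family = binomial()
)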
dat <- ggeffect(fit, terms = c(\"depth [0:350 by=5]\", \"fyear\")) plot(dat) plot(dat, facet = TRUE) ggplot(dat, aes(x, predicted, colour = group)) + geom_line() g5 <- ggeffect(fit3, terms = c(\"depth_scaled [-3:2.7 by=0.01]\", \"numeric_year\")) plot(g5) plot(g5, facet = TRUE) g6 <- ggeffect(fit3, terms = c(\"depth_scaled [-3:2.7 by=0.01]\", \"numeric_year [1,7,15]\")) plot(g6)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"introduction","dir":"Articles > Web_only","previous_headings":"","what":"Introduction","title":"Poisson-link delta models","text":"Jim Thorson introduced Poisson-link delta model CJFAS paper (Thorson 2018). ’s useful model family dealing positive continuous data zeroes. can used circumstances use Tweedie traditional delta-gamma delta-lognormal model logit log links. Poisson-link delta formulation describes go linear predictors link space expected values entered data likelihood. likelihood first component Bernoulli (.e., binomial single trials) likelihood second component can gamma lognormal. sdmTMB, families specified delta_gamma(type = \"poisson-link\") delta_lognormal(type = \"poisson-link\").","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"the-model","dir":"Articles > Web_only","previous_headings":"","what":"The model","title":"Poisson-link delta models","text":"Thorson (2018) describes process model : \\[ \\begin{aligned} p &= 1 - \\exp(-n),\\\\ r &= \\frac{n w}{p}, \\end{aligned} \\] \\(p\\) represents encounter probability, \\(n\\) represents group numbers density, \\(r\\) represents positive catch rate (.e., catch rate given encounter), \\(w\\) represents weight per group. linear predictors represent log group numbers density (\\(\\log (n)\\)) log weight per group (\\(\\log (w)\\)): \\[ \\begin{aligned} \\log (n) &= \\boldsymbol{X_1} \\boldsymbol{\\beta_1} + \\ldots,\\\\ \\log (w) &= \\boldsymbol{X_2} \\boldsymbol{\\beta_2} + \\ldots, \\end{aligned} \\] \\(\\boldsymbol{X}\\)s represent design matrices, \\(\\boldsymbol{\\beta}\\)s represent coefficient vectors, \\(\\ldots\\) represents possible model components linear link space including random fields. transformed process equations generate \\(p\\) \\(r\\). model evaluates following likelihood components. encounter probability \\(p\\) \\[ \\operatorname{Bernoulli} (p), \\] equivalently \\[ \\operatorname{Binomial} (1, p), \\] positive rate \\(r\\), either gamma lognormal likelihood \\[ \\operatorname{Gamma} \\left( \\phi, \\frac{r}{\\phi} \\right), \\] \\(\\phi\\) represents gamma shape \\(r / \\phi\\) represents scale, \\[ \\operatorname{Lognormal} \\left( \\log (r) - \\frac{\\phi^2}{2}, \\phi^2 \\right), \\] \\(\\log (r) - \\frac{\\phi^2}{2}\\) represents bias-corrected log mean \\(\\phi\\) represents standard deviation log-space.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"data","dir":"Articles > Web_only","previous_headings":"An example","what":"Data","title":"Poisson-link delta models","text":"work Pacific Spiny Dogfish observations trawl survey west coast Vancouver Island, Canada. dataset dogfish contained within sdmTMB package.","code":"library(ggplot2) theme_set(theme_light()) library(dplyr) library(sdmTMB) dogfish$log_depth <- log(dogfish$depth) head(dogfish) #> # A tibble: 6 × 10 #> year longitude latitude X Y present catch_weight area_swept depth #> #> 1 2004 -125. 48.7 780. 5399. 1 22.7 0.103 73 #> 2 2004 -126. 48.2 735. 5346. 
0 0 0.103 455 #> 3 2004 -126. 48.3 738. 5355. 0 0 0.116 171 #> 4 2004 -126. 48.3 749. 5354. 1 221. 0.122 137 #> 5 2004 -126. 48.4 744. 5362. 1 440. 0.0964 140 #> 6 2004 -126. 48.4 737. 5362. 1 48.2 0.122 145 #> # ℹ 1 more variable: log_depth mesh <- make_mesh(dogfish, c(\"X\", \"Y\"), cutoff = 10)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"a-conventional-delta-gamma-model","dir":"Articles > Web_only","previous_headings":"An example","what":"A conventional delta-gamma model:","title":"Poisson-link delta models","text":"First, lets fit conventional delta-gamma model logit log links illustrate limitation. can show effect depth catch weight? one curve, two components use different links (logit + log), different catch-weight depth relationship depending expected catch weight coefficients (year also given point space spatial random field). example, curves look different years: one curve! depends value year every point space different random field value.","code":"fit_dg <- sdmTMB(catch_weight ~ 0 + as.factor(year) + s(log_depth), family = delta_gamma(), spatial = \"on\", mesh = mesh, data = dogfish, anisotropy = FALSE, reml = TRUE, offset = log(dogfish$area_swept), silent = FALSE ) sanity(fit_dg) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameters don't look unreasonably large nd <- expand.grid( log_depth = seq(min(dogfish$log_depth), max(dogfish$log_depth), length.out = 200), year = as.factor(unique(dogfish$year)) ) p <- predict(fit_dg, newdata = nd, re_form = NA) ggplot(p, aes(log_depth, log(plogis(est1) * exp(est2)), colour = year)) + geom_line() + ylab(\"log expected catch weight\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"a-poisson-link-delta-gamma-alternative","dir":"Articles > Web_only","previous_headings":"An example","what":"A Poisson-link-delta-gamma alternative","title":"Poisson-link delta models","text":"Instead, let’s fit Poisson-link delta-gamma model: make plot: Note lines now parallel. 
predictors shift curve affect shape combined prediction use log link.","code":"fit_dpg <- sdmTMB(catch_weight ~ 0 + as.factor(year) + s(log_depth), family = delta_gamma(type = \"poisson-link\"), spatial = \"on\", mesh = mesh, data = dogfish, anisotropy = TRUE, reml = TRUE, offset = log(dogfish$area_swept), silent = FALSE ) sanity(fit_dpg) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameters don't look unreasonably large summary(fit_dpg) #> Spatial model fit by REML ['sdmTMB'] #> Formula: catch_weight ~ 0 + as.factor(year) + s(log_depth) #> Mesh: mesh (anisotropic covariance) #> Data: dogfish #> Family: delta_gamma(link1 = 'log', link2 = 'log', type = 'poisson-link') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'log') #> coef.est coef.se #> as.factor(year)2004 1.96 0.77 #> as.factor(year)2006 2.89 0.77 #> as.factor(year)2008 2.57 0.76 #> as.factor(year)2010 2.43 0.77 #> as.factor(year)2012 1.94 0.76 #> as.factor(year)2014 1.93 0.76 #> as.factor(year)2016 2.20 0.76 #> as.factor(year)2018 2.03 0.76 #> as.factor(year)2021 1.18 0.76 #> as.factor(year)2022 1.75 0.76 #> slog_depth 0.15 0.54 #> #> Smooth terms: #> Std. Dev. #> sds(log_depth) 2.38 #> #> Matérn anisotropic range (spatial): 26.1 to 381.1 at 142 deg. #> Spatial SD: 1.18 #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> as.factor(year)2004 3.59 0.37 #> as.factor(year)2006 2.78 0.33 #> as.factor(year)2008 2.47 0.32 #> as.factor(year)2010 3.50 0.34 #> as.factor(year)2012 2.77 0.33 #> as.factor(year)2014 2.45 0.33 #> as.factor(year)2016 2.09 0.32 #> as.factor(year)2018 2.60 0.32 #> as.factor(year)2021 1.69 0.33 #> as.factor(year)2022 1.98 0.33 #> slog_depth -0.20 0.82 #> #> Smooth terms: #> Std. Dev. #> sds(log_depth) 5.09 #> #> Dispersion parameter: 0.59 #> Matérn anisotropic range (spatial): 4.2 to 60.8 at 142 deg. #> Spatial SD: 1.97 #> #> REML criterion at convergence: 5851.613 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> See ?plot_anisotropy to plot the anisotropic range. p_dpg <- predict(fit_dpg, newdata = nd, re_form = NA) ggplot(p_dpg, aes(log_depth, est1 + est2, colour = year)) + geom_line() + ylab(\"log expected catch weight\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"examining-the-model-components-and-how-they-combine","dir":"Articles > Web_only","previous_headings":"An example","what":"Examining the model components and how they combine","title":"Poisson-link delta models","text":"’ll make predictions across depths single year simplicity: can extract components theoretical deconstruction catch group numbers density, weight per group, encounter probability, positive rate: can come overall predictions two ways: n * w: group numbers \\(\\times\\) weight per group. equivalently exp(est1 + est2), est1 est2 linear predictors link space. p * r: encounter probability \\(\\times\\) positive catch rate. 
give identical answers: Let’s plot components possible combinations:","code":"nd2010 <- dplyr::filter(nd, year == 2010) p_pdg <- predict(fit_dpg, newdata = nd2010, re_form = NA) n <- exp(p_pdg$est1) w <- exp(p_pdg$est2) p <- 1 - exp(-n) r <- (n * w) / p lims <- c(0, max(p * r)) plot(n * w, p * r, xlim = lims, ylim = lims) abline(0, 1) g1 <- ggplot(p_pdg, aes(log_depth, n)) + geom_line() + ggtitle(\"Expected group density\") g2 <- ggplot(p_pdg, aes(log_depth, w)) + geom_line() + ggtitle(\"Expected weight per group\") g3 <- ggplot(p_pdg, aes(log_depth, p)) + geom_line() + ylim(0, 1) + ggtitle(\"Expected encounter probability\") g4 <- ggplot(p_pdg, aes(log_depth, r)) + geom_line() + ggtitle(\"Expected catch given encounter\") g5 <- ggplot(p_pdg, aes(log_depth, n * w)) + geom_line() + ggtitle(\"Expected catch\") g6 <- ggplot(p_pdg, aes(log_depth, p * r)) + geom_line() + ggtitle(\"Expected catch\") cowplot::plot_grid(g1, g2, g3, g4, g5, g6, ncol = 2)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-is-a-group-and-how-does-the-model-know-about-numbers","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What is a ‘group’ and how does the model know about numbers!?","title":"Poisson-link delta models","text":"model represents process groups fish (observations, course) encountered certain numbers (“group numbers density”) group certain weight (“weight per group”). theoretical construct reflecting model can best parse two components according properties observed data. Since predicted response value \\(n \\cdot w\\), can get value multiplying \\(n\\) dividing \\(w\\) amount. numbers less weight per group fewer numbers weight per group get place. However, balance two, given covariates random effects, best fits data entered likelihood \\(p\\) \\(r\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-is-the-connection-to-the-complementary-log-log-link-cloglog-and-the-poisson","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What is the connection to the complementary log-log link (‘cloglog’) and the Poisson?","title":"Poisson-link delta models","text":"Wikipedia: “cloglog model corresponds applications observe either zero events (e.g., defects) one , number events assumed follow Poisson distribution.” following based around entry. Poisson assumes probability observing zero given expected number \\(n\\) \\(\\exp(-n)\\). .e., \\[ \\mathrm{Pr}(0) = \\exp(-n) \\] \\(p\\) probability observing non-zero (.e., encounter), \\[ \\exp(-n) = 1 - p = \\mathrm{Pr}(0). \\] can re-arrange \\[ n = -\\log(1 - p). \\] since want linear predictor take values negative positive infinity (therefore keep predicted number densities positive exponentiating ), work log space: \\[ \\log (n) = \\log(-\\log(1 - p)). \\] right side known ‘cloglog’ link function. link following inverse link function: \\[ p = 1 - \\exp(-\\exp(\\log n)). \\] can check R: can see cloglog inverse link first part Poisson-link delta model. However, ‘trick’ group density \\(n\\) affects encounter probability \\(p\\) (shown ) positive catch rates \\(r\\): \\[ r = \\frac{n w}{p}. 
\\] , going linear predictors (log links) expected values going data likelihood (binomial + lognormal/gamma), first linear predictor plays double duty appears delta model expected values.","code":"p <- 0.78 log_n <- log(-log(1 - p)) log_n #> [1] 0.4148395 1 - exp(-exp(log_n)) #> [1] 0.78"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-are-the-advantages-of-using-such-a-model","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What are the advantages of using such a model?","title":"Poisson-link delta models","text":"poisson-link families flexibility delta model, practice often helpful better predictions. checked AIC, better yet, cross validation. Like delta models, decomposition process two theoretical parts. can help interpretation (can make things complicated interpret!). , parts can rearranged represent two ways thinking two components (\\(n\\) \\(w\\) \\(p\\) \\(r\\)). Compared traditional delta-gamma model logit log link, two linear predictors log links, coefficients interpretable multiplicative effects, effects can combined. E.g., coefficients can added log space predictions can added log space generate single response curve given predictor.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"where-does-the-offset-come-in","dir":"Articles > Web_only","previous_headings":"FAQ","what":"Where does the offset come in?","title":"Poisson-link delta models","text":"offset added linear predictor \\(\\log n\\), .e., exp(offset) * n. example \\(\\) represents area swept (measure effort) log area swept entered model ‘offset’, , \\[ p = 1 - \\exp(-\\cdot n). \\] Therefore, encounter probability goes 1 area swept (effort) underlying group numbers density increase towards infinity. gets carried positive rate \\(r\\) via \\(p\\) \\(r = nw/p\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"why-call-it-a-poisson-link","dir":"Articles > Web_only","previous_headings":"FAQ","what":"Why call it a Poisson-link?","title":"Poisson-link delta models","text":"can guess. cloglog function derived theory Poisson although cloglog function appears ‘Poisson-link’ model, interaction two model components makes different pure cloglog assume needed another name.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"these-equations-dont-look-anything-like-the-sdmtmb-source-code","dir":"Articles > Web_only","previous_headings":"FAQ","what":"These equations don’t look anything like the sdmTMB source code!","title":"Poisson-link delta models","text":"’s calculations done log space computational stability calculations fastest stable done slightly differently easily described equations .","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"residual-checking-with-worked-examples","dir":"Articles > Web_only","previous_headings":"","what":"Residual checking with worked examples","title":"Residual checking with sdmTMB","text":"start data simulated scratch. simulate NB2 negative binomial observation model, spatial random field, intercept, one predictor named ‘a1’ linear effect observed data. Next, fit model configurations various families predictors. first model use Poisson instead NB2. 2nd model match simulated data. third model missing ‘a1’ predictor. ’ll use penalized complexity (PC) prior Matérn parameters aid estimation. 
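As a quick numeric illustration of the offset behaviour described in the Poisson-link FAQ above (the values here are arbitrary):
n <- 0.5                 # group numbers density
a <- c(0.1, 1, 10, 100)  # area swept (effort); log(a) is what enters as the offset
1 - exp(-a * n)          # encounter probability: approx. 0.049, 0.393, 0.993, 1.000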
can see just looking fits Poisson model inflates spatial random field standard deviation (SD) compared truth. model missing ‘a1’ predictor lesser degree.","code":"library(sdmTMB) set.seed(1) predictor_dat <- data.frame(X = runif(1000), Y = runif(1000), a1 = rnorm(1000)) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, mesh = mesh, family = nbinom2(link = \"log\"), phi = 0.4, range = 0.4, sigma_O = 0.4, seed = 1, B = c(0.2, 0.8) # B0 = intercept, B1 = a1 slope ) pc <- pc_matern(range_gt = 0.1, sigma_lt = 1) fit_pois <- sdmTMB(observed ~ 1 + a1, data = dat, family = poisson(), mesh = mesh, priors = sdmTMBpriors(matern_s = pc)) fit_pois #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 + a1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: poisson(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.27 0.17 #> a1 0.82 0.02 #> #> Matérn range: 0.12 #> Spatial SD: 1.20 #> ML criterion at convergence: 2887.957 #> #> See ?tidy.sdmTMB to extract these values as a data frame. fit_nb2 <- update(fit_pois, family = nbinom2()) fit_nb2 #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 + a1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.52 0.14 #> a1 0.75 0.06 #> #> Dispersion parameter: 0.41 #> Matérn range: 0.29 #> Spatial SD: 0.42 #> ML criterion at convergence: 1735.452 #> #> See ?tidy.sdmTMB to extract these values as a data frame. fit_nb2_miss <- update(fit_nb2, formula. = observed ~ 1) fit_nb2_miss #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.77 0.14 #> #> Dispersion parameter: 0.30 #> Matérn range: 0.21 #> Spatial SD: 0.56 #> ML criterion at convergence: 1817.332 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"analytical-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"Analytical randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"randomized quantile residuals fixed effect MLEs (Maximum Likelihood Estimates) random effects taken single sample approximate distribution (details ): use randomized quantile approach Dunn Smyth (1996). also known PIT (probability-integral-transform) residuals. apply randomization integer response values, transform residuals using distribution function (e.g., pnorm()) reflect uniform(0, 1) distribution, transform values normal(0, 1) consistent model. 
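The general Dunn and Smyth construction can be sketched for a single integer observation; this only illustrates the idea (with an arbitrary Poisson mean) and is not sdmTMB's exact implementation, which is linked below:
y <- 3
mu <- 2.5
u <- runif(1, ppois(y - 1, mu), ppois(y, mu)) # randomize within the CDF step at y
qnorm(u) # roughly standard normal if the model is consistent with the data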
can see source code https://github.com/pbs-assess/sdmTMB/blob/master/R/residuals.R can see likely issues Poisson model tails.","code":"set.seed(123) rq_res <- residuals(fit_pois, type = \"mle-mvn\") rq_res <- rq_res[is.finite(rq_res)] # some Inf qqnorm(rq_res);abline(0, 1) set.seed(123) rq_res <- residuals(fit_nb2, type = \"mle-mvn\") qqnorm(rq_res);abline(0, 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"mcmc-based-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"MCMC-based randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"approach assumes observation random effects can approximated multivariate normal distribution. want relax assumption, can sample random effects MCMC fixed effects held MLEs. sdmTMBextra::predict_mle_mcmc() function sdmTMBextra.","code":"set.seed(123) samps <- sdmTMBextra::predict_mle_mcmc(fit_nb2, mcmc_iter = 800, mcmc_warmup = 400) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.000965 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 9.65 seconds. #> Chain 1: Adjust your expectations accordingly! #> Chain 1: #> Chain 1: #> Chain 1: Iteration: 1 / 800 [ 0%] (Warmup) #> Chain 1: Iteration: 80 / 800 [ 10%] (Warmup) #> Chain 1: Iteration: 160 / 800 [ 20%] (Warmup) #> Chain 1: Iteration: 240 / 800 [ 30%] (Warmup) #> Chain 1: Iteration: 320 / 800 [ 40%] (Warmup) #> Chain 1: Iteration: 400 / 800 [ 50%] (Warmup) #> Chain 1: Iteration: 401 / 800 [ 50%] (Sampling) #> Chain 1: Iteration: 480 / 800 [ 60%] (Sampling) #> Chain 1: Iteration: 560 / 800 [ 70%] (Sampling) #> Chain 1: Iteration: 640 / 800 [ 80%] (Sampling) #> Chain 1: Iteration: 720 / 800 [ 90%] (Sampling) #> Chain 1: Iteration: 800 / 800 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 6.699 seconds (Warm-up) #> Chain 1: 4.761 seconds (Sampling) #> Chain 1: 11.46 seconds (Total) #> Chain 1: mcmc_res <- residuals(fit_nb2, type = \"mle-mcmc\", mcmc_samples = samps) qqnorm(mcmc_res) abline(0, 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"simulation-based-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"Simulation-based randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"can also take simulations fitted model use simulation-based randomized quantile residuals: return matrix row represents row data column simulation draw: can look whether fitted models consistent observed number zeros: obviously zeros data simulated Poisson model NB2 model seems reasonable. Plot DHARMa residuals: also return DHARMa object, lets us use DHARMa tools: QQ residual plots clearly see evidence overdispersion compared Poisson. Note values clumping near 1.0 observed axis deviating downwards towards 0.0 observed. indicative many zeros variance scaling rapidly mean (resulting large outlying value) Poisson distribution. Lets try correct model: Everything looks fine. model missing predictor? plot right represents simulated residuals prediction without random effects, just intercept. Lets try plotting residuals missing predictor: can see trend residuals ‘a1’ since missed including model. can also see difference log likelihood AIC: AIC also supports including ‘a1’ predictor. 
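As a rough complement to the AIC comparison, the two nested fits can also be compared with a likelihood-ratio test. This is a sketch rather than part of the vignette; it assumes logLik() is available for these fits (which the AIC() call above implies) and uses df = 1 for the single added coefficient (5 - 4 df in the AIC table):
# Likelihood-ratio test for adding 'a1' (sketch; assumes logLik() works here):
ll_miss <- as.numeric(logLik(fit_nb2_miss))
ll_full <- as.numeric(logLik(fit_nb2))
pchisq(2 * (ll_full - ll_miss), df = 1, lower.tail = FALSE)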
help interpreting DHARMa residual plots, see vignette(\"DHARMa\", package=\"DHARMa\").","code":"s_pois <- simulate(fit_pois, nsim = 500, type = \"mle-mvn\") s_nb2_miss <- simulate(fit_nb2_miss, nsim = 500, type = \"mle-mvn\") s_nb2 <- simulate(fit_nb2, nsim = 500, type = \"mle-mvn\") dim(s_pois) #> [1] 1000 500 sum(dat$observed == 0) / length(dat$observed) #> [1] 0.527 sum(s_pois == 0)/length(s_pois) #> [1] 0.292788 sum(s_nb2 == 0)/length(s_nb2) #> [1] 0.524644 dharma_residuals(s_pois, fit_pois) r_pois <- dharma_residuals(s_pois, fit_pois, return_DHARMa = TRUE) plot(r_pois) DHARMa::testResiduals(r_pois) #> $uniformity #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: simulationOutput$scaledResiduals #> D = 0.23338, p-value < 2.2e-16 #> alternative hypothesis: two-sided #> #> #> $dispersion #> #> DHARMa nonparametric dispersion test via sd of residuals fitted vs. #> simulated #> #> data: simulationOutput #> dispersion = 8.6989, p-value < 2.2e-16 #> alternative hypothesis: two.sided #> #> #> $outliers #> #> DHARMa outlier test based on exact binomial test with approximate #> expectations #> #> data: simulationOutput #> outliers at both margin(s) = 111, observations = 1000, p-value < #> 2.2e-16 #> alternative hypothesis: true probability of success is not equal to 0.003992016 #> 95 percent confidence interval: #> 0.09219791 0.13212606 #> sample estimates: #> frequency of outliers (expected: 0.00399201596806387 ) #> 0.111 #> $uniformity #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: simulationOutput$scaledResiduals #> D = 0.23338, p-value < 2.2e-16 #> alternative hypothesis: two-sided #> #> #> $dispersion #> #> DHARMa nonparametric dispersion test via sd of residuals fitted vs. #> simulated #> #> data: simulationOutput #> dispersion = 8.6989, p-value < 2.2e-16 #> alternative hypothesis: two.sided #> #> #> $outliers #> #> DHARMa outlier test based on exact binomial test with approximate #> expectations #> #> data: simulationOutput #> outliers at both margin(s) = 111, observations = 1000, p-value < #> 2.2e-16 #> alternative hypothesis: true probability of success is not equal to 0.003992016 #> 95 percent confidence interval: #> 0.09219791 0.13212606 #> sample estimates: #> frequency of outliers (expected: 0.00399201596806387 ) #> 0.111 DHARMa::testSpatialAutocorrelation(r_pois, x = dat$X, y = dat$Y) #> #> DHARMa Moran's I test for distance-based autocorrelation #> #> data: r_pois #> observed = -0.0022978, expected = -0.0010010, sd = 0.0026264, p-value = #> 0.6215 #> alternative hypothesis: Distance-based autocorrelation DHARMa::testZeroInflation(r_pois) #> #> DHARMa zero-inflation test via comparison to expected zeros with #> simulation under H0 = fitted model #> #> data: simulationOutput #> ratioObsSim = 1.7999, p-value < 2.2e-16 #> alternative hypothesis: two.sided r_nb2 <- dharma_residuals(s_nb2, fit_nb2, return_DHARMa = TRUE) plot(r_nb2) DHARMa::testZeroInflation(r_nb2) #> #> DHARMa zero-inflation test via comparison to expected zeros with #> simulation under H0 = fitted model #> #> data: simulationOutput #> ratioObsSim = 1.0045, p-value = 0.908 #> alternative hypothesis: two.sided r_nb2_miss <- dharma_residuals(s_nb2_miss, fit_nb2_miss, return_DHARMa = TRUE) plot(r_nb2_miss) DHARMa::plotResiduals(r_nb2_miss, form = dat$a1) AIC(fit_nb2_miss, fit_nb2) #> df AIC #> fit_nb2_miss 4 3642.665 #> fit_nb2 5 3480.904"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"the-need-for-one-sample-residuals","dir":"Articles > 
Web_only","previous_headings":"","what":"The need for one-sample residuals","title":"Residual checking with sdmTMB","text":"used random effects drawn observed . approach described Waagepetersen (2006) summarized nicely unexported function oneSamplePosterior within TMB. Thygesen et al. (2017) also describes context one-sample MCMC residuals. show necessary. ’ll start simulating data Gaussian observation error spatial spatiotemporal random effects. use empirical Bayes (EB) random effect values (values random effects maximize log likelihood conditional estimated fixed effects), residuals look even though model perfectly matched simulated data: Indeed, test (incorrectly) rejects null hypothesis \\(r_1 \\sim \\operatorname{N}(0, 1)\\), calculated correctly know come \\(\\operatorname{N}(0, 1)\\) instead returned single sample assumed MVN random effect distribution, get ‘correct’ residuals: , (correctly) fail reject hypothesis \\(r_2 \\sim \\operatorname{N}(0, 1)\\). also sample observation random effects using MCMC (fixed effects still held MLEs), relaxes assumptions, much time intensive large models. gets us something similar (correctly) fail reject hypothesis \\(r_3 \\sim \\operatorname{N}(0, 1)\\). similar issue applies simulation-based quantile residuals, implemented DHARMa package. Instead can use draw random effects ‘posterior’ assuming MVN distribution. now looks correct. However, happens sample random effects simulation? get back something incorrect distribution comparison! , need single random effects sample per set simulations.","code":"set.seed(123) predictor_dat <- data.frame( X = runif(1000), Y = runif(1000), year = rep(1:5, each = 200) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) sim_dat <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.3, sigma_E = 0.3, phi = 0.1, sigma_O = 0.4, seed = 1, B = 0.2 # intercept ) fit <- sdmTMB(observed ~ 1, data = sim_dat, time = \"year\", mesh = mesh) set.seed(1) r1 <- residuals(fit, type = \"mle-eb\") qqnorm(r1);abline(0, 1) ks.test(r1, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r1 #> D = 0.058629, p-value = 0.002067 #> alternative hypothesis: two-sided set.seed(1) r2 <- residuals(fit, type = \"mle-mvn\") qqnorm(r2);abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.020639, p-value = 0.7879 #> alternative hypothesis: two-sided samp <- sdmTMBextra::predict_mle_mcmc(fit, mcmc_iter = 400, mcmc_warmup = 200) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.002334 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 23.34 seconds. #> Chain 1: Adjust your expectations accordingly! 
#> Chain 1: #> Chain 1: #> Chain 1: Iteration: 1 / 400 [ 0%] (Warmup) #> Chain 1: Iteration: 40 / 400 [ 10%] (Warmup) #> Chain 1: Iteration: 80 / 400 [ 20%] (Warmup) #> Chain 1: Iteration: 120 / 400 [ 30%] (Warmup) #> Chain 1: Iteration: 160 / 400 [ 40%] (Warmup) #> Chain 1: Iteration: 200 / 400 [ 50%] (Warmup) #> Chain 1: Iteration: 201 / 400 [ 50%] (Sampling) #> Chain 1: Iteration: 240 / 400 [ 60%] (Sampling) #> Chain 1: Iteration: 280 / 400 [ 70%] (Sampling) #> Chain 1: Iteration: 320 / 400 [ 80%] (Sampling) #> Chain 1: Iteration: 360 / 400 [ 90%] (Sampling) #> Chain 1: Iteration: 400 / 400 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 30.393 seconds (Warm-up) #> Chain 1: 29.823 seconds (Sampling) #> Chain 1: 60.216 seconds (Total) #> Chain 1: #> Warning: The largest R-hat is 1.08, indicating chains have not mixed. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#r-hat #> Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#bulk-ess #> Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#tail-ess r3 <- residuals(fit, type = \"mle-mcmc\", mcmc_samples = samp) qqnorm(r3);abline(0, 1) ks.test(r3, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r3 #> D = 0.024522, p-value = 0.5845 #> alternative hypothesis: two-sided set.seed(1) simulate(fit, nsim = 500, type = \"mle-eb\") |> dharma_residuals(fit) #> Warning: It is recommended to use `simulate.sdmTMB(fit, type = 'mle-mvn')` if simulating #> for DHARMa residuals. See the description in ?residuals.sdmTMB under the types #> of residuals section. set.seed(1) simulate(fit, nsim = 500, type = \"mle-mvn\") |> dharma_residuals(fit) set.seed(1) s <- replicate(200, simulate(fit, nsim = 1, type = \"mle-mvn\"), simplify = \"matrix\") attr(s, \"type\") <- \"mle-mvn\" dharma_residuals(s, fit)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-much-this-matters-depends-on-the-ratio-of-observation-error-variance-vs--random-effect-variance","dir":"Articles > Web_only","previous_headings":"","what":"How much this matters depends on the ratio of observation error variance vs. random effect variance","title":"Residual checking with sdmTMB","text":"Lets simulate data large observation error (phi , Gaussian error SD case) smaller levels random field variance (sigma_E sigma_O): Now, doesn’t really matter since ‘incorrect’ random effect distribution swamped observation error effect distribution. 
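To see roughly where a given model sits on this spectrum, the estimated observation error SD can be compared against the random-field SDs. A minimal sketch, assuming the fit2 object created in the code for this section and that tidy() reports phi, sigma_O, and sigma_E for it:
# Ratio of observation error SD to random-field SDs (sketch):
rp <- tidy(fit2, effects = "ran_pars")
phi_hat   <- rp$estimate[rp$term == "phi"]                      # observation SD
sigma_hat <- rp$estimate[rp$term %in% c("sigma_O", "sigma_E")]  # field SDs
phi_hat / sum(sigma_hat)
# Well above 1 here (roughly 0.5 / 0.2 given the simulation settings), so the
# choice among EB, one-sample MVN, and MCMC residuals matters little.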
Technically, first set ‘wrong’ second set ‘right’, functionally ’d come similar conclusion case.","code":"set.seed(123) sim_dat2 <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.3, sigma_E = 0.1, # smaller than before sigma_O = 0.1, # smaller than before phi = 0.5, # bigger than before seed = 1, B = 0.2 ) fit2 <- sdmTMB(observed ~ 1, data = sim_dat2, time = \"year\", mesh = mesh) sanity(fit2) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large set.seed(1) r1 <- residuals(fit2, type = \"mle-eb\") qqnorm(r1);abline(0, 1) ks.test(r1, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r1 #> D = 0.021085, p-value = 0.7656 #> alternative hypothesis: two-sided r2 <- residuals(fit2, type = \"mle-mvn\") qqnorm(r2);abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.016869, p-value = 0.9385 #> alternative hypothesis: two-sided"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"notes-on-uniform-vs--normal-quantile-residuals","dir":"Articles > Web_only","previous_headings":"","what":"Notes on uniform vs. normal quantile residuals","title":"Residual checking with sdmTMB","text":"randomized quantile residuals residuals.sdmTMB() returned normal(0, 1) model consistent data. DHARMa residuals, however, returned uniform(0, 1) circumstances. valid, use preference, ’s important appreciate changes appearance expected residuals. Analytical normal(0, 1): Analytical uniform(0, 1): Simulation-based uniform(0, 1): Simulation-based normal(0, 1): Conclusions: normal(0, 1) residuals probably familiar people. normal(0, 1) residuals put emphasis tails. good bad: ’s easier examine tail behaviour, can often look ‘’ even model fine (example) observations tails distribution definition rarely observed. Uniform(0, 1) residuals give data points equal visual weight emphasize consistency overall distribution rather tails. Either valid, need switch mindset expect see accordingly. 
example, poor tail behaviour may look like minor issue uniform residuals; conversely tails normal(0, 1) residuals unlikely ever look ‘perfect’ without large sample sizes.","code":"r2 <- residuals(fit2, type = \"mle-mvn\") hist(r2) qqnorm(r2) abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.025614, p-value = 0.528 #> alternative hypothesis: two-sided u <- pnorm(r2) hist(u) n <- length(u) m <- seq_len(n) / (n + 1) qqplot(m, u) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.025614, p-value = 0.528 #> alternative hypothesis: two-sided abline(0, 1) set.seed(1) s <- simulate(fit2, nsim = 500, type = \"mle-mvn\") |> dharma_residuals(fit2, return_DHARMa = TRUE) hist(s$scaledResiduals) u <- s$scaledResiduals m <- seq_len(length(u)) / (length(u)+ 1) qqplot(m, u) abline(0, 1) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.026, p-value = 0.5085 #> alternative hypothesis: two-sided set.seed(1) s <- simulate(fit2, nsim = 1500, type = \"mle-mvn\") |> dharma_residuals(fit2, return_DHARMa = TRUE) u <- s$scaledResiduals r <- qnorm(u) qqnorm(r) abline(0, 1) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.028, p-value = 0.4131 #> alternative hypothesis: two-sided"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-do-those-randomized-quantile-residuals-work","dir":"Articles > Web_only","previous_headings":"","what":"How do those randomized-quantile residuals work?","title":"Residual checking with sdmTMB","text":"Let’s work simple example gamma distribution: get quantile residuals, transform uniform(0, 1) using pgamma() (optionally) convert normal(0, 1) qnorm(): works distribution can define quantile function. integer values, need add randomization additional step. Let’s work Poisson sample: gamma example illustrated can use distribution function (pgamma()) take gamma-distributed variable turn uniform-distributed variable. Now need Poisson equivalent: ppois(). ppois(y) function gives us cumulative probability density value y. Say Poisson variable mean (lambda) 5 observation value 3. can calculate density value 3 : .e., : , naively apply ppois() observed values ’ll end discrete cumulative probabilities aren’t useful comparing continuous uniform. Instead, need get probability density value one observed “fill ” values observed value desired uniformly distributed samples. 
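Concretely, for the lambda = 5, y = 3 case above, the two bounds of that uniform draw are the CDF evaluated just below and at the observation (a small worked continuation; values from R):
lambda <- 5
ppois(2, lambda) # 0.1246520 = P(Y <= 2)
ppois(3, lambda) # 0.2650259 = P(Y <= 3)
# so the residual for y = 3 uses u ~ Uniform(0.1246520, 0.2650259), then qnorm(u)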
First, get density value observed value: can add randomization using runif() since expectation uniform distribution stage fill values : optionally apply qnorm():","code":"set.seed(1) mu <- rep(2, 500) phi <- 0.5 y <- rgamma(length(mu), shape = phi, scale = mu / phi) u <- pgamma(q = y, shape = phi, scale = mu / phi) hist(u) r <- qnorm(u) hist(r) set.seed(1) mu <- rep(2, 500) y <- rpois(length(mu), mu) lambda <- 5 ppois(3, lambda) #> [1] 0.2650259 dpois(0, lambda) + dpois(1, lambda) + dpois(2, lambda) + dpois(3, lambda) #> [1] 0.2650259 hist(ppois(y, mu)) a <- ppois(y - 1, mu) hist(a) b <- ppois(y, mu) hist(b) u <- runif(n = length(y), min = a, max = b) hist(u) r <- qnorm(u) hist(r)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-do-those-simulation-based-residuals-work","dir":"Articles > Web_only","previous_headings":"","what":"How do those simulation-based residuals work?","title":"Residual checking with sdmTMB","text":"DHARMa uses simulation-based quantile residuals. result, don’t need define quantile function analytically. , instead line like : simulate model repeatedly, see observation falls within simulated values, get quantile way. example, instead : :","code":"u <- pgamma(q = y, shape = phi, scale = mu / phi) pnorm(2.2, mean = 0.5, sd = 1) #> [1] 0.9554345 set.seed(1) s <- rnorm(1e6, 0.5, 1) # equivalent of simulate()ing from our model mean(s < 2.2) # equivalent of pnorm() #> [1] 0.955308"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"references","dir":"Articles > Web_only","previous_headings":"","what":"References","title":"Residual checking with sdmTMB","text":"Dunn, P.K., Smyth, G.K. 1996. Randomized Quantile Residuals. Journal Computational Graphical Statistics 5(3): 236–244. Waagepetersen, R. 2006. Simulation-Based Goodness--Fit Test Random Effects Generalized Linear Mixed Models. Scandinavian Journal Statistics 33(4): 721–731. Thygesen, U.H., Albertsen, C.M., Berg, C.W., Kristensen, K., Nielsen, . 2017. Validation ecological state space models using Laplace approximation. Environ Ecol Stat 24(2): 317–339.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/visreg.html","id":"delta-models","dir":"Articles > Web_only","previous_headings":"","what":"Delta models","title":"Visualizing sdmTMB conditional effects using visreg","text":"visreg() can also used plot output delta models sdmTMB() using similar code previous plots, using sdmTMB wrapper function visreg_delta() specifying model = 1 encounter (0 vs. non-zero) model model = 2 positive component model (e.g., Gamma, lognormal). example: Note plotting visreg_delta(), categorical variables like year need designated factor data frame, example fyear, rather model formula (e.g., + .factor(year)).","code":"fit_dg <- sdmTMB( density ~ s(depth_scaled, year, k = 8), data = pcod_2011, mesh = pcod_mesh_2011, spatial = \"off\", # for vignette speed family = delta_gamma() ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, gg = TRUE) #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, gg = TRUE) #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component."},{"path":"https://pbs-assess.github.io/sdmTMB/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Sean C. Anderson. Author, maintainer. 
Eric J. Ward. Author. Philina . English. Author. Lewis . K. Barnett. Author. James T. Thorson. Author, copyright holder. VAST author Joe Watson. Contributor. Censored Poisson Julia Indivero. Contributor. Vignette writing Jillian C. Dunic. Contributor. Cole C. Monnahan. Contributor, copyright holder. VAST contributor Mollie Brooks. Contributor, copyright holder. glmmTMB author Ben Bolker. Contributor, copyright holder. glmmTMB author Kasper Kristensen. Contributor, copyright holder. TMB/glmmTMB author Martin Maechler. Contributor, copyright holder. glmmTMB author Arni Magnusson. Contributor, copyright holder. glmmTMB author Hans J. Skaug. Contributor, copyright holder. glmmTMB author, SPDE barrier Anders Nielsen. Contributor, copyright holder. glmmTMB author Casper Berg. Contributor, copyright holder. glmmTMB author Koen van Bentham. Contributor, copyright holder. glmmTMB author Olav Nikolai Breivik. Contributor, copyright holder. SPDE barrier Simon Wood. Contributor, copyright holder. mgcv: smoother prediction Paul-Christian Bürkner. Contributor, copyright holder. brms: smoother matrix parsing Majesty King Right Canada, represented Minister Department Fisheries Oceans. Copyright holder.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Anderson SC, Ward EJ, English PA, Barnett LAK, Thorson JT (2024). “sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields.” bioRxiv, 2022.03.24.485545. doi:10.1101/2022.03.24.485545.","code":"@Article{, title = {sdmTMB: an R package for fast, flexible, and user-friendly generalized linear mixed effects models with spatial and spatiotemporal random fields}, author = {Sean C. Anderson and Eric J. Ward and Philina A. English and Lewis A. K. Barnett and James T. Thorson}, year = {2024}, journal = {bioRxiv}, volume = {2022.03.24.485545}, doi = {10.1101/2022.03.24.485545}, }"},{"path":"https://pbs-assess.github.io/sdmTMB/header.html","id":null,"dir":"","previous_headings":"","what":"sdmTMB ","title":"sdmTMB ","text":"Spatial spatiotemporal GLMMs TMB sdmTMB R package fits spatial spatiotemporal GLMMs (Generalized Linear Mixed Effects Models) using Template Model Builder (TMB), R-INLA, Gaussian Markov random fields. One common application species distribution models (SDMs). See documentation site preprint: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett, J.T. Thorson. 2024. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"sdmtmb-","dir":"","previous_headings":"","what":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Spatial spatiotemporal GLMMs TMB sdmTMB R package fits spatial spatiotemporal GLMMs (Generalized Linear Mixed Effects Models) using Template Model Builder (TMB), R-INLA, Gaussian Markov random fields. One common application species distribution models (SDMs). See documentation site preprint: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett, J.T. Thorson. 2024. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. 
bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"table-of-contents","dir":"","previous_headings":"","what":"Table of contents","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Installation Overview Getting help Citation Related software Basic use Time-varying coefficients Spatially varying coefficients (SVC) Random intercepts Breakpoint threshold effects Simulating data Sampling joint precision matrix Calculating uncertainty spatial predictions Cross validation Priors Bayesian MCMC sampling Stan Turning random fields Using custom fmesher mesh Barrier meshes","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB can installed CRAN: Assuming C++ compiler installed, development version can installed: extra utilities sdmTMBextra package. Importantly, recommended use optimized BLAS library, result major speed improvements TMB () models R (e.g., often 8-fold speed increases sdmTMB models). Suggested installation instructions Mac users, Linux users, Windows users. check ’ve successfully linked optimized BLAS, start new session run: result take fraction second (e.g., 0.03 s), multiple seconds.","code":"install.packages(\"sdmTMB\", dependencies = TRUE) # install.packages(\"remotes\") remotes::install_github(\"pbs-assess/sdmTMB\", dependencies = TRUE) m <- 1e4; n <- 1e3; k <- 3e2 X <- matrix(rnorm(m*k), nrow=m); Y <- matrix(rnorm(n*k), ncol=n) system.time(X %*% Y)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"overview","dir":"","previous_headings":"","what":"Overview","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Analyzing geostatistical data (coordinate-referenced observations underlying spatial process) becoming increasingly common ecology. sdmTMB implements geostatistical spatial spatiotemporal GLMMs using TMB model fitting R-INLA set SPDE (stochastic partial differential equation) matrices. One common application species distribution models (SDMs), hence package name. goal sdmTMB provide fast, flexible, user-friendly interface—similar popular R package glmmTMB—focus spatial spatiotemporal models SPDE approach. extend generalized linear mixed models (GLMMs) familiar ecologists include following optional features: spatial random fields spatiotemporal random fields may independent year modelled random walks autoregressive processes smooth terms covariates, using familiar s() notation mgcv breakpoint (hockey-stick) logistic covariates time-varying covariates (coefficients modelled random walks) spatially varying coefficient models (SVCs) interpolation forecasting missing future time slices wide range families: standard R families plus tweedie(), nbinom1(), nbinom2(), lognormal(), student(), plus truncated censored families delta/hurdle models including delta_gamma(), delta_lognormal(), delta_truncated_nbinom2() Estimation performed sdmTMB via maximum marginal likelihood objective function calculated TMB minimized R via stats::nlminb() random effects integrated via Laplace approximation. sdmTMB package also allows models passed Stan via tmbstan, allowing Bayesian model estimation. See ?sdmTMB ?predict.sdmTMB complete examples. 
Also see vignettes (‘Articles’) documentation site preprint appendices linked .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"getting-help","dir":"","previous_headings":"","what":"Getting help","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"questions use sdmTMB interpret models, please post discussion board. email question, likely respond discussion board anonymized version question (without data) think helpful others. Please let us know don’t want us . bugs feature requests, please post issue tracker. Slides recordings workshop sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"cite sdmTMB publications use: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett., J.T. Thorson. 2024. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545 list (known) publications use sdmTMB can found . Please use citation can track publications.","code":"citation(\"sdmTMB\")"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"related-software","dir":"","previous_headings":"","what":"Related software","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB heavily inspired VAST R package: Thorson, J.T. 2019. Guidance decisions using Vector Autoregressive Spatio-Temporal (VAST) package stock, ecosystem, habitat climate assessments. Fisheries Research 210: 143–161. https://doi.org/10.1016/j.fishres.2018.10.013. glmmTMB R package: Brooks, M.E., Kristensen, K., van Benthem, K.J., Magnusson, ., Berg, C.W., Nielsen, ., Skaug, H.J., Maechler, M., Bolker, B.M. 2017. glmmTMB balances speed flexibility among packages zero-inflated generalized linear mixed modeling. R Journal 9(2): 378–400. https://doi.org/10.32614/rj-2017-066. INLA inlabru can fit many models sdmTMB (many ) approximate Bayesian inference framework. mgcv can fit similar SPDE-based Gaussian random field models code included Miller et al. (2019). table sdmTMB preprint describes functionality timing comparisons sdmTMB, VAST, INLA/inlabru, mgcv discussion makes suggestions might choose one package another.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"basic-use","dir":"","previous_headings":"","what":"Basic use","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB model requires data frame contains response column, columns predictors, columns spatial coordinates. usually makes sense convert spatial coordinates equidistant projection UTMs distance remains constant throughout study region [e.g., using sf::st_transform()]. , illustrate spatial model fit Pacific cod (Gadus macrocephalus) trawl survey data Queen Charlotte Sound, BC, Canada. model contains main effect depth penalized smoother, spatial random field, Tweedie observation error. data frame pcod (built package) column year year survey, density density Pacific cod given survey tow, present whether density > 0, depth depth meters tow, spatial coordinates X Y, UTM coordinates kilometres. start creating mesh object contains matrices apply SPDE approach. , cutoff defines minimum allowed distance points units X Y (km). Alternatively, created mesh via fmesher INLA packages supplied make_mesh(). can inspect mesh object associated plotting method plot(mesh). 
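If your own data start as longitude/latitude rather than UTM kilometres, one convenient route before building the mesh is add_utm_columns() (a short sketch; the function is documented in the reference section of this site):
d <- data.frame(lat = c(52.1, 53.4), lon = c(-130.0, -131.4))
d <- add_utm_columns(d, c("lon", "lat")) # adds X and Y columns in km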
Fit spatial model smoother depth: Print model fit: output indicates model fit maximum (marginal) likelihood (ML). also see formula, mesh, fitted data, family. Next see estimated main effects including linear component smoother (sdepth), standard deviation smoother weights (sds(depth)), Tweedie dispersion power parameters, Matérn range distance (distance points effectively independent), marginal spatial field standard deviation, negative log likelihood convergence. can extract parameters data frame: Run basic sanity checks model: Use ggeffects package plot smoother effect: depth effect parametric penalized smoother, alternatively used ggeffects::ggeffect() fast marginal effect plot. Predict new data: switch presence-absence model changing response column family: hurdle/delta model changing family: instead fit spatiotemporal model specifying time column spatiotemporal structure: wanted create area-weighted standardized population index, predict grid covering entire survey (qcs_grid) grid cell area 4 (2 x 2 km) pass predictions get_index(): center gravity: basic features, see vignettes Intro modelling sdmTMB Index standardization sdmTMB.","code":"library(dplyr) library(ggplot2) library(sdmTMB) head(pcod) #> # A tibble: 3 × 6 #> year density present depth X Y #> #> 1 2003 113. 1 201 446. 5793. #> 2 2003 41.7 1 212 446. 5800. #> 3 2003 0 0 220 449. 5802. mesh <- make_mesh(pcod, xy_cols = c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( density ~ s(depth), data = pcod, mesh = mesh, family = tweedie(link = \"log\"), spatial = \"on\" ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.37 0.21 #> sdepth 0.62 2.53 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.93 #> #> Dispersion parameter: 12.69 #> Tweedie p: 1.58 #> Matérn range: 16.39 #> Spatial SD: 1.86 #> ML criterion at convergence: 6402.136 #> #> See ?tidy.sdmTMB to extract these values as a data frame. tidy(fit, conf.int = TRUE) #> # A tibble: 1 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 2.37 0.215 1.95 2.79 tidy(fit, effects = \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 16.4 4.47 9.60 28.0 #> 2 phi 12.7 0.406 11.9 13.5 #> 3 sigma_O 1.86 0.218 1.48 2.34 #> 4 tweedie_p 1.58 0.00998 1.56 1.60 sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large ggeffects::ggpredict(fit, \"depth [50:400, by=2]\") |> plot() p <- predict(fit, newdata = qcs_grid) head(p) #> # A tibble: 3 × 7 #> X Y depth est est_non_rf est_rf omega_s #> #> 1 456 5636 347. -3.06 -3.08 0.0172 0.0172 #> 2 458 5636 223. 2.03 1.99 0.0460 0.0460 #> 3 460 5636 204. 
2.89 2.82 0.0747 0.0747 ggplot(p, aes(X, Y, fill = exp(est))) + geom_raster() + scale_fill_viridis_c(trans = \"sqrt\") fit <- sdmTMB( present ~ s(depth), data = pcod, mesh = mesh, family = binomial(link = \"logit\") ) fit <- sdmTMB( density ~ s(depth), data = pcod, mesh = mesh, family = delta_gamma(link1 = \"logit\", link2 = \"log\"), ) fit_spatiotemporal <- sdmTMB( density ~ s(depth, k = 5), data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatial = \"off\", spatiotemporal = \"ar1\" ) grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) p_st <- predict(fit_spatiotemporal, newdata = grid_yrs, return_tmb_object = TRUE) index <- get_index(p_st, area = rep(4, nrow(grid_yrs))) ggplot(index, aes(year, est)) + geom_ribbon(aes(ymin = lwr, ymax = upr), fill = \"grey90\") + geom_line(lwd = 1, colour = \"grey30\") + labs(x = \"Year\", y = \"Biomass (kg)\") cog <- get_cog(p_st, format = \"wide\") ggplot(cog, aes(est_x, est_y, colour = year)) + geom_pointrange(aes(xmin = lwr_x, xmax = upr_x)) + geom_pointrange(aes(ymin = lwr_y, ymax = upr_y)) + scale_colour_viridis_c()"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"time-varying-coefficients","dir":"","previous_headings":"Advanced functionality","what":"Time-varying coefficients","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Time-varying intercept: Time-varying (random walk) effect depth: See vignette Intro modelling sdmTMB details.","code":"fit <- sdmTMB( density ~ 0 + s(depth, k = 5), time_varying = ~ 1, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), silent = FALSE # see progress ) fit <- sdmTMB( density ~ 1, time_varying = ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatial = \"off\", spatiotemporal = \"ar1\", silent = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"spatially-varying-coefficients-svc","dir":"","previous_headings":"Advanced functionality","what":"Spatially varying coefficients (SVC)","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Spatially varying effect time: See zeta_s output, represents coefficient varying space. ’ll want ensure set model ballpark mean 0 (e.g., including formula ). 
See vignette Fitting spatial trend models sdmTMB details.","code":"pcod$year_scaled <- as.numeric(scale(pcod$year)) fit <- sdmTMB( density ~ s(depth, k = 5) + year_scaled, spatial_varying = ~ year_scaled, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatiotemporal = \"off\" ) grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) grid_yrs$year_scaled <- (grid_yrs$year - mean(pcod$year)) / sd(pcod$year) p <- predict(fit, newdata = grid_yrs) %>% subset(year == 2011) # any year ggplot(p, aes(X, Y, fill = zeta_s_year_scaled)) + geom_raster() + scale_fill_gradient2()"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"random-intercepts","dir":"","previous_headings":"Advanced functionality","what":"Random intercepts","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can use syntax (1 | group) lme4 glmmTMB fit random intercepts:","code":"pcod$year_factor <- as.factor(pcod$year) fit <- sdmTMB( density ~ s(depth, k = 5) + (1 | year_factor), data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\") )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"breakpoint-and-threshold-effects","dir":"","previous_headings":"Advanced functionality","what":"Breakpoint and threshold effects","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"See vignette Threshold modeling sdmTMB details.","code":"fit <- sdmTMB( present ~ 1 + breakpt(depth_scaled), data = pcod, mesh = mesh, family = binomial(link = \"logit\") ) fit <- sdmTMB( present ~ 1 + logistic(depth_scaled), data = pcod, mesh = mesh, family = binomial(link = \"logit\") )"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"simulating-data-from-scratch","dir":"","previous_headings":"Advanced functionality > Simulating data","what":"Simulating data from scratch","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Fit simulated data: See ?sdmTMB_simulate details.","code":"predictor_dat <- expand.grid( X = seq(0, 1, length.out = 100), Y = seq(0, 1, length.out = 100) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.05) sim_dat <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, mesh = mesh, family = poisson(link = \"log\"), range = 0.3, sigma_O = 0.4, seed = 1, B = 1 # B0 = intercept ) head(sim_dat) #> # A tibble: 6 × 7 #> X Y omega_s mu eta observed `(Intercept)` #> #> 1 0 0 -0.154 2.33 0.846 1 1 #> 2 0.0101 0 -0.197 2.23 0.803 0 1 #> 3 0.0202 0 -0.240 2.14 0.760 2 1 #> 4 0.0303 0 -0.282 2.05 0.718 2 1 #> 5 0.0404 0 -0.325 1.96 0.675 3 1 #> 6 0.0505 0 -0.367 1.88 0.633 2 1 # sample 200 points for fitting: set.seed(1) sim_dat_obs <- sim_dat[sample(seq_len(nrow(sim_dat)), 200), ] ggplot(sim_dat, aes(X, Y)) + geom_raster(aes(fill = exp(eta))) + # mean without observation error geom_point(aes(size = observed), data = sim_dat_obs, pch = 21) + scale_fill_viridis_c() + scale_size_area() + coord_cartesian(expand = FALSE) mesh <- make_mesh(sim_dat_obs, xy_cols = c(\"X\", \"Y\"), cutoff = 0.05) fit <- sdmTMB( observed ~ 1, data = sim_dat_obs, mesh = mesh, family = poisson() )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"simulating-from-an-existing-fit","dir":"","previous_headings":"Advanced functionality > Simulating data","what":"Simulating from an existing fit","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"See vignette Residual checking sdmTMB, ?simulate.sdmTMB, ?dharma_residuals details.","code":"s <- simulate(fit, nsim = 500) 
dim(s) #> [1] 969 500 s[1:3,1:4] #> [,1] [,2] [,3] [,4] #> [1,] 0 59.40310 83.20888 0.00000 #> [2,] 0 34.56408 0.00000 19.99839 #> [3,] 0 0.00000 0.00000 0.00000"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"sampling-from-the-joint-precision-matrix","dir":"","previous_headings":"Advanced functionality","what":"Sampling from the joint precision matrix","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can take samples implied parameter distribution assuming MVN covariance matrix internal parameterization: See ?gather_sims ?get_index_sims details.","code":"samps <- gather_sims(fit, nsim = 1000) ggplot(samps, aes(.value)) + geom_histogram() + facet_wrap(~.variable, scales = \"free_x\") #> `stat_bin()` using `bins = 30`. Pick better value with `binwidth`."},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"calculating-uncertainty-on-spatial-predictions","dir":"","previous_headings":"Advanced functionality","what":"Calculating uncertainty on spatial predictions","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"fastest way get point-wise prediction uncertainty use MVN samples:","code":"p <- predict(fit, newdata = predictor_dat, nsim = 500) predictor_dat$se <- apply(p, 1, sd) ggplot(predictor_dat, aes(X, Y, fill = se)) + geom_raster() + scale_fill_viridis_c(option = \"A\") + coord_cartesian(expand = FALSE)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"cross-validation","dir":"","previous_headings":"Advanced functionality","what":"Cross validation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB built-functionality cross-validation. set future::plan(), folds fit parallel: See ?sdmTMB_cv details.","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 10) ## Set parallel processing if desired: # library(future) # plan(multisession) m_cv <- sdmTMB_cv( density ~ s(depth, k = 5), data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Sum of log likelihoods of left-out data: m_cv$sum_loglik #> [1] -6756.28"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"priors","dir":"","previous_headings":"Advanced functionality","what":"Priors","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Priors/penalties can placed parameters. example, place PC (penalized complexity) prior Matérn random field parameters, standard normal prior effect depth, Normal(0, 10^2) prior intercept, half-normal prior Tweedie dispersion parameter (phi): can visualize PC Matérn prior: See ?sdmTMBpriors details.","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( density ~ depth_scaled, data = pcod, mesh = mesh, family = tweedie(), priors = sdmTMBpriors( matern_s = pc_matern(range_gt = 10, sigma_lt = 5), b = normal(c(0, 0), c(1, 10)), phi = halfnormal(0, 15) ) ) plot_pc_matern(range_gt = 10, sigma_lt = 5)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"bayesian-mcmc-sampling-with-stan","dir":"","previous_headings":"Advanced functionality","what":"Bayesian MCMC sampling with Stan","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"fitted model can passed tmbstan package sample posterior Stan. 
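A minimal sketch of that hand-off (see the Bayesian vignette for the full, supported workflow; this assumes the underlying TMB object is stored in fit$tmb_obj):
library(tmbstan)
fit_stan <- tmbstan(fit$tmb_obj, chains = 4, iter = 2000) # NUTS sampling of the joint posterior
fit_stan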
See Bayesian vignette.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"turning-off-random-fields","dir":"","previous_headings":"Advanced functionality","what":"Turning off random fields","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can turn random fields model comparison:","code":"fit_sdmTMB <- sdmTMB( present ~ poly(depth_scaled, 2), data = pcod, mesh = mesh, spatial = \"off\", family = binomial() ) fit_glm <- glm( present ~ poly(depth_scaled, 2), data = pcod, family = binomial() ) tidy(fit_sdmTMB) #> # A tibble: 3 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.426 0.0573 #> 2 poly(depth_scaled, 2)1 -31.7 3.03 #> 3 poly(depth_scaled, 2)2 -66.9 4.09 broom::tidy(fit_glm) #> # A tibble: 3 × 5 #> term estimate std.error statistic p.value #> #> 1 (Intercept) -0.426 0.0573 -7.44 1.03e-13 #> 2 poly(depth_scaled, 2)1 -31.7 3.03 -10.5 1.20e-25 #> 3 poly(depth_scaled, 2)2 -66.9 4.09 -16.4 3.50e-60"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"using-a-custom-fmesher-mesh","dir":"","previous_headings":"Advanced functionality","what":"Using a custom fmesher mesh","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Defining mesh directly INLA:","code":"bnd <- INLA::inla.nonconvex.hull(cbind(pcod$X, pcod$Y), convex = -0.1) mesh_inla <- INLA::inla.mesh.2d( boundary = bnd, max.edge = c(25, 50) ) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), mesh = mesh_inla) plot(mesh) fit <- sdmTMB( density ~ s(depth, k = 5), data = pcod, mesh = mesh, family = tweedie(link = \"log\") )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"barrier-meshes","dir":"","previous_headings":"Advanced functionality","what":"Barrier meshes","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"barrier mesh limits correlation across barriers (e.g., land water). See add_barrier_mesh() sdmTMBextra.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate effects — Effect.sdmTMB","title":"Calculate effects — Effect.sdmTMB","text":"Used effects package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate effects — Effect.sdmTMB","text":"","code":"Effect.sdmTMB(focal.predictors, mod, ...)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate effects — Effect.sdmTMB","text":"focal.predictors character vector one predictors model order. mod regression model object. specific method exists class mod, Effect.default called. ... arguments passed .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate effects — Effect.sdmTMB","text":"Output effects::effect(). 
Can plotted associated plot() method.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate effects — Effect.sdmTMB","text":"","code":"fit <- sdmTMB(present ~ depth_scaled, data = pcod_2011, family = binomial(), spatial = \"off\") effects::effect(\"depth_scaled\", fit) #> #> depth_scaled effect #> depth_scaled #> -3 -2 -0.1 1 3 #> 0.7511661 0.6634557 0.4673280 0.3544365 0.1897228 plot(effects::effect(\"depth_scaled\", fit))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":null,"dir":"Reference","previous_headings":"","what":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"Moved sdmTMBextra package. Make sure load sdmTMBextra sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"","code":"add_barrier_mesh( spde_obj = deprecated(), barrier_sf = deprecated(), range_fraction = 0.2, proj_scaling = 1, plot = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"spde_obj Output make_mesh(). barrier_sf sf object polygons defining barriers. example, coastline dataset ocean data. Note object must projection data used generate x y columns spde_obj. range_fraction fraction spatial range barrier triangles . proj_scaling spde_obj created scaling coordinates projection (e.g., dividing UTMs 1000 spatial range reasonable scale) x y values spde_obj multiplied scaling factor applying projection barrier_sf. plot Logical.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"Deprecated. See sdmTMBextra package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"","code":"if (FALSE) { add_barrier_mesh() }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Add UTM coordinates to a data frame — add_utm_columns","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"Add UTM (Universal Transverse Mercator) coordinates data frame. useful since geostatistical modeling generally performed equal-distance projection. 
can separately sf::st_as_sf(), sf::st_transform(), sf::st_coordinates() functions sf package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"","code":"add_utm_columns( dat, ll_names = c(\"longitude\", \"latitude\"), ll_crs = 4326, utm_names = c(\"X\", \"Y\"), utm_crs = get_crs(dat, ll_names), units = c(\"km\", \"m\") ) get_crs(dat, ll_names = c(\"longitude\", \"latitude\"))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"dat Data frame contains longitude latitude columns. ll_names Longitude latitude column names. Note order. ll_crs Input CRS value ll_names. utm_names Output column names UTM columns. utm_crs Output CRS value UTM zone; tries detect get_crs() can specified manually. units UTM units.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"copy input data frame new columns UTM coordinates.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"Note longitudes west prime meridian encoded running -180 0 degrees. may wish work km's rather standard UTM meters range parameter estimate small, can cause computational issues. depends scale data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"","code":"d <- data.frame(lat = c(52.1, 53.4), lon = c(-130.0, -131.4)) get_crs(d, c(\"lon\", \"lat\")) #> Detected UTM zone 9N; CRS = 32609. #> Visit https://epsg.io/32609 to verify. #> [1] 32609 add_utm_columns(d, c(\"lon\", \"lat\")) #> Detected UTM zone 9N; CRS = 32609. #> Visit https://epsg.io/32609 to verify. #> lat lon X Y #> 1 52.1 -130.0 431.5034 5772.632 #> 2 53.4 -131.4 340.4411 5919.452"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":null,"dir":"Reference","previous_headings":"","what":"DHARMa residuals — dharma_residuals","title":"DHARMa residuals — dharma_residuals","text":"Plot (possibly return) DHARMa residuals. wrapper function around DHARMa::createDHARMa() facilitate use sdmTMB() models. Note: recommended set type = \"mle-mvn\" simulate.sdmTMB() resulting residuals expected distribution. default.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"DHARMa residuals — dharma_residuals","text":"","code":"dharma_residuals( simulated_response, object, plot = TRUE, return_DHARMa = FALSE, test_uniformity = TRUE, test_outliers = FALSE, test_dispersion = FALSE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"DHARMa residuals — dharma_residuals","text":"simulated_response Output simulate.sdmTMB(). 
recommended set type = \"mle-mvn\" call simulate.sdmTMB() residuals expected distribution. object Output sdmTMB(). plot Logical. Calls DHARMa::plotQQunif(). return_DHARMa Logical. Return object DHARMa::createDHARMa()? test_uniformity Passed testUniformity DHARMa::plotQQunif(). test_outliers Passed testOutliers DHARMa::plotQQunif(). test_dispersion Passed testDispersion DHARMa::plotQQunif(). ... arguments pass DHARMa::createDHARMa().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"DHARMa residuals — dharma_residuals","text":"data frame observed expected values invisibly returned can assign output object plot residuals . See examples. return_DHARMa = TRUE, object DHARMa::createDHARMa() returned subsequent DHARMa functions can applied.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"DHARMa residuals — dharma_residuals","text":"See residuals vignette. Advantages residuals ones residuals.sdmTMB() method (1) work delta/hurdle models combined predictions, just two parts separately, (2) work families, just families worked analytical quantile function, (3) can used various diagnostic tools plots DHARMa package. Disadvantages (1) slower calculate since one must first simulate model, (2) stability distribution residuals depends sufficient number simulation draws, (3) uniformly distributed residuals put less emphasis tails visually (may desired). Note DHARMa returns residuals uniform(0, 1) data consistent model whereas randomized quantile residuals residuals.sdmTMB() expected normal(0, 1). experimental option expected_distribution included transform distributions normal(0, 1) expectation.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"DHARMa residuals — dharma_residuals","text":"","code":"# Try Tweedie family: fit <- sdmTMB(density ~ as.factor(year) + s(depth, k = 3), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), spatial = \"on\") # The `simulated_response` argument is first so the output from # simulate() can be piped to `dharma_residuals()`. # We will work with 100 simulations for fast examples, but you'll # likely want to work with more than this (enough that the results # are stable from run to run). 
# not great: set.seed(123) simulate(fit, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit) # \\donttest{ # delta-lognormal looks better: set.seed(123) fit_dl <- update(fit, family = delta_lognormal()) simulate(fit_dl, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit) # or skip the pipe: set.seed(123) s <- simulate(fit_dl, nsim = 100, type = \"mle-mvn\") # and manually plot it: r <- dharma_residuals(s, fit_dl, plot = FALSE) head(r) #> observed expected #> 1 0.0002512885 0.001030928 #> 2 0.0003998549 0.002061856 #> 3 0.0017916820 0.003092784 #> 4 0.0030243763 0.004123711 #> 5 0.0032777465 0.005154639 #> 6 0.0060709249 0.006185567 plot(r$expected, r$observed) abline(0, 1) # return the DHARMa object and work with the DHARMa methods ret <- simulate(fit_dl, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit, return_DHARMa = TRUE) plot(ret) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"Methods using emmeans package sdmTMB. emmeans package computes estimated marginal means fixed effects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"https://aosmith.rbind.io/2019/03/25/getting-started--emmeans/","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 20) fit <- sdmTMB( present ~ as.factor(year), data = pcod_2011, mesh = mesh, family = binomial() ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: present ~ as.factor(year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.61 0.55 #> as.factor(year)2013 0.99 0.22 #> as.factor(year)2015 0.75 0.22 #> as.factor(year)2017 0.01 0.22 #> #> Matérn range: 48.45 #> Spatial SD: 1.84 #> ML criterion at convergence: 564.495 #> #> See ?tidy.sdmTMB to extract these values as a data frame. emmeans::emmeans(fit, ~ year) #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 emmeans::emmeans(fit, pairwise ~ year) #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. 
#> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -4.467 0.0001 #> year2011 - year2015 -0.7496 0.220 963 -3.404 0.0039 #> year2011 - year2017 -0.0127 0.221 963 -0.057 0.9999 #> year2013 - year2015 0.2405 0.217 963 1.110 0.6837 #> year2013 - year2017 0.9774 0.223 963 4.390 0.0001 #> year2015 - year2017 0.7369 0.222 963 3.320 0.0052 #> #> Results are given on the log odds ratio (not the response) scale. #> P value adjustment: tukey method for comparing a family of 4 estimates #> emmeans::emmeans(fit, pairwise ~ year, type = \"response\") #> $emmeans #> year prob SE df lower.CL upper.CL #> 2011 0.353 0.126 963 0.156 0.617 #> 2013 0.595 0.133 963 0.333 0.812 #> 2015 0.536 0.137 963 0.281 0.773 #> 2017 0.356 0.126 963 0.158 0.619 #> #> Confidence level used: 0.95 #> Intervals are back-transformed from the logit scale #> #> $contrasts #> contrast odds.ratio SE df null t.ratio p.value #> year2011 / year2013 0.372 0.0823 963 1 -4.467 0.0001 #> year2011 / year2015 0.473 0.1041 963 1 -3.404 0.0039 #> year2011 / year2017 0.987 0.2182 963 1 -0.057 0.9999 #> year2013 / year2015 1.272 0.2757 963 1 1.110 0.6837 #> year2013 / year2017 2.658 0.5917 963 1 4.390 0.0001 #> year2015 / year2017 2.089 0.4638 963 1 3.320 0.0052 #> #> P value adjustment: tukey method for comparing a family of 4 estimates #> Tests are performed on the log odds ratio scale #> emmeans::emmeans(fit, pairwise ~ year, adjust = \"none\") #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -4.467 <.0001 #> year2011 - year2015 -0.7496 0.220 963 -3.404 0.0007 #> year2011 - year2017 -0.0127 0.221 963 -0.057 0.9543 #> year2013 - year2015 0.2405 0.217 963 1.110 0.2675 #> year2013 - year2017 0.9774 0.223 963 4.390 <.0001 #> year2015 - year2017 0.7369 0.222 963 3.320 0.0009 #> #> Results are given on the log odds ratio (not the response) scale. #> e <- emmeans::emmeans(fit, ~ year) plot(e) e <- emmeans::emmeans(fit, pairwise ~ year) confint(e) #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df lower.CL upper.CL #> year2011 - year2013 -0.9901 0.222 963 -1.560 -0.420 #> year2011 - year2015 -0.7496 0.220 963 -1.316 -0.183 #> year2011 - year2017 -0.0127 0.221 963 -0.581 0.556 #> year2013 - year2015 0.2405 0.217 963 -0.317 0.798 #> year2013 - year2017 0.9774 0.223 963 0.404 1.550 #> year2015 - year2017 0.7369 0.222 963 0.166 1.308 #> #> Results are given on the log odds ratio (not the response) scale. 
#> Confidence level used: 0.95 #> Conf-level adjustment: tukey method for comparing a family of 4 estimates #> summary(e, infer = TRUE) #> $emmeans #> year emmean SE df lower.CL upper.CL t.ratio p.value #> 2011 -0.606 0.551 963 -1.688 0.475 -1.100 0.2715 #> 2013 0.384 0.550 963 -0.695 1.463 0.698 0.4854 #> 2015 0.143 0.551 963 -0.937 1.224 0.260 0.7948 #> 2017 -0.594 0.551 963 -1.674 0.487 -1.078 0.2812 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df lower.CL upper.CL t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -1.560 -0.420 -4.467 0.0001 #> year2011 - year2015 -0.7496 0.220 963 -1.316 -0.183 -3.404 0.0039 #> year2011 - year2017 -0.0127 0.221 963 -0.581 0.556 -0.057 0.9999 #> year2013 - year2015 0.2405 0.217 963 -0.317 0.798 1.110 0.6837 #> year2013 - year2017 0.9774 0.223 963 0.404 1.550 4.390 0.0001 #> year2015 - year2017 0.7369 0.222 963 0.166 1.308 3.320 0.0052 #> #> Results are given on the log odds ratio (not the response) scale. #> Confidence level used: 0.95 #> Conf-level adjustment: tukey method for comparing a family of 4 estimates #> P value adjustment: tukey method for comparing a family of 4 estimates #> as.data.frame(e) #> Warning: Note: 'as.data.frame' has combined your 2 sets of results into one object, #> and this affects things like adjusted P values. Refer to the annotations. #> year contrast emmean SE df lower.CL upper.CL #> 2011 . -0.6063952 0.5511170 963 -2.1569729 0.9441824 #> 2013 . 0.3836816 0.5497815 963 -1.1631386 1.9305018 #> 2015 . 0.1431957 0.5505275 963 -1.4057236 1.6921150 #> 2017 . -0.5937169 0.5506319 963 -2.1429298 0.9554960 #> . year2011 - year2013 -0.9900768 0.2216228 963 -1.6136166 -0.3665369 #> . year2011 - year2015 -0.7495909 0.2201979 963 -1.3691217 -0.1300601 #> . year2011 - year2017 -0.0126783 0.2209364 963 -0.6342868 0.6089302 #> . year2013 - year2015 0.2404859 0.2167459 963 -0.3693327 0.8503044 #> . year2013 - year2017 0.9773985 0.2226493 963 0.3509706 1.6038264 #> . year2015 - year2017 0.7369126 0.2219749 963 0.1123823 1.3614430 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> Conf-level adjustment: bonferroni method for 10 estimates # interaction of factor with continuous predictor: fit2 <- sdmTMB( present ~ depth_scaled * as.factor(year), data = pcod_2011, mesh = mesh, family = binomial() ) fit2 #> Spatial model fit by ML ['sdmTMB'] #> Formula: present ~ depth_scaled * as.factor(year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.75 0.48 #> depth_scaled -0.98 0.25 #> as.factor(year)2013 1.03 0.23 #> as.factor(year)2015 0.79 0.23 #> as.factor(year)2017 0.01 0.23 #> depth_scaled:as.factor(year)2013 -0.16 0.26 #> depth_scaled:as.factor(year)2015 0.03 0.26 #> depth_scaled:as.factor(year)2017 -0.01 0.26 #> #> Matérn range: 33.38 #> Spatial SD: 2.19 #> ML criterion at convergence: 546.074 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
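# A hedged sketch using standard emmeans syntax (not specific to sdmTMB): with
# the interaction model, year comparisons can also be evaluated at chosen
# depths via the `at` argument:
emmeans::emmeans(fit2, pairwise ~ year | depth_scaled,
  at = list(depth_scaled = c(-1, 0, 1)))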
# slopes for each level: emmeans::emtrends(fit2, ~ year, var = \"depth_scaled\") #> year depth_scaled.trend SE df lower.CL upper.CL #> 2011 -0.980 0.250 959 -1.47 -0.490 #> 2013 -1.140 0.247 959 -1.63 -0.655 #> 2015 -0.950 0.238 959 -1.42 -0.483 #> 2017 -0.987 0.244 959 -1.47 -0.507 #> #> Confidence level used: 0.95 # test difference in slopes: emmeans::emtrends(fit2, pairwise ~ year, var = \"depth_scaled\") #> $emtrends #> year depth_scaled.trend SE df lower.CL upper.CL #> 2011 -0.980 0.250 959 -1.47 -0.490 #> 2013 -1.140 0.247 959 -1.63 -0.655 #> 2015 -0.950 0.238 959 -1.42 -0.483 #> 2017 -0.987 0.244 959 -1.47 -0.507 #> #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 0.16009 0.261 959 0.613 0.9281 #> year2011 - year2015 -0.03052 0.265 959 -0.115 0.9995 #> year2011 - year2017 0.00651 0.261 959 0.025 1.0000 #> year2013 - year2015 -0.19061 0.258 959 -0.738 0.8817 #> year2013 - year2017 -0.15358 0.259 959 -0.593 0.9342 #> year2015 - year2017 0.03703 0.263 959 0.141 0.9990 #> #> P value adjustment: tukey method for comparing a family of 4 estimates #> emmeans::emmip(fit2, year ~ depth_scaled, at = list(depth_scaled = seq(-2.5, 2.5, length.out = 50)), CIs = TRUE)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"Moved sdmTMBextra package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"","code":"extract_mcmc(object = deprecated())"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"object Deprecated See sdmTMBextra package. Make sure load sdmTMBextra sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"Deprecated. See sdmTMBextra package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract MCMC samples from a model fit with tmbstan. 
— extract_mcmc","text":"","code":"if (FALSE) { extract_mcmc() }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":null,"dir":"Reference","previous_headings":"","what":"Additional families — Families","title":"Additional families — Families","text":"Additional families compatible sdmTMB().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Additional families — Families","text":"","code":"Beta(link = \"logit\") lognormal(link = \"log\") gengamma(link = \"log\") gamma_mix(link = \"log\") lognormal_mix(link = \"log\") nbinom2_mix(link = \"log\") nbinom2(link = \"log\") nbinom1(link = \"log\") truncated_nbinom2(link = \"log\") truncated_nbinom1(link = \"log\") student(link = \"identity\", df = 3) tweedie(link = \"log\") censored_poisson(link = \"log\") delta_gamma(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_gamma_mix(link1 = \"logit\", link2 = \"log\") delta_gengamma(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_lognormal(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_lognormal_mix(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_truncated_nbinom2(link1 = \"logit\", link2 = \"log\") delta_truncated_nbinom1(link1 = \"logit\", link2 = \"log\") delta_poisson_link_gamma(link1 = \"log\", link2 = \"log\") delta_poisson_link_lognormal(link1 = \"log\", link2 = \"log\") delta_beta(link1 = \"logit\", link2 = \"logit\")"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Additional families — Families","text":"link Link. df Student-t degrees freedom fixed value parameter. link1 Link first part delta/hurdle model. Defaults \"logit\" type = \"standard\" \"log\" type = \"poisson-link\". link2 Link second part delta/hurdle model. type Delta/hurdle family type. \"standard\" classic hurdle model. \"poisson-link\" Poisson-link delta model (Thorson 2018).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Additional families — Families","text":"list elements common standard R family objects including family, link, linkfun, linkinv. Delta/hurdle model families also elements delta (logical) type (standard vs. Poisson-link).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Additional families — Families","text":"delta_poisson_link_gamma() delta_poisson_link_lognormal() deprecated favour delta_gamma(type = \"poisson-link\") delta_lognormal(type = \"poisson-link\"). gengamma() family implemented J.T. Thorson uses Prentice (1974) parameterization lognormal occurs internal parameter gengamma_Q (reported print() \"Generalized gamma lambda\") approaches 0. families ending _mix() 2-component mixtures distribution mean shared scale parameter. (Thorson et al. 2011). See model-description vignette details. parameter plogis(log_p_mix) probability extreme (larger) mean exp(log_ratio_mix) + 1 ratio larger extreme mean \"regular\" mean. can see parameters model$sd_report. nbinom2 negative binomial parameterization NB2 variance grows quadratically mean (Hilbe 2011). nbinom1 negative binomial parameterization lets variance grow linearly mean (Hilbe 2011). 
student(), degrees freedom parameter currently estimated fixed df.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Additional families — Families","text":"Generalized gamma family: Prentice, R.L. 1974. log gamma model maximum likelihood estimation. Biometrika 61(3): 539–544. doi:10.1093/biomet/61.3.539 Stacy, E.W. 1962. Generalization Gamma Distribution. Annals Mathematical Statistics 33(3): 1187–1192. Institute Mathematical Statistics. Families ending _mix(): Thorson, J.T., Stewart, .J., Punt, .E. 2011. Accounting fish shoals single- multi-species survey data using mixture distribution models. Can. J. Fish. Aquat. Sci. 68(9): 1681–1693. doi:10.1139/f2011-086 . Negative binomial families: Hilbe, J. M. 2011. Negative binomial regression. Cambridge University Press. Poisson-link delta families: Thorson, J.T. 2018. Three problems conventional delta-model biomass sampling data, computationally efficient alternative. Canadian Journal Fisheries Aquatic Sciences, 75(9), 1369-1382. doi:10.1139/cjfas-2017-0266","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Additional families — Families","text":"","code":"Beta(link = \"logit\") #> #> Family: Beta #> Link function: logit #> lognormal(link = \"log\") #> #> Family: lognormal #> Link function: log #> gengamma(link = \"log\") #> #> Family: gengamma #> Link function: log #> gamma_mix(link = \"log\") #> #> Family: gamma_mix #> Link function: log #> lognormal_mix(link = \"log\") #> #> Family: lognormal_mix #> Link function: log #> nbinom2_mix(link = \"log\") #> #> Family: nbinom2_mix #> Link function: log #> nbinom2(link = \"log\") #> #> Family: nbinom2 #> Link function: log #> nbinom1(link = \"log\") #> #> Family: nbinom1 #> Link function: log #> truncated_nbinom2(link = \"log\") #> #> Family: truncated_nbinom2 #> Link function: log #> truncated_nbinom1(link = \"log\") #> #> Family: truncated_nbinom1 #> Link function: log #> student(link = \"identity\") #> #> Family: student #> Link function: identity #> tweedie(link = \"log\") #> #> Family: tweedie #> Link function: log #> censored_poisson(link = \"log\") #> #> Family: censored_poisson #> Link function: log #> delta_gamma() #> #> Family: binomial Gamma #> Link function: logit log #> delta_gamma_mix() #> #> Family: binomial gamma_mix #> Link function: logit log #> delta_gengamma() #> #> Family: binomial gengamma #> Link function: logit log #> delta_lognormal() #> #> Family: binomial lognormal #> Link function: logit log #> delta_lognormal_mix() #> #> Family: binomial lognormal_mix #> Link function: logit log #> delta_truncated_nbinom2() #> #> Family: binomial truncated_nbinom2 #> Link function: logit log #> delta_truncated_nbinom1() #> #> Family: binomial truncated_nbinom1 #> Link function: logit log #> delta_beta() #> #> Family: binomial Beta #> Link function: logit logit #>"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract parameter simulations from the joint precision matrix — spread_sims","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"spread_sims() returns wide-format data frame. gather_sims() returns long-format data frame. 
format matches format tidybayes spread_draws() gather_draws() functions.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"","code":"spread_sims(object, nsim = 200, n_sims = deprecated()) gather_sims(object, nsim = 200, n_sims = deprecated())"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"object Output sdmTMB(). nsim number simulation draws. n_sims Deprecated: please use nsim.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"data frame. gather_sims() returns long-format data frame: .iteration: sample ID .variable: parameter name .value: parameter sample value spread_sims() returns wide-format data frame: .iteration: sample ID columns parameter sample per row","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"","code":"m <- sdmTMB(density ~ depth_scaled, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie()) head(spread_sims(m, nsim = 10)) #> .iteration X.Intercept. depth_scaled range phi tweedie_p sigma_O #> 1 1 2.947281 -0.6113464 21.28292 14.96395 1.579234 2.649385 #> 2 2 2.867277 -0.7282022 50.56038 15.63864 1.583225 1.871312 #> 3 3 2.570547 -0.5759993 81.02526 13.86308 1.582175 1.204293 #> 4 4 3.305925 -0.5416168 26.01008 16.21788 1.588699 1.674452 #> 5 5 2.870912 -0.7513359 35.35486 15.06749 1.595577 2.000633 #> 6 6 2.901625 -0.7516155 25.05987 14.38858 1.614982 2.741510 head(gather_sims(m, nsim = 10)) #> .iteration .variable .value #> 1 1 X.Intercept. 3.120602 #> 2 2 X.Intercept. 3.072890 #> 3 3 X.Intercept. 2.688049 #> 4 4 X.Intercept. 3.006871 #> 5 5 X.Intercept. 3.120740 #> 6 6 X.Intercept. 2.509646 samps <- gather_sims(m, nsim = 1000) if (require(\"ggplot2\", quietly = TRUE)) { ggplot(samps, aes(.value)) + geom_histogram() + facet_wrap(~.variable, scales = \"free_x\") } #> `stat_bin()` using `bins = 30`. Pick better value with `binwidth`."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract a relative biomass/abundance index or a center of gravity — get_index","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"Extract relative biomass/abundance index center gravity","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"","code":"get_index( obj, bias_correct = FALSE, level = 0.95, area = 1, silent = TRUE, ... ) get_cog( obj, bias_correct = FALSE, level = 0.95, format = c(\"long\", \"wide\"), area = 1, silent = TRUE, ... 
)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"obj Output predict.sdmTMB() return_tmb_object = TRUE. bias_correct bias correction implemented TMB::sdreport()? level confidence level. area Grid cell area. vector length newdata predict.sdmTMB() value length 1, repeated internally match. silent Silent? ... Passed TMB::sdreport(). format Long wide.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"get_index(): data frame columns time, estimate, lower upper confidence intervals, log estimate, standard error log estimate. get_cog(): data frame columns time, estimate (center gravity x y coordinates), lower upper confidence intervals, standard error center gravity coordinates.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"Geostatistical random-field model-based indices abundance (along many newer papers): Shelton, .O., Thorson, J.T., Ward, E.J., Feist, B.E. 2014. Spatial semiparametric models improve estimates species abundance distribution. Canadian Journal Fisheries Aquatic Sciences 71(11): 1655--1666. doi:10.1139/cjfas-2013-0508 Thorson, J.T., Shelton, .O., Ward, E.J., Skaug, H.J. 2015. Geostatistical delta-generalized linear mixed models improve precision estimated abundance indices West Coast groundfishes. ICES J. Mar. Sci. 72(5): 1297–1310. doi:10.1093/icesjms/fsu243 Geostatistical model-based centre gravity: Thorson, J.T., Pinsky, M.L., Ward, E.J. 2016. Model-based inference estimating shifts species distribution, area occupied centre gravity. Methods Ecol Evol 7(8): 990–1002. doi:10.1111/2041-210X.12567 Bias correction: Thorson, J.T., Kristensen, K. 2016. Implementing generic method bias correction statistical models using random effects, spatial population dynamics examples. Fisheries Research 175: 66–74. doi:10.1016/j.fishres.2015.11.016","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"","code":"# \\donttest{ # Use a small number of knots for this example to make it fast: pcod_spde <- make_mesh(pcod, c(\"X\", \"Y\"), n_knots = 60, type = \"kmeans\") m <- sdmTMB( data = pcod, formula = density ~ 0 + as.factor(year), time = \"year\", mesh = pcod_spde, family = tweedie(link = \"log\") ) # make prediction grid: nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) # Note `return_tmb_object = TRUE` and the prediction grid: predictions <- predict(m, newdata = nd, return_tmb_object = TRUE) ind <- get_index(predictions) #> Bias correction is turned off. #> It is recommended to turn this on for final inference. 
if (require(\"ggplot2\", quietly = TRUE)) { ggplot(ind, aes(year, est)) + geom_line() + geom_ribbon(aes(ymin = lwr, ymax = upr), alpha = 0.4) } cog <- get_cog(predictions) cog #> year est lwr upr se coord #> 1 2003 463.5260 446.4142 480.6378 8.730670 X #> 2 2004 476.7402 466.4506 487.0297 5.249871 X #> 3 2005 470.6887 457.7494 483.6280 6.601796 X #> 4 2007 480.8948 464.5560 497.2336 8.336280 X #> 5 2009 477.2029 457.9185 496.4872 9.839144 X #> 6 2011 470.5112 457.6004 483.4221 6.587284 X #> 7 2013 471.9877 455.6078 488.3676 8.357252 X #> 8 2015 463.0289 449.6443 476.4136 6.829028 X #> 9 2017 470.5219 455.4189 485.6249 7.705734 X #> 10 2003 5757.8611 5739.8547 5775.8675 9.187109 Y #> 11 2004 5732.5037 5720.8788 5744.1285 5.931164 Y #> 12 2005 5763.0315 5750.1526 5775.9104 6.570990 Y #> 13 2007 5738.2313 5716.8428 5759.6199 10.912733 Y #> 14 2009 5734.0288 5713.3608 5754.6968 10.545101 Y #> 15 2011 5747.1037 5733.6282 5760.5792 6.875389 Y #> 16 2013 5747.6446 5728.9691 5766.3201 9.528488 Y #> 17 2015 5753.9697 5736.8439 5771.0956 8.737855 Y #> 18 2017 5755.9728 5739.6443 5772.3014 8.331045 Y # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"Calculate population index via simulation joint precision matrix. Compared get_index(), version can faster bias correction turned get_index() approximately equivalent. experimental function. function usually works reasonably well, make guarantees. recommended use get_index() bias_correct = TRUE final inference.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"","code":"get_index_sims( obj, level = 0.95, return_sims = FALSE, area = rep(1, nrow(obj)), est_function = stats::median, area_function = function(x, area) x + log(area), agg_function = function(x) sum(exp(x)) )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"obj predict.sdmTMB() output nsim > 0. level confidence level. return_sims Logical. Return simulation draws? default (FALSE) quantile summary simulation draws. area vector grid cell/polyon areas year-grid cell (row data) obj. Adjust cells unit area area (e.g., cells partially land/water). Note area vector added log(area) raw values obj. words, function assumes log link, typically makes sense. est_function Function summarize estimate (expected value). mean() alternative median(). area_function Function apply area weighting. Assuming log link, function(x, area) x + log(area) default makes sense. natural space, function(x, area) x * area makes sense. agg_function Function aggregate samples within time slice. Assuming log link, function(x) sum(exp(x)) default makes sense. 
natural space, function(x) sum(x) makes sense.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"data frame. return_sims = FALSE: name column (e.g. year) supplied sdmTMB() time argument est: estimate lwr: lower confidence interval value upr: upper confidence interval value log_est: log estimate se: standard error log estimate return_sims = TRUE, samples index values long-format data frame: name column (e.g. year) supplied sdmTMB() time argument .value: sample value .iteration: sample number","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"Can also used produce index model fit tmbstan. function nothing summarize reshape matrix simulation draws data frame.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"","code":"# \\donttest{ m <- sdmTMB(density ~ 0 + as.factor(year), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), time = \"year\" ) qcs_grid_2011 <- replicate_df(qcs_grid, \"year\", unique(pcod_2011$year)) p <- predict(m, newdata = qcs_grid_2011, nsim = 100) x <- get_index_sims(p) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. x_sims <- get_index_sims(p, return_sims = TRUE) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. if (require(\"ggplot2\", quietly = TRUE)) { ggplot(x, aes(year, est, ymin = lwr, ymax = upr)) + geom_line() + geom_ribbon(alpha = 0.4) ggplot(x_sims, aes(as.factor(year), .value)) + geom_violin() } # Demo custom functions if working in natural space: ind <- get_index_sims( exp(p), agg_function = function(x) sum(x), area_function = function(x, area) x * area ) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. 
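# A small sketch: the summary function is adjustable too; for example,
# summarize the simulation draws with the mean rather than the median:
ind_mean <- get_index_sims(p, est_function = mean)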
# }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":null,"dir":"Reference","previous_headings":"","what":"Get TMB parameter list — get_pars","title":"Get TMB parameter list — get_pars","text":"Get TMB parameter list","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get TMB parameter list — get_pars","text":"","code":"get_pars(object)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get TMB parameter list — get_pars","text":"object Fit sdmTMB()","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get TMB parameter list — get_pars","text":"named list parameter values","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get TMB parameter list — get_pars","text":"","code":"fit <- sdmTMB(present ~ 1, data = pcod_2011, family = binomial(), spatial = \"off\") pars <- get_pars(fit) names(pars) #> [1] \"ln_H_input\" \"b_j\" \"b_j2\" #> [4] \"bs\" \"ln_tau_O\" \"ln_tau_Z\" #> [7] \"ln_tau_E\" \"ln_kappa\" \"thetaf\" #> [10] \"gengamma_Q\" \"logit_p_mix\" \"log_ratio_mix\" #> [13] \"ln_phi\" \"ln_tau_V\" \"rho_time_unscaled\" #> [16] \"ar1_phi\" \"ln_tau_G\" \"RE\" #> [19] \"b_rw_t\" \"omega_s\" \"zeta_s\" #> [22] \"epsilon_st\" \"b_threshold\" \"b_epsilon\" #> [25] \"ln_epsilon_re_sigma\" \"epsilon_re\" \"b_smooth\" #> [28] \"ln_smooth_sigma\""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":null,"dir":"Reference","previous_headings":"","what":"Check if ggplot2 installed — ggplot2_installed","title":"Check if ggplot2 installed — ggplot2_installed","text":"Check ggplot2 installed","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check if ggplot2 installed — ggplot2_installed","text":"","code":"ggplot2_installed()"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check if ggplot2 installed — ggplot2_installed","text":"Returns TRUE FALSE.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":null,"dir":"Reference","previous_headings":"","what":"Construct an SPDE mesh for sdmTMB — make_mesh","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"Construct SPDE mesh use sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"","code":"make_mesh( data, xy_cols, type = c(\"kmeans\", \"cutoff\", \"cutoff_search\"), cutoff, n_knots, seed = 42, mesh = NULL, fmesher_func = fmesher::fm_rcdt_2d_inla, convex = NULL, concave = convex, ... ) # S3 method for sdmTMBmesh plot(x, ...)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"data data frame. xy_cols character vector x y column names contained data. 
likely equal distance projection. helper function convert UTMs, see add_utm_columns(). type Method create mesh. Also see mesh argument supply mesh. cutoff optional cutoff type \"cutoff\". minimum allowed triangle edge length. n_knots number desired knots type \"cutoff\". seed Random seed. Affects stats::kmeans() determination knot locations type = \"kmeans\". mesh optional mesh created via fmesher instead using convenience options. fmesher_func fmesher function use. Options include fmesher::fm_rcdt_2d_inla() fmesher::fm_mesh_2d_inla() along version without _inla end. convex specified, passed fmesher::fm_nonconvex_hull(). Distance extend non-convex hull data. concave specified, passed fmesher::fm_nonconvex_hull(). \"Minimum allowed reentrant curvature\". Defaults convex. ... Passed graphics::plot(). x Output make_mesh().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"make_mesh(): list class sdmTMBmesh. element mesh output fmesher_func (default fmesher::fm_mesh_2d_inla()). See mesh$mesh$n number vertices. plot.sdmTMBmesh(): plot mesh data points. ggplot2 installed, ggplot2 object returned, otherwise base graphics R plot returned. make , pass your_mesh$mesh inlabru::gg().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"","code":"# Extremely simple cutoff: mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 5, type = \"cutoff\") plot(mesh) # Using a k-means algorithm to assign vertices: mesh <- make_mesh(pcod, c(\"X\", \"Y\"), n_knots = 50, type = \"kmeans\") plot(mesh) # \\donttest{ # But, it's better to develop more tailored meshes: # Pass arguments via '...' to fmesher::fm_mesh_2d_inla(): mesh <- make_mesh( pcod, c(\"X\", \"Y\"), fmesher_func = fmesher::fm_mesh_2d_inla, cutoff = 8, # minimum triangle edge length max.edge = c(20, 40), # inner and outer max triangle lengths offset = c(5, 40) # inner and outer border widths ) plot(mesh) # Or define a mesh directly with fmesher (formerly in INLA): inla_mesh <- fmesher::fm_mesh_2d_inla( loc = cbind(pcod$X, pcod$Y), # coordinates max.edge = c(25, 50), # max triangle edge length; inner and outer meshes offset = c(5, 25), # inner and outer border widths cutoff = 5 # minimum triangle edge length ) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), mesh = inla_mesh) plot(mesh) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot anisotropy from an sdmTMB model — plot_anisotropy","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"Anisotropy spatial correlation directionally dependent. sdmTMB(), default spatial correlation isotropic, anisotropy can enabled anisotropy = TRUE. 
plotting functions help visualize estimated anisotropy.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"","code":"plot_anisotropy(object, return_data = FALSE) plot_anisotropy2(object, model = 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"object object sdmTMB(). return_data Logical. Return data frame? plot_anisotropy() . model model delta model (plot_anisotropy2(); plot_anisotropy() always plots ).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"plot_anisotropy(): One ellipses illustrating estimated anisotropy. ellipses centered coordinates zero space X-Y coordinates modeled. ellipses show spatial /spatiotemporal range (distance correlation effectively independent) direction zero. Uses ggplot2. anisotropy turned fitting model, NULL returned instead ggplot2 object. plot_anisotropy2(): plot eigenvectors illustrating estimated anisotropy. list plotted data invisibly returned. Uses base graphics. anisotropy turned fitting model, NULL returned instead plot object.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"Code adapted VAST R package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), n_knots = 80, type = \"kmeans\") fit <- sdmTMB( data = pcod_2011, formula = density ~ 1, mesh = mesh, family = tweedie(), share_range = FALSE, time = \"year\", anisotropy = TRUE #< ) plot_anisotropy(fit) plot_anisotropy2(fit)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot PC Matérn priors — plot_pc_matern","title":"Plot PC Matérn priors — plot_pc_matern","text":"Plot PC Matérn priors","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot PC Matérn priors — plot_pc_matern","text":"","code":"plot_pc_matern( range_gt, sigma_lt, range_prob = 0.05, sigma_prob = 0.05, range_lims = c(range_gt * 0.1, range_gt * 10), sigma_lims = c(0, sigma_lt * 2), plot = TRUE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot PC Matérn priors — plot_pc_matern","text":"range_gt value one expects spatial spatiotemporal range greater 1 - range_prob probability. sigma_lt value one expects spatial spatiotemporal marginal standard deviation (sigma_O sigma_E internally) less 1 - sigma_prob probability. range_prob Probability. See description range_gt. sigma_prob Probability. See description sigma_lt. 
range_lims Plot range variable limits. sigma_lims Plot sigma variable limits. plot Logical controlling whether plot drawn (defaults TRUE).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot PC Matérn priors — plot_pc_matern","text":"plot image(). Invisibly returns underlying matrix data. rows sigmas. columns ranges. Column row names provided.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot PC Matérn priors — plot_pc_matern","text":"","code":"plot_pc_matern(range_gt = 5, sigma_lt = 1) plot_pc_matern(range_gt = 5, sigma_lt = 10) plot_pc_matern(range_gt = 5, sigma_lt = 1, sigma_prob = 0.2) plot_pc_matern(range_gt = 5, sigma_lt = 1, range_prob = 0.2)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot a smooth term from an sdmTMB model — plot_smooth","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"Deprecated: use visreg::visreg(). See visreg_delta() examples.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"","code":"plot_smooth( object, select = 1, n = 100, level = 0.95, ggplot = FALSE, rug = TRUE, return_data = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"object sdmTMB() model. select smoother term plot. n number equally spaced points evaluate smoother along. level confidence level. ggplot Logical: use ggplot2 package? rug Logical: add rug lines along lower axis? return_data Logical: return predicted data instead making plot?","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"plot smoother term.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"Note: numeric predictor set mean factor predictor set first-level value time element (present) set minimum value x y coordinates set mean values","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"","code":"d <- subset(pcod, year >= 2000 & density > 0) pcod_spde <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) m <- sdmTMB( data = d, formula = log(density) ~ s(depth_scaled) + s(year, k = 5), mesh = pcod_spde ) plot_smooth(m) #> This function may be deprecated. #> Consider using `visreg::visreg()` or `visreg_delta()`. 
#> See ?visreg_delta() for examples."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Predict from an sdmTMB model — predict.sdmTMB","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"Make predictions sdmTMB model; can predict original new data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"","code":"# S3 method for sdmTMB predict( object, newdata = NULL, type = c(\"link\", \"response\"), se_fit = FALSE, re_form = NULL, re_form_iid = NULL, nsim = 0, sims_var = \"est\", model = c(NA, 1, 2), offset = NULL, mcmc_samples = NULL, return_tmb_object = FALSE, return_tmb_report = FALSE, return_tmb_data = FALSE, tmbstan_model = deprecated(), sims = deprecated(), area = deprecated(), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"object model fitted sdmTMB(). newdata data frame make predictions . data frame predictor columns fitted data time column (spatiotemporal model) name fitted data. type est column link (default) response space? se_fit standard errors predictions new locations given newdata calculated? Warning: current implementation can slow large data sets high-resolution projections unless re_form = NA (omitting random fields). faster option approximate point-wise uncertainty use nsim argument. re_form NULL specify including spatial/spatiotemporal random effects predictions. ~0 NA population-level predictions. Likely used conjunction se_fit = TRUE. affect get_index() calculations. re_form_iid NULL specify including random intercepts predictions. ~0 NA population-level predictions. options (e.g., random intercepts) implemented yet. affects predictions newdata. affects get_index(). nsim > 0, simulate joint precision matrix nsim draws. Returns matrix nrow(data) nsim representing estimates linear predictor (.e., link space). Can useful deriving uncertainty predictions (e.g., apply(x, 1, sd)) propagating uncertainty. currently fastest way characterize uncertainty predictions space sdmTMB. sims_var Experimental: TMB reported variable model extracted joint precision matrix simulation draws? Defaults link-space predictions. Options include: \"omega_s\", \"zeta_s\", \"epsilon_st\", \"est_rf\" (described ). options passed verbatim. model Type prediction delta/hurdle model nsim > 0 mcmc_samples supplied: NA returns combined prediction components link scale positive component; 1 2 return first second model component link response scale depending argument type. regular prediction delta models, sets predictions returned. offset numeric vector optional offset values. left default NULL, offset implicitly left 0. mcmc_samples See extract_mcmc() sdmTMBextra package details Bayesian vignette. specified, predict function return matrix similar form nsim > 0 representing Bayesian posterior samples Stan model. return_tmb_object Logical. TRUE, include TMB object list format output. Necessary get_index() get_cog() functions. return_tmb_report Logical: return output TMB report? regular prediction, reported variables MLE parameter values. nsim > 0 mcmc_samples supplied, list element sample contents element output report sample. 
return_tmb_data Logical: return formatted data TMB? Used internally. tmbstan_model Deprecated. See mcmc_samples. sims Deprecated. Please use nsim instead. area Deprecated. Please use area get_index(). ... implemented.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"return_tmb_object = FALSE (nsim = 0 mcmc_samples = NULL): data frame: est: Estimate link space (everything link space) est_non_rf: Estimate everything random field est_rf: Estimate random fields combined omega_s: Spatial (intercept) random field constant time zeta_s: Spatial slope random field epsilon_st: Spatiotemporal (intercept) random fields, (zero), IID, AR1, random walk return_tmb_object = TRUE (nsim = 0 mcmc_samples = NULL): list: data: data frame described report: TMB report parameter values obj: TMB object returned prediction run fit_obj: original TMB model object case, likely need data element end user. elements included functions. nsim > 0 mcmc_samples NULL: matrix: Columns represent samples Rows represent predictions one row per row newdata","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"","code":"d <- pcod_2011 mesh <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) # a coarse mesh for example speed m <- sdmTMB( data = d, formula = density ~ 0 + as.factor(year) + depth_scaled + depth_scaled2, time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) # Predictions at original data locations ------------------------------- predictions <- predict(m) head(predictions) #> # A tibble: 6 × 17 #> year X Y depth density present lat lon depth_mean depth_sd #> #> 1 2011 435. 5718. 241 245. 1 51.6 -130. 5.16 0.445 #> 2 2011 487. 5719. 52 0 0 51.6 -129. 5.16 0.445 #> 3 2011 490. 5717. 47 0 0 51.6 -129. 5.16 0.445 #> 4 2011 545. 5717. 157 0 0 51.6 -128. 5.16 0.445 #> 5 2011 404. 5720. 398 0 0 51.6 -130. 5.16 0.445 #> 6 2011 420. 5721. 486 0 0 51.6 -130. 5.16 0.445 #> # ℹ 7 more variables: depth_scaled , depth_scaled2 , est , #> # est_non_rf , est_rf , omega_s , epsilon_st predictions$resids <- residuals(m) # randomized quantile residuals #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. 
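# A sketch of the `nsim` approach to point-wise uncertainty described in the
# arguments above: draw from the joint precision matrix (columns = draws,
# rows = predictions) and summarize each row.
p_sims <- predict(m, nsim = 100)
pred_sd <- apply(p_sims, 1, sd) # approximate point-wise SE in link space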
library(ggplot2) ggplot(predictions, aes(X, Y, col = resids)) + scale_colour_gradient2() + geom_point() + facet_wrap(~year) hist(predictions$resids) qqnorm(predictions$resids);abline(a = 0, b = 1) # Predictions onto new data -------------------------------------------- qcs_grid_2011 <- replicate_df(qcs_grid, \"year\", unique(pcod_2011$year)) predictions <- predict(m, newdata = qcs_grid_2011) # \\donttest{ # A short function for plotting our predictions: plot_map <- function(dat, column = est) { ggplot(dat, aes(X, Y, fill = {{ column }})) + geom_raster() + facet_wrap(~year) + coord_fixed() } plot_map(predictions, exp(est)) + scale_fill_viridis_c(trans = \"sqrt\") + ggtitle(\"Prediction (fixed effects + all random effects)\") plot_map(predictions, exp(est_non_rf)) + ggtitle(\"Prediction (fixed effects and any time-varying effects)\") + scale_fill_viridis_c(trans = \"sqrt\") plot_map(predictions, est_rf) + ggtitle(\"All random field estimates\") + scale_fill_gradient2() plot_map(predictions, omega_s) + ggtitle(\"Spatial random effects only\") + scale_fill_gradient2() plot_map(predictions, epsilon_st) + ggtitle(\"Spatiotemporal random effects only\") + scale_fill_gradient2() # Visualizing a marginal effect ---------------------------------------- # See the visreg package or the ggeffects::ggeffect() or # ggeffects::ggpredict() functions # To do this manually: nd <- data.frame(depth_scaled = seq(min(d$depth_scaled), max(d$depth_scaled), length.out = 100)) nd$depth_scaled2 <- nd$depth_scaled^2 # Because this is a spatiotemporal model, you'll need at least one time # element. If time isn't also a fixed effect then it doesn't matter what you pick: nd$year <- 2011L # L: integer to match original data p <- predict(m, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se))) + geom_line() + geom_ribbon(alpha = 0.4) # Plotting marginal effect of a spline --------------------------------- m_gam <- sdmTMB( data = d, formula = density ~ 0 + as.factor(year) + s(depth_scaled, k = 5), time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) if (require(\"visreg\", quietly = TRUE)) { visreg::visreg(m_gam, \"depth_scaled\") } # or manually: nd <- data.frame(depth_scaled = seq(min(d$depth_scaled), max(d$depth_scaled), length.out = 100)) nd$year <- 2011L p <- predict(m_gam, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se))) + geom_line() + geom_ribbon(alpha = 0.4) # Forecasting ---------------------------------------------------------- mesh <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 15) unique(d$year) #> [1] 2011 2013 2015 2017 m <- sdmTMB( data = d, formula = density ~ 1, spatiotemporal = \"AR1\", # using an AR1 to have something to forecast with extra_time = 2019L, # `L` for integer to match our data spatial = \"off\", time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) # Add a year to our grid: grid2019 <- qcs_grid_2011[qcs_grid_2011$year == max(qcs_grid_2011$year), ] grid2019$year <- 2019L # `L` because `year` is an integer in the data qcsgrid_forecast <- rbind(qcs_grid_2011, grid2019) predictions <- predict(m, newdata = qcsgrid_forecast) plot_map(predictions, exp(est)) + scale_fill_viridis_c(trans = \"log10\") plot_map(predictions, epsilon_st) + scale_fill_gradient2() # Estimating local trends ---------------------------------------------- d <- pcod d$year_scaled <- as.numeric(scale(d$year)) mesh <- 
make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) m <- sdmTMB(data = d, formula = density ~ depth_scaled + depth_scaled2, mesh = mesh, family = tweedie(link = \"log\"), spatial_varying = ~ 0 + year_scaled, time = \"year\", spatiotemporal = \"off\") nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) nd$year_scaled <- (nd$year - mean(d$year)) / sd(d$year) p <- predict(m, newdata = nd) plot_map(subset(p, year == 2003), zeta_s_year_scaled) + # pick any year ggtitle(\"Spatial slopes\") + scale_fill_gradient2() plot_map(p, est_rf) + ggtitle(\"Random field estimates\") + scale_fill_gradient2() plot_map(p, exp(est_non_rf)) + ggtitle(\"Prediction (fixed effects only)\") + scale_fill_viridis_c(trans = \"sqrt\") plot_map(p, exp(est)) + ggtitle(\"Prediction (fixed effects + all random effects)\") + scale_fill_viridis_c(trans = \"sqrt\") # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":null,"dir":"Reference","previous_headings":"","what":"Prior distributions — sdmTMBpriors","title":"Prior distributions — sdmTMBpriors","text":"Optional priors/penalties model parameters. results penalized likelihood within TMB can used priors model passed tmbstan (see Bayesian vignette). Note Jacobian adjustments made bayesian = TRUE sdmTMB() model fit. .e., final model fit tmbstan priors specified bayesian set TRUE. Otherwise, leave bayesian = FALSE. pc_matern() Penalized Complexity prior Matern covariance function.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Prior distributions — sdmTMBpriors","text":"","code":"sdmTMBpriors( matern_s = pc_matern(range_gt = NA, sigma_lt = NA), matern_st = pc_matern(range_gt = NA, sigma_lt = NA), phi = halfnormal(NA, NA), ar1_rho = normal(NA, NA), tweedie_p = normal(NA, NA), b = normal(NA, NA), sigma_G = halfnormal(NA, NA) ) normal(location = 0, scale = 1) halfnormal(location = 0, scale = 1) mvnormal(location = 0, scale = diag(length(location))) pc_matern(range_gt, sigma_lt, range_prob = 0.05, sigma_prob = 0.05)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Prior distributions — sdmTMBpriors","text":"matern_s PC (Penalized Complexity) prior (pc_matern()) spatial random field Matérn parameters. matern_st matern_s spatiotemporal random field. Note likely want set share_fields = FALSE choose set spatial spatiotemporal Matérn PC prior since include prior spatial range parameter. phi halfnormal() prior dispersion parameter observation distribution. ar1_rho normal() prior AR1 random field parameter. Note parameter support -1 < ar1_rho < 1. tweedie_p normal() prior Tweedie power parameter. Note parameter support 1 < tweedie_p < 2 choose mean appropriately. b normal() priors main population-level 'beta' effects. sigma_G halfnormal() priors random intercept SDs. location Location parameter(s). scale Scale parameter. normal()/halfnormal(): standard deviation(s). mvnormal(): variance-covariance matrix. range_gt value one expects spatial spatiotemporal range greater 1 - range_prob probability. sigma_lt value one expects spatial spatiotemporal marginal standard deviation (sigma_O sigma_E internally) less 1 - sigma_prob probability. range_prob Probability. See description range_gt. sigma_prob Probability. 
See description sigma_lt.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Prior distributions — sdmTMBpriors","text":"named list values specified priors.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Prior distributions — sdmTMBpriors","text":"Meant passed priors argument sdmTMB(). normal() halfnormal() define normal half-normal priors , point, must location (mean) parameter 0. halfnormal() normal() can used make syntax clearer. intended used parameters support > 0. See https://arxiv.org/abs/1503.00256 description PC prior Gaussian random fields. Quoting discussion (substituting argument names pc_matern()): \"simulation study observe good coverage equal-tailed 95% credible intervals prior satisfies P(sigma > sigma_lt) = 0.05 P(range < range_gt) = 0.05, sigma_lt 2.5 40 times true marginal standard deviation range_gt 1/10 1/2.5 true range.\" Keep mind range dependent units scale coordinate system. practice, may choose try fitting model without PC prior constraining model . better option simulate model given range sigma choose reasonable values system base prior knowledge model fit similar system spatial information data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Prior distributions — sdmTMBpriors","text":"Fuglstad, G.-., Simpson, D., Lindgren, F., Rue, H. (2016) Constructing Priors Penalize Complexity Gaussian Random Fields. arXiv:1503.00256 Simpson, D., Rue, H., Martins, T., Riebler, ., Sørbye, S. (2015) Penalising model component complexity: principled, practical approach constructing priors. arXiv:1403.4630","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Prior distributions — sdmTMBpriors","text":"","code":"normal(0, 1) #> [,1] [,2] #> [1,] 0 1 #> attr(,\"dist\") #> [1] \"normal\" halfnormal(0, 1) #> [,1] [,2] #> [1,] 0 1 #> attr(,\"dist\") #> [1] \"normal\" mvnormal(c(0, 0)) #> [,1] [,2] [,3] #> [1,] 0 1 0 #> [2,] 0 0 1 #> attr(,\"dist\") #> [1] \"mvnormal\" pc_matern(range_gt = 5, sigma_lt = 1) #> [1] 5.00 1.00 0.05 0.05 #> attr(,\"dist\") #> [1] \"pc_matern\" plot_pc_matern(range_gt = 5, sigma_lt = 1) # \\donttest{ d <- subset(pcod, year > 2011) pcod_spde <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) # - no priors on population-level effects (`b`) # - halfnormal(0, 10) prior on dispersion parameter `phi` # - Matern PC priors on spatial `matern_s` and spatiotemporal # `matern_st` random field parameters m <- sdmTMB(density ~ s(depth, k = 3), data = d, mesh = pcod_spde, family = tweedie(), share_range = FALSE, time = \"year\", priors = sdmTMBpriors( phi = halfnormal(0, 10), matern_s = pc_matern(range_gt = 5, sigma_lt = 1), matern_st = pc_matern(range_gt = 5, sigma_lt = 1) ) ) # - no prior on intercept # - normal(0, 1) prior on depth coefficient # - no prior on the dispersion parameter `phi` # - Matern PC prior m <- sdmTMB(density ~ depth_scaled, data = d, mesh = pcod_spde, family = tweedie(), spatiotemporal = \"off\", priors = sdmTMBpriors( b = normal(c(NA, 0), c(NA, 1)), matern_s = pc_matern(range_gt = 5, sigma_lt = 1) ) ) # You get a prior, you get a prior, you get a prior! 
# (except on the annual means; see the `NA`s) m <- sdmTMB(density ~ 0 + depth_scaled + depth_scaled2 + as.factor(year), data = d, time = \"year\", mesh = pcod_spde, family = tweedie(link = \"log\"), share_range = FALSE, spatiotemporal = \"AR1\", priors = sdmTMBpriors( b = normal(c(0, 0, NA, NA, NA), c(2, 2, NA, NA, NA)), phi = halfnormal(0, 10), # tweedie_p = normal(1.5, 2), ar1_rho = normal(0, 1), matern_s = pc_matern(range_gt = 5, sigma_lt = 1), matern_st = pc_matern(range_gt = 5, sigma_lt = 1)) ) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"Objects exported from other packages — reexports","title":"Objects exported from other packages — reexports","text":"objects imported packages. Follow links see documentation. generics tidy","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Replicate a prediction data frame over time — replicate_df","title":"Replicate a prediction data frame over time — replicate_df","text":"Useful replicating prediction grids across time slices used model fitting.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replicate a prediction data frame over time — replicate_df","text":"","code":"replicate_df(dat, time_name, time_values)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replicate a prediction data frame over time — replicate_df","text":"dat Data frame. time_name Name time column output. time_values Time values replicate dat .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replicate a prediction data frame over time — replicate_df","text":"data frame replicated time_values new column based time_name.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replicate a prediction data frame over time — replicate_df","text":"","code":"df <- data.frame(variable = c(\"a\", \"b\")) replicate_df(df, time_name = \"year\", time_values = 1:3) #> variable year #> 1 a 1 #> 2 b 1 #> 3 a 2 #> 4 b 2 #> 5 a 3 #> 6 b 3 head(qcs_grid) #> X Y depth depth_scaled depth_scaled2 #> 1 456 5636 347.0834 1.5608122 2.43613479 #> 2 458 5636 223.3348 0.5697699 0.32463771 #> 3 460 5636 203.7408 0.3633693 0.13203724 #> 4 462 5636 183.2987 0.1257046 0.01580166 #> 5 464 5636 182.9998 0.1220368 0.01489297 #> 6 466 5636 186.3892 0.1632882 0.02666303 nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) head(nd) #> X Y depth depth_scaled depth_scaled2 year #> 1 456 5636 347.0834 1.5608122 2.43613479 2003 #> 2 458 5636 223.3348 0.5697699 0.32463771 2003 #> 3 460 5636 203.7408 0.3633693 0.13203724 2003 #> 4 462 5636 183.2987 0.1257046 0.01580166 2003 #> 5 464 5636 182.9998 0.1220368 0.01489297 2003 #> 6 466 5636 186.3892 0.1632882 0.02666303 2003 table(nd$year) #> #> 2003 2004 2005 2007 2009 2011 2013 2015 2017 #> 7314 7314 7314 7314 7314 7314 7314 7314 7314"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Residuals method for sdmTMB models — 
residuals.sdmTMB","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"See residual-checking vignette: browseVignettes(\"sdmTMB\") documentation site. See notes types residuals 'Details' section .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"","code":"# S3 method for sdmTMB residuals( object, type = c(\"mle-mvn\", \"mle-eb\", \"mle-mcmc\", \"response\", \"pearson\"), model = c(1, 2), mcmc_samples = NULL, qres_func = NULL, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"object sdmTMB() model. type Residual type. See details. model delta/hurdle model component? mcmc_samples vector MCMC samples linear predictor link space. See predict_mle_mcmc() function sdmTMBextra package. qres_func custom quantile residuals function. Function take arguments object, y, mu, ... return vector length length(y). ... Passed custom qres_func function. Unused.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"vector residuals. Note randomization single random effect posterior sample randomized quantile routines result different residuals call. suggested set randomization seed go \"fishing\" perfect residuals present inspected residuals.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"Randomized quantile residuals: mle-mvn, mle-eb, mle-mcmc implementations randomized quantile residuals (Dunn & Smyth 1996), also known probability integral transform (PIT) residuals (Smith 1985). data consistent model assumptions, residuals distributed normal(0, 1). Randomization added account integer binary response observations. example, Poisson observation likelihood observations y mean predictions mu, create randomized quantile residuals : Types residuals: Acronyms: EB: Empirical Bayes MCMC: Markov chain Monte Carlo MLE: Maximum Likelihood Estimate MVN: Multivariate normal mle-mvn: Fixed effects held MLEs random effects taken single approximate posterior sample. \"approximate\" part refers sample taken random effects' assumed MVN distribution. practice, sample obtained based mode Hessian random effects taking advantage sparsity Hessian computational efficiency. sample taken obj$MC(), obj TMB object created TMB::MakeADFun(). See Waagepetersen (2006) description source code internal TMB function TMB:::oneSamplePosterior(). Residuals converted randomized quantile residuals described . mle-eb: Fixed effects held MLEs random effects taken EB estimates. used default residuals sdmTMB (called mle-laplace). available backwards compatibility research purposes recommended checking goodness fit. Residuals converted randomized quantile residuals described . mle-mcmc: Fixed effects held MLEs random effects taken single posterior sample obtained MCMC. excellent option since make assumption distribution random effects (compared mle-mvn option) can slow obtain. 
See Waagepetersen (2006) Thygesen et al. (2017). Residuals converted randomized quantile residuals described . See sdmTMBextra package function predict_mle_mcmc(), can generate MCMC samples pass mcmc_samples argument. Ideally MCMC run convergence last iteration can used residuals. defaults may sufficient many models. response: simple observed minus predicted residuals. pearson: Pearson residuals: response residuals scaled standard deviation. weights present, residuals multiplied sqrt(weights).","code":"a <- ppois(y - 1, mu) b <- ppois(y, mu) u <- runif(n = length(y), min = a, max = b) qnorm(u)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"Dunn, P.K. & Smyth, G.K. (1996). Randomized Quantile Residuals. Journal Computational Graphical Statistics, 5, 236–244. Smith, J.Q. (1985). Diagnostic checks non-standard time series models. Journal Forecasting, 4, 283–291. Waagepetersen, R. (2006). simulation-based goodness--fit test random effects generalized linear mixed models. Scandinavian Journal Statistics, 33(4), 721-731. Thygesen, U.H., Albertsen, C.M., Berg, C.W., Kristensen, K., Nielsen, . 2017. Validation ecological state space models using Laplace approximation. Environ Ecol Stat 24(2): 317–339. doi:10.1007/s10651-017-0372-4 Rufener, M.-C., Kristensen, K., Nielsen, J.R., Bastardie, F. 2021. Bridging gap commercial fisheries survey data model spatiotemporal dynamics marine species. Ecological Applications. e02453. doi:10.1002/eap.2453","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( present ~ as.factor(year) + poly(depth, 2), data = pcod_2011, mesh = mesh, family = binomial() ) # the default \"mle-mvn\" residuals use fixed effects at their MLE and a # single sample from the approximate random effect posterior: set.seed(9283) r <- residuals(fit, type = \"mle-mvn\") qqnorm(r) abline(0, 1) # response residuals will be not be normally distributed unless # the family is Gaussian: r <- residuals(fit, type = \"response\") qqnorm(r) abline(0, 1) # \"mle-eb\" are quick but are not expected to be N(0, 1); not recommended: set.seed(2321) r <- residuals(fit, type = \"mle-eb\") qqnorm(r) abline(0, 1) # see also \"mle-mcmc\" residuals with the help of the sdmTMBextra package # we can fake them here by taking a single sample from the joint precision # matrix and pretending they are MCMC samples: set.seed(82728) p <- predict(fit, nsim = 1) # pretend these are from sdmTMBextra::predict_mle_mcmc() r <- residuals(fit, mcmc_samples = p) #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. 
qqnorm(r) abline(0, 1)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"","code":"run_extra_optimization(object, nlminb_loops = 0, newton_loops = 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"object object sdmTMB(). nlminb_loops many extra times run stats::nlminb() optimization. Sometimes restarting optimizer previous best values aids convergence. newton_loops many extra Newton optimization loops try stats::optimHess(). Sometimes aids convergence.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"updated model fit class sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"","code":"# Run extra optimization steps to help convergence: # (Not typically needed) fit <- sdmTMB(density ~ 0 + poly(depth, 2) + as.factor(year), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie()) fit_1 <- run_extra_optimization(fit, newton_loops = 1) max(fit$gradients) #> [1] 5.791343e-09 max(fit_1$gradients) #> [1] 5.791343e-09"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":null,"dir":"Reference","previous_headings":"","what":"Sanity check of an sdmTMB model — sanity","title":"Sanity check of an sdmTMB model — sanity","text":"Sanity check sdmTMB model","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Sanity check of an sdmTMB model — sanity","text":"","code":"sanity(object, big_sd_log10 = 2, gradient_thresh = 0.001, silent = FALSE)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Sanity check of an sdmTMB model — sanity","text":"object Fitted model sdmTMB(). big_sd_log10 Value check size standard errors . value 2 indicate standard errors greater 10^2 (.e., 100) flagged. gradient_thresh Gradient threshold issue warning. silent Logical: suppress messages? Useful set TRUE running large numbers models just interested returning sanity list objects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Sanity check of an sdmTMB model — sanity","text":"invisible named list checks.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Sanity check of an sdmTMB model — sanity","text":"object NA, NULL, class \"try-error\", sanity() return FALSE. facilitate using sanity() models try() tryCatch(). 
See examples section.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Sanity check of an sdmTMB model — sanity","text":"","code":"fit <- sdmTMB( present ~ s(depth), data = pcod_2011, mesh = pcod_mesh_2011, family = binomial() ) sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large s <- sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large s #> $hessian_ok #> [1] TRUE #> #> $eigen_values_ok #> [1] TRUE #> #> $nlminb_ok #> [1] TRUE #> #> $range_ok #> [1] TRUE #> #> $gradients_ok #> [1] TRUE #> #> $se_magnitude_ok #> [1] TRUE #> #> $se_na_ok #> [1] TRUE #> #> $sigmas_ok #> [1] TRUE #> #> $all_ok #> [1] TRUE #> # If fitting many models in a loop, you may want to wrap # sdmTMB() in try() to handle errors. sanity() will take an object # of class \"try-error\" and return FALSE. # Here, we will use stop() to simulate a failed sdmTMB() fit: failed_fit <- try(stop()) #> Error in try(stop()) : s2 <- sanity(failed_fit) all(unlist(s)) #> [1] TRUE all(unlist(s2)) #> [1] FALSE"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Fit spatial spatiotemporal generalized linear mixed effects model (GLMM) TMB (Template Model Builder) R package SPDE (stochastic partial differential equation) approximation Gaussian random fields.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"","code":"sdmTMB( formula, data, mesh, time = NULL, family = gaussian(link = \"identity\"), spatial = c(\"on\", \"off\"), spatiotemporal = c(\"iid\", \"ar1\", \"rw\", \"off\"), share_range = TRUE, time_varying = NULL, time_varying_type = c(\"rw\", \"rw0\", \"ar1\"), spatial_varying = NULL, weights = NULL, offset = NULL, extra_time = NULL, reml = FALSE, silent = TRUE, anisotropy = FALSE, control = sdmTMBcontrol(), priors = sdmTMBpriors(), knots = NULL, bayesian = FALSE, previous_fit = NULL, do_fit = TRUE, do_index = FALSE, predict_args = NULL, index_args = NULL, experimental = NULL )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"formula Model formula. IID random intercepts possible using lme4 syntax, e.g., + (1 | g) g column class character factor representing groups. Penalized splines possible via mgcv s(). 
Optionally list delta (hurdle) models. See examples details . data data frame. mesh object make_mesh(). time optional time column name (character). Can left NULL model spatial random fields; however, data actually spatiotemporal wish use get_index() get_cog() downstream, supply time argument. family family link. Supports gaussian(), Gamma(), binomial(), poisson(), Beta(), nbinom2(), truncated_nbinom2(), nbinom1(), truncated_nbinom1(), censored_poisson(), gamma_mix(), lognormal_mix(), student(), tweedie(), gengamma(). Supports delta/hurdle models: delta_beta(), delta_gamma(), delta_gamma_mix(), delta_lognormal_mix(), delta_lognormal(), delta_truncated_nbinom2(), binomial family options, see 'Binomial families' Details section . spatial Estimate spatial random fields? Options '' / '' TRUE / FALSE. Optionally, list delta models, e.g. list('', ''). spatiotemporal Estimate spatiotemporal random fields 'iid' (independent identically distributed; default), stationary 'ar1' (first-order autoregressive), random walk ('rw'), fixed 0 ''. set '' time = NULL. delta model, can list. E.g., list('', 'ar1'). Note spatiotemporal standard deviation represents marginal steady-state standard deviation process case AR1. .e., scaled according correlation. See TMB documentation. AR1 correlation coefficient (rho) estimated close 1, say > 0.99, may wish switch random walk 'rw'. Capitalization ignored. TRUE gets converted 'iid' FALSE gets converted ''. share_range Logical: estimate shared spatial spatiotemporal range parameter (TRUE, default) independent range parameters (FALSE). delta model, can list. E.g., list(TRUE, FALSE). time_varying optional one-sided formula describing covariates modelled time-varying process. Set type process time_varying_type. See help time_varying_type warnings modelling first time step. Structure shared delta models. time_varying_type Type time-varying process apply time_varying formula. 'rw' indicates random walk first time step estimated independently (included legacy reasons), 'rw0' indicates random walk first time step estimated mean-zero normal prior, 'ar1' indicates stationary first-order autoregressive process first time step estimated mean-zero prior. case 'rw', careful include covariates (including intercept) main time-varying formula since first time step estimated independently. .e., case, least one ~ 0 ~ -1. Structure shared delta models. spatial_varying optional one-sided formula coefficients vary space random fields. Note likely want include fixed effect variable improve interpretability since random field assumed mean 0. (scaled) time column used, represent local-time-trend model. See doi:10.1111/ecog.05176 spatial trends vignette. Note predictor usually centered mean zero standard deviation approximately 1. spatial intercept controlled spatial argument; therefore, include exclude spatial intercept setting spatial = '' ''. time matters whether spatial_varying excludes intercept case factor predictors. case, spatial_varying excludes intercept (~ 0 ~ -1), set spatial = '' match. Structure must shared delta models. weights numeric vector representing optional likelihood weights conditional model. Implemented glmmTMB: weights sum one internally modified. Can also used trials binomial family; weights argument needs vector name variable data frame. See Details section . offset numeric vector representing model offset character value representing column name offset. delta/hurdle models, applies positive component. Usually log transformed variable. 
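As a hedged aside before the remaining arguments, a minimal sketch of the offset argument just described follows; the constant log_area column is invented purely for illustration (with real data you would use the log of the true sampled area or effort), and the package data pcod_2011 and pcod_mesh_2011 are used only so the lines run as written. d_off <- pcod_2011 d_off$log_area <- log(rep(2, nrow(d_off))) # hypothetical constant area swept; illustration only fit_off <- sdmTMB( density ~ depth_scaled + depth_scaled2, data = d_off, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), spatial = \"off\", offset = \"log_area\" # character column name; a numeric vector is also accepted )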
extra_time Optional extra time slices (e.g., years) include interpolation forecasting predict function. See Details section . reml Logical: use REML (restricted maximum likelihood) estimation rather maximum likelihood? Internally, adds fixed effects list random effects integrate . silent Silent include optimization details? Helpful set FALSE models take fit. anisotropy Logical: allow anisotropy (spatial correlation directionally dependent)? See plot_anisotropy(). Must shared across delta models. control Optimization control options via sdmTMBcontrol(). priors Optional penalties/priors via sdmTMBpriors(). Must currently shared across delta models. knots Optional named list containing knot values used basis construction smoothing terms. See mgcv::gam() mgcv::gamm(). E.g., s(x, bs = 'cc', k = 4), knots = list(x = c(1, 2, 3, 4)) bayesian Logical indicating model passed tmbstan. TRUE, Jacobian adjustments applied account parameter transformations priors applied. previous_fit previously fitted sdmTMB model initialize optimization . Can greatly speed fitting. Note model must set exactly way. However, data weights arguments can change, can useful cross-validation. do_fit Fit model (TRUE) return processed data without fitting (FALSE)? do_index index standardization calculations fitting? Saves memory time working large datasets projection grids since TMB object rebuilt predict.sdmTMB() get_index(). TRUE, predict_args must newdata element supplied area can supplied index_args. users can ignore option. fitted object can passed directly get_index(). predict_args list arguments pass predict.sdmTMB() do_index = TRUE. users can ignore option. index_args list arguments pass get_index() do_index = TRUE. Currently, area supported. Bias correction can done calling get_index() resulting fitted object. users can ignore option. experimental named list esoteric -development options. dragons.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"object (list) class sdmTMB. Useful elements include: sd_report: output TMB::sdreport() gradients: marginal log likelihood gradients respect fixed effect model: output stats::nlminb() data: fitted data mesh: object supplied mesh argument family: family object, includes inverse link function family$linkinv() tmb_params: parameters list passed TMB::MakeADFun() tmb_map: 'map' list passed TMB::MakeADFun() tmb_data: data list passed TMB::MakeADFun() tmb_obj: TMB object created TMB::MakeADFun()","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Model description See model description vignette relevant appendix preprint sdmTMB: doi:10.1101/2022.03.24.485545 Binomial families Following structure stats::glm() glmmTMB, binomial family can specified one 4 ways: (1) response may factor (model classifies first level versus others), (2) response may binomial (0/1), (3) response can matrix form cbind(success, failure), (4) response may observed proportions, 'weights' argument used specify Binomial size (N) parameter (prob ~ ..., weights = N). Smooth terms Smooth terms can included following GAMs (generalized additive models) using + s(x), implements smooth mgcv::s(). sdmTMB uses penalized smooths, constructed via mgcv::smooth2random(). 
similar approach implemented gamm4 brms, among packages. Within smooths, syntax commonly used mgcv::s() mgcv::t2() can applied, e.g. 2-dimensional smooths may constructed + s(x, y) + t2(x, y); smooths can specific various factor levels, + s(x, = group); basis function dimensions may specified, e.g. + s(x, k = 4); various types splines may constructed cyclic splines model seasonality (perhaps knots argument also supplied). Threshold models linear break-point relationship covariate can included via + breakpt(variable) formula, variable single covariate corresponding column data. case, relationship linear point constant (hockey-stick shaped). Similarly, logistic-function threshold model can included via + logistic(variable). option models relationship logistic function 50% 95% values. similar length- size-based selectivity fisheries, parameterized points f(x) = 0.5 0.95. See threshold vignette. Note single threshold covariate can included covariate included components delta families. Extra time: forecasting interpolating Extra time slices (e.g., years) can included interpolation forecasting predict function via extra_time argument. predict function requires time slices defined fitting model ensure various time indices set correctly. careful including extra time slices model remains identifiable. example, including + .factor(year) formula render model data inform expected value missing year. sdmTMB() makes attempt determine model makes sense forecasting interpolation. options time_varying, spatiotemporal = \"rw\", spatiotemporal = \"ar1\", smoother time column provide mechanisms predict missing time slices process error. extra_time can also used fill missing time steps purposes random walk AR(1) process gaps time steps uneven. extra_time can include extra time steps time steps including found fitted data. latter option may simpler. Regularization priors can achieve regularization via penalties (priors) fixed effect parameters. See sdmTMBpriors(). can fit model without penalties look output print(your_model) tidy(your_model) fit model do_fit = FALSE inspect head(your_model$tmb_data$X_ij[[1]]) want see formula translated fixed effect model matrix. Also see Bayesian vignette. Delta/hurdle models Delta models (also known hurdle models) can fit two separate models time using appropriate delta family. E.g.: delta_gamma(), delta_beta(), delta_lognormal(), delta_truncated_nbinom2(). fit delta family, default formula, spatial, spatiotemporal components shared. elements can specified independently two models using list format. include formula, spatial, spatiotemporal, share_range. first element list binomial component second element positive component (e.g., Gamma). elements must shared now (e.g., spatially varying coefficients, time-varying coefficients). Furthermore, currently limitations specifying two formulas list: two formulas smoothers, threshold effects, random intercepts. now, must specified single formula shared across two models. main advantage specifying models using delta family (compared fitting two separate models) (1) coding simplicity (2) calculation uncertainty derived quantities index abundance get_index() using generalized delta method within TMB. Also, selected parameters can shared across models. See delta-model vignette. Index standardization index standardization, may wish include 0 + .factor(year) (whatever time column called) formula. See basic example index standardization relevant package vignette. need specify time argument. 
See get_index().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Main reference introducing package cite using sdmTMB: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett. 2022. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi:10.1101/2022.03.24.485545 . Reference local trends: Barnett, L..K., E.J. Ward, S.C. Anderson. 2021. Improving estimates species distribution change incorporating local trends. Ecography. 44(3):427-439. doi:10.1111/ecog.05176 . explanation model application calculating climate velocities: English, P., E.J. Ward, C.N. Rooper, R.E. Forrest, L.. Rogers, K.L. Hunter, .M. Edwards, B.M. Connors, S.C. Anderson. 2021. Contrasting climate velocity impacts warm cool locations show effects marine warming worse already warmer temperate waters. Fish Fisheries. 23(1) 239-255. doi:10.1111/faf.12613 . Discussion illustration decision points fitting models: Commander, C.J.C., L..K. Barnett, E.J. Ward, S.C. Anderson, T.E. Essington. 2022. shadow model: small choices spatially explicit species distribution models affect predictions. PeerJ 10: e12783. doi:10.7717/peerj.12783 . Application description threshold/break-point models: Essington, T.E., S.C. Anderson, L..K. Barnett, H.M. Berger, S.. Siedlecki, E.J. Ward. 2022. Advancing statistical models reveal effect dissolved oxygen spatial distribution marine taxa using thresholds physiologically based index. Ecography. 2022: e06249 doi:10.1111/ecog.06249 . Application fish body condition: Lindmark, M., S.C. Anderson, M. Gogina, M. Casini. Evaluating drivers spatiotemporal individual condition bottom-associated marine fish. bioRxiv 2022.04.19.488709. doi:10.1101/2022.04.19.488709 . Several sections original TMB model code adapted VAST R package: Thorson, J.T. 2019. Guidance decisions using Vector Autoregressive Spatio-Temporal (VAST) package stock, ecosystem, habitat climate assessments. Fish. Res. 210:143–161. doi:10.1016/j.fishres.2018.10.013 . Code family R--TMB implementation, selected parameterizations observation likelihoods, general package structure inspiration, idea behind TMB prediction approach adapted glmmTMB R package: Brooks, M.E., K. Kristensen, K.J. van Benthem, . Magnusson, C.W. Berg, . Nielsen, H.J. Skaug, M. Maechler, B.M. Bolker. 2017. glmmTMB Balances Speed Flexibility Among Packages Zero-inflated Generalized Linear Mixed Modeling. R Journal, 9(2):378-400. doi:10.32614/rj-2017-066 . Implementation geometric anisotropy SPDE use random field GLMMs index standardization: Thorson, J.T., .O. Shelton, E.J. Ward, H.J. Skaug. 2015. Geostatistical delta-generalized linear mixed models improve precision estimated abundance indices West Coast groundfishes. ICES J. Mar. Sci. 72(5): 1297–1310. 
doi:10.1093/icesjms/fsu243 .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"","code":"library(sdmTMB) # Build a mesh to implement the SPDE approach: mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 20) # - this example uses a fairly coarse mesh so these examples run quickly # - 'cutoff' is the minimum distance between mesh vertices in units of the # x and y coordinates # - 'cutoff = 10' might make more sense in applied situations for this dataset # - or build any mesh in 'fmesher' and pass it to the 'mesh' argument in make_mesh()` # - the mesh is not needed if you will be turning off all # spatial/spatiotemporal random fields # Quick mesh plot: plot(mesh) # Fit a Tweedie spatial random field GLMM with a smoother for depth: fit <- sdmTMB( density ~ s(depth), data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.16 0.34 #> sdepth 1.94 3.13 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.07 #> #> Dispersion parameter: 13.68 #> Tweedie p: 1.58 #> Matérn range: 16.84 #> Spatial SD: 2.20 #> ML criterion at convergence: 2937.789 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Extract coefficients: tidy(fit, conf.int = TRUE) #> # A tibble: 1 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 2.16 0.340 1.50 2.83 tidy(fit, effects = \"ran_par\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 16.8 13.7 3.40 83.3 #> 2 phi 13.7 0.663 12.4 15.0 #> 3 sigma_O 2.20 1.23 0.735 6.59 #> 4 tweedie_p 1.58 0.0153 1.55 1.61 # Perform several 'sanity' checks: sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large # Predict on the fitted data; see ?predict.sdmTMB p <- predict(fit) # Predict on new data: p <- predict(fit, newdata = qcs_grid) head(p) #> X Y depth depth_scaled depth_scaled2 est est_non_rf est_rf #> 1 456 5636 347.0834 1.5608122 2.43613479 -4.726638 -4.567385 -0.15925308 #> 2 458 5636 223.3348 0.5697699 0.32463771 2.342470 2.368314 -0.02584421 #> 3 460 5636 203.7408 0.3633693 0.13203724 3.087513 2.979948 0.10756466 #> 4 462 5636 183.2987 0.1257046 0.01580166 3.878560 3.637586 0.24097353 #> 5 464 5636 182.9998 0.1220368 0.01489297 4.020914 3.646532 0.37438240 #> 6 466 5636 186.3892 0.1632882 0.02666303 4.050895 3.543104 0.50779127 #> omega_s #> 1 -0.15925308 #> 2 -0.02584421 #> 3 0.10756466 #> 4 0.24097353 #> 5 0.37438240 #> 6 0.50779127 # \\donttest{ # Visualize the depth effect with ggeffects: ggeffects::ggpredict(fit, \"depth [all]\") |> plot() # Visualize depth effect with visreg: (see ?visreg_delta) visreg::visreg(fit, xvar = \"depth\") # link space; randomized quantile residuals visreg::visreg(fit, xvar = \"depth\", scale = \"response\") visreg::visreg(fit, xvar = \"depth\", scale = \"response\", gg = TRUE, rug = FALSE) # Add 
spatiotemporal random fields: fit <- sdmTMB( density ~ 0 + as.factor(year), time = \"year\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> as.factor(year)2011 2.76 0.36 #> as.factor(year)2013 3.10 0.35 #> as.factor(year)2015 3.21 0.35 #> as.factor(year)2017 2.47 0.36 #> #> Dispersion parameter: 14.83 #> Tweedie p: 1.57 #> Matérn range: 13.31 #> Spatial SD: 3.16 #> Spatiotemporal IID SD: 1.79 #> ML criterion at convergence: 3007.552 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Make the fields AR1: fit <- sdmTMB( density ~ s(depth), time = \"year\", spatial = \"off\", spatiotemporal = \"ar1\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.84 0.33 #> sdepth 1.96 3.27 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.6 #> #> Dispersion parameter: 12.84 #> Tweedie p: 1.55 #> Spatiotemporal AR1 correlation (rho): 0.67 #> Matérn range: 12.22 #> Spatiotemporal marginal AR1 SD: 3.28 #> ML criterion at convergence: 2914.393 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Make the fields a random walk: fit <- sdmTMB( density ~ s(depth), time = \"year\", spatial = \"off\", spatiotemporal = \"rw\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.95 0.34 #> sdepth 1.96 3.18 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.22 #> #> Dispersion parameter: 12.84 #> Tweedie p: 1.56 #> Matérn range: 14.66 #> Spatiotemporal RW SD: 2.17 #> ML criterion at convergence: 2919.181 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Depth smoothers by year: fit <- sdmTMB( density ~ s(depth, by = as.factor(year)), #< time = \"year\", spatial = \"off\", spatiotemporal = \"rw\", data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth, by = as.factor(year)) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.76 0.34 #> sdepth):as.factor(year)2011 0.07 4.02 #> sdepth):as.factor(year)2013 4.59 3.28 #> sdepth):as.factor(year)2015 5.97 6.01 #> sdepth):as.factor(year)2017 -1.97 3.22 #> #> Smooth terms: #> Std. Dev. #> sds(depth):as.factor(year)2011) 16.62 #> sds(depth):as.factor(year)2013) 13.57 #> sds(depth):as.factor(year)2015) 28.24 #> sds(depth):as.factor(year)2017) 18.65 #> #> Dispersion parameter: 12.70 #> Tweedie p: 1.55 #> Matérn range: 8.62 #> Spatiotemporal RW SD: 3.14 #> ML criterion at convergence: 2924.193 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
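# Added sketch (not part of the original examples; illustrative only): # a basic index-standardization workflow with the year-factor structure # described in the Details section. It is assumed here that get_index() # needs predictions made with return_tmb_object = TRUE. fit_index <- sdmTMB( density ~ 0 + as.factor(year), time = \"year\", data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) nd <- replicate_df(qcs_grid, \"year\", unique(pcod_2011$year)) p_index <- predict(fit_index, newdata = nd, return_tmb_object = TRUE) index <- get_index(p_index) # one row per year; see ?get_index for bias correction and area options head(index)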
# 2D depth-year smoother: fit <- sdmTMB( density ~ s(depth, year), #< spatial = \"off\", data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth, year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.55 0.24 #> sdepthyear_1 -0.15 0.09 #> sdepthyear_2 -2.93 2.07 #> #> Smooth terms: #> Std. Dev. #> sds(depth,year) 6.08 #> #> Dispersion parameter: 14.95 #> Tweedie p: 1.60 #> ML criterion at convergence: 2974.143 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Turn off spatial random fields: fit <- sdmTMB( present ~ poly(log(depth)), spatial = \"off\", #< data = pcod_2011, mesh = mesh, family = binomial() ) fit #> Model fit by ML ['sdmTMB'] #> Formula: present ~ poly(log(depth)) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.16 0.07 #> poly(log(depth)) -13.19 2.14 #> #> ML criterion at convergence: 648.334 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Which, matches glm(): fit_glm <- glm( present ~ poly(log(depth)), data = pcod_2011, family = binomial() ) summary(fit_glm) #> #> Call: #> glm(formula = present ~ poly(log(depth)), family = binomial(), #> data = pcod_2011) #> #> Coefficients: #> Estimate Std. Error z value Pr(>|z|) #> (Intercept) -0.16433 0.06583 -2.496 0.0126 * #> poly(log(depth)) -13.18981 2.14179 -6.158 7.35e-10 *** #> --- #> Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 #> #> (Dispersion parameter for binomial family taken to be 1) #> #> Null deviance: 1337.2 on 968 degrees of freedom #> Residual deviance: 1296.7 on 967 degrees of freedom #> AIC: 1300.7 #> #> Number of Fisher Scoring iterations: 4 #> AIC(fit, fit_glm) #> df AIC #> fit 2 1300.668 #> fit_glm 2 1300.668 # Delta/hurdle binomial-Gamma model: fit_dg <- sdmTMB( density ~ poly(log(depth), 2), data = pcod_2011, mesh = mesh, spatial = \"off\", family = delta_gamma() #< ) fit_dg #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(log(depth), 2) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(log(depth), 2)1 -23.06 3.15 #> poly(log(depth), 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(log(depth), 2)1 -5.49 3.50 #> poly(log(depth), 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
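# Added sketch (illustrative only): the binomial response can also be supplied # in cbind(successes, failures) form, one of the options listed under # 'Binomial families' in the Details section. With a 0/1 response this is # equivalent to the earlier fit that used 'present' directly: fit_cbind <- sdmTMB( cbind(present, 1 - present) ~ poly(log(depth)), data = pcod_2011, mesh = mesh, spatial = \"off\", family = binomial() )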
# Delta model with different formulas and spatial structure: fit_dg <- sdmTMB( list(density ~ depth_scaled, density ~ poly(depth_scaled, 2)), #< data = pcod_2011, mesh = mesh, spatial = list(\"off\", \"on\"), #< family = delta_gamma() ) fit_dg #> Spatial model fit by ML ['sdmTMB'] #> Formula: list(density ~ depth_scaled, density ~ poly(depth_scaled, 2)) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.17 0.07 #> depth_scaled -0.43 0.07 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.08 0.14 #> poly(depth_scaled, 2)1 -6.15 4.52 #> poly(depth_scaled, 2)2 -12.58 4.14 #> #> Dispersion parameter: 0.72 #> Matérn range: 0.01 #> Spatial SD: 2149.14 #> #> ML criterion at convergence: 3034.512 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! Check output of sanity().** # Delta/hurdle truncated NB2: pcod_2011$count <- round(pcod_2011$density) fit_nb2 <- sdmTMB( count ~ s(depth), data = pcod_2011, mesh = mesh, spatial = \"off\", family = delta_truncated_nbinom2() #< ) fit_nb2 #> Model fit by ML ['sdmTMB'] #> Formula: count ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_truncated_nbinom2(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.69 0.21 #> sdepth 0.29 2.37 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 9.6 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: truncated_nbinom2(link = 'log') #> coef.est coef.se #> (Intercept) 4.18 0.22 #> sdepth -0.37 1.77 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 6.73 #> #> Dispersion parameter: 0.49 #> #> ML criterion at convergence: 2915.733 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Regular NB2: fit_nb2 <- sdmTMB( count ~ s(depth), data = pcod_2011, mesh = mesh, spatial = \"off\", family = nbinom2() #< ) fit_nb2 #> Model fit by ML ['sdmTMB'] #> Formula: count ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.49 0.27 #> sdepth 3.61 3.96 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 16.4 #> #> Dispersion parameter: 0.14 #> ML criterion at convergence: 3006.939 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # IID random intercepts by year: pcod_2011$fyear <- as.factor(pcod_2011$year) fit <- sdmTMB( density ~ s(depth) + (1 | fyear), #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) + (1 | fyear) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.13 0.37 #> sdepth 1.82 2.99 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 12.51 #> #> Random intercepts: #> Std. Dev. #> fyear 0.3 #> #> Dispersion parameter: 13.55 #> Tweedie p: 1.58 #> Matérn range: 16.66 #> Spatial SD: 2.21 #> ML criterion at convergence: 2933.138 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
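# Added sketch (illustrative only): 'extra_time' appends extra time slices so # predict() can be called for them, as described under 'Extra time: forecasting # and interpolating' in the Details section. A random-walk field supplies the # forecasting mechanism; the years 2019 and 2021 are arbitrary choices here. fit_fc <- sdmTMB( density ~ 1, time = \"year\", spatial = \"off\", spatiotemporal = \"rw\", extra_time = c(2019, 2021), data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) nd_fc <- replicate_df(qcs_grid, \"year\", c(unique(pcod_2011$year), 2019, 2021)) p_fc <- predict(fit_fc, newdata = nd_fc)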
# Spatially varying coefficient of year: pcod_2011$year_scaled <- as.numeric(scale(pcod_2011$year)) fit <- sdmTMB( density ~ year_scaled, spatial_varying = ~ 0 + year_scaled, #< data = pcod_2011, mesh = mesh, family = tweedie(), time = \"year\" ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ year_scaled #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.86 0.33 #> year_scaled -0.06 0.15 #> #> Dispersion parameter: 14.79 #> Tweedie p: 1.56 #> Matérn range: 20.56 #> Spatial SD: 2.38 #> Spatially varying coefficient SD (year_scaled): 0.81 #> Spatiotemporal IID SD: 1.11 #> ML criterion at convergence: 3008.886 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Time-varying effects of depth and depth squared: fit <- sdmTMB( density ~ 0 + as.factor(year), time_varying = ~ 0 + depth_scaled + depth_scaled2, #< data = pcod_2011, time = \"year\", mesh = mesh, family = tweedie() ) print(fit) #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> as.factor(year)2011 3.73 0.30 #> as.factor(year)2013 3.64 0.28 #> as.factor(year)2015 4.00 0.29 #> as.factor(year)2017 3.31 0.32 #> #> Time-varying parameters: #> coef.est coef.se #> depth_scaled-2011 -0.87 0.16 #> depth_scaled-2013 -0.81 0.13 #> depth_scaled-2015 -0.75 0.13 #> depth_scaled-2017 -1.11 0.23 #> depth_scaled2-2011 -1.92 0.29 #> depth_scaled2-2013 -0.92 0.14 #> depth_scaled2-2015 -1.59 0.22 #> depth_scaled2-2017 -2.20 0.35 #> #> Dispersion parameter: 12.80 #> Tweedie p: 1.56 #> Matérn range: 0.01 #> Spatial SD: 2573.00 #> Spatiotemporal IID SD: 2087.99 #> ML criterion at convergence: 2911.371 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! Check output of sanity().** # Extract values: est <- as.list(fit$sd_report, \"Estimate\") se <- as.list(fit$sd_report, \"Std. Error\") est$b_rw_t[, , 1] #> [,1] [,2] #> [1,] -0.8738038 -1.9199722 #> [2,] -0.8115162 -0.9195934 #> [3,] -0.7514766 -1.5858307 #> [4,] -1.1064060 -2.1992575 se$b_rw_t[, , 1] #> [,1] [,2] #> [1,] 0.1619744 0.2944087 #> [2,] 0.1287059 0.1386946 #> [3,] 0.1346182 0.2197914 #> [4,] 0.2305927 0.3514650 # Linear break-point effect of depth: fit <- sdmTMB( density ~ breakpt(depth_scaled), #< data = pcod_2011, mesh = mesh, family = tweedie() ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ breakpt(depth_scaled) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 4.11 0.33 #> depth_scaled-slope 1.07 0.22 #> depth_scaled-breakpt -1.30 0.25 #> #> Dispersion parameter: 15.19 #> Tweedie p: 1.58 #> Matérn range: 23.57 #> Spatial SD: 1.92 #> ML criterion at convergence: 2997.241 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation with sdmTMB models — sdmTMB_cv","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"Facilitates cross validation sdmTMB models. Returns log likelihood left-data, similar spirit ELPD (expected log pointwise predictive density). function option leave-future-cross validation. 
default, function creates folds randomly folds can manually assigned via fold_ids argument.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"","code":"sdmTMB_cv( formula, data, mesh_args, mesh = NULL, time = NULL, k_folds = 8, fold_ids = NULL, lfo = FALSE, lfo_forecast = 1, lfo_validations = 5, parallel = TRUE, use_initial_fit = FALSE, future_globals = NULL, spde = deprecated(), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"formula Model formula. data data frame. mesh_args Arguments make_mesh(). supplied, mesh reconstructed fold. mesh Output make_mesh(). supplied, mesh constant across folds. time name time column. Leave NULL spatial data. k_folds Number folds. fold_ids Optional vector containing user fold IDs. Can also single string, e.g. \"fold_id\" representing name variable data. Ignored lfo TRUE lfo Whether implement leave-future-(LFO) cross validation data used predict future folds. time argument sdmTMB() must specified. See Details section . lfo_forecast lfo = TRUE, number time steps forecast. Time steps 1, ..., T used predict T + lfo_forecast last forecasted time step used validation. See Details section . lfo_validations lfo = TRUE, number times step LFOCV process. Defaults 5. See Details section . parallel TRUE future::plan() supplied, run parallel. use_initial_fit Fit first fold use parameter values starting values subsequent folds? Can faster many folds. future_globals character vector global variables used within arguments error returned future.apply find object. vector appended TRUE passed argument future.globals future.apply::future_lapply(). Useful global objects used specify arguments like priors, families, etc. spde Depreciated. Use mesh instead. ... arguments required run sdmTMB() model exception weights, used define folds.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"list: data: Original data plus columns fold ID, CV predicted value, CV log likelihood. models: list models; one per fold. fold_loglik: Sum left-log likelihoods per fold. positive values better. sum_loglik: Sum fold_loglik across left-data. positive values better. pdHess: Logical vector: Hessian invertible fold? converged: Logical: pdHess TRUE? max_gradients: Max gradient per fold. Prior sdmTMB version '0.3.0.9002', elpd incorrectly returned log average likelihood, another metric compare models , ELPD. maximum likelihood, ELPD equivalent spirit sum log likelihoods.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"Parallel processing Parallel processing can used setting future::plan(). example: Leave-future-cross validation (LFOCV) example LFOCV 9 time steps, lfo_forecast = 1, lfo_validations = 2: Fit data time steps 1 7, predict validate step 8. Fit data time steps 1 8, predict validate step 9. example LFOCV 9 time steps, lfo_forecast = 2, lfo_validations = 3: Fit data time steps 1 5, predict validate step 7. 
Fit data time steps 1 6, predict validate step 8. Fit data time steps 1 7, predict validate step 9. See example .","code":"library(future) plan(multisession) # now use sdmTMB_cv() ..."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) # Set parallel processing first if desired with the future package. # See the Details section above. m_cv <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. m_cv$fold_loglik #> [1] -3351.292 -3298.167 m_cv$sum_loglik #> [1] -6649.459 head(m_cv$data) #> # A tibble: 6 × 15 #> year X Y depth density present lat lon depth_mean depth_sd #> #> 1 2003 446. 5793. 201 113. 1 52.3 -130. 5.16 0.445 #> 2 2003 446. 5800. 212 41.7 1 52.3 -130. 5.16 0.445 #> 3 2003 449. 5802. 220 0 0 52.4 -130. 5.16 0.445 #> 4 2003 437. 5802. 197 15.7 1 52.4 -130. 5.16 0.445 #> 5 2003 421. 5771. 256 0 0 52.1 -130. 5.16 0.445 #> 6 2003 418. 5772. 293 0 0 52.1 -130. 5.16 0.445 #> # ℹ 5 more variables: depth_scaled , depth_scaled2 , cv_fold , #> # cv_predicted , cv_loglik m_cv$models[[1]] #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + depth_scaled + depth_scaled2 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> depth_scaled -2.07 0.23 #> depth_scaled2 -1.57 0.14 #> #> Dispersion parameter: 14.61 #> Tweedie p: 1.64 #> Matérn range: 100.81 #> Spatial SD: 3.11 #> ML criterion at convergence: 3191.001 #> #> See ?tidy.sdmTMB to extract these values as a data frame. m_cv$max_gradients #> [1] 2.788299e-08 6.945591e-10 # \\donttest{ # Create mesh each fold: m_cv2 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh_args = list(xy_cols = c(\"X\", \"Y\"), cutoff = 20), family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Use fold_ids: m_cv3 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), fold_ids = rep(seq(1, 3), nrow(pcod))[seq(1, nrow(pcod))] ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":null,"dir":"Reference","previous_headings":"","what":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"sdmTMB_simulate() uses TMB simulate new data given specified parameter values. 
simulate.sdmTMB(), hand, takes existing model fit simulates new observations optionally new random effects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"","code":"sdmTMB_simulate( formula, data, mesh, family = gaussian(link = \"identity\"), time = NULL, B = NULL, range = NULL, rho = NULL, sigma_O = NULL, sigma_E = NULL, sigma_Z = NULL, phi = NULL, tweedie_p = NULL, df = NULL, threshold_coefs = NULL, fixed_re = list(omega_s = NULL, epsilon_st = NULL, zeta_s = NULL), previous_fit = NULL, seed = sample.int(1e+06, 1), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"formula one-sided formula describing fixed-effect structure. Random intercepts (yet) supported. Fixed effects match corresponding B argument vector coefficient values. data data frame containing predictors described formula time column time specified. mesh Output make_mesh(). family Family sdmTMB(). Delta families supported. Instead, simulate two component models separately combine. time time column name. B vector beta values (fixed-effect coefficient values). range Parameter controls decay spatial correlation. vector length 2, share_range set FALSE spatial spatiotemporal ranges unique. rho Spatiotemporal correlation years; -1 1. sigma_O SD spatial process (Omega). sigma_E SD spatiotemporal process (Epsilon). sigma_Z SD spatially varying coefficient field (Zeta). phi Observation error scale parameter (e.g., SD Gaussian). tweedie_p Tweedie p (power) parameter; 1 2. df Student-t degrees freedom. threshold_coefs optional vector threshold coefficient values formula includes breakpt() logistic(). breakpt(), slope cut values. logistic(), threshold function 50% maximum, threshold function 95% maximum, maximum. See model description vignette details. fixed_re list optional random effects fix specified (e.g., previously estimated) values. Values NULL result random effects simulated. previous_fit (Deprecated; please use simulate.sdmTMB()). optional previous sdmTMB() fit pull parameter values. -ruled non-NULL specified parameter arguments. seed Seed number. ... arguments pass sdmTMB().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"data frame : 1st column time variable (present). 2nd 3rd columns spatial coordinates. omega_s represents simulated spatial random effects (present). zeta_s represents simulated spatial varying covariate field (present). epsilon_st represents simulated spatiotemporal random effects (present). eta true value link space mu true value inverse link space. observed represents simulated process observation error. 
remaining columns fixed-effect model matrix.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"","code":"set.seed(123) # make fake predictor(s) (a1) and sampling locations: predictor_dat <- data.frame( X = runif(300), Y = runif(300), a1 = rnorm(300), year = rep(1:6, each = 50) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) sim_dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.5, sigma_E = 0.1, phi = 0.1, sigma_O = 0.2, seed = 42, B = c(0.2, -0.4) # B0 = intercept, B1 = a1 slope ) head(sim_dat) #> year X Y omega_s epsilon_st mu eta #> 1 1 0.2875775 0.784575267 -0.02131861 -0.02779393 0.4369843 0.4369843 #> 2 1 0.7883051 0.009429905 0.28852319 0.09092583 0.8805246 0.8805246 #> 3 1 0.4089769 0.779065883 0.13541643 -0.08468148 0.6261504 0.6261504 #> 4 1 0.8830174 0.729390652 0.28597232 -0.01660011 0.8903775 0.8903775 #> 5 1 0.9404673 0.630131853 0.21070545 0.02005202 0.6056213 0.6056213 #> 6 1 0.0455565 0.480910830 -0.08071932 -0.11409909 -0.1272901 -0.1272901 #> observed (Intercept) a1 #> 1 0.4176273 1 -0.7152422 #> 2 0.8802910 1 -0.7526890 #> 3 0.6248675 1 -0.9385387 #> 4 0.9055722 1 -1.0525133 #> 5 0.6654724 1 -0.4371595 #> 6 -0.1399113 1 0.3311792 if (require(\"ggplot2\", quietly = TRUE)) { ggplot(sim_dat, aes(X, Y, colour = observed)) + geom_point() + facet_wrap(~year) + scale_color_gradient2() } # fit to the simulated data: fit <- sdmTMB(observed ~ a1, data = sim_dat, mesh = mesh, time = \"year\") fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.23 0.09 #> a1 -0.39 0.01 #> #> Dispersion parameter: 0.09 #> Matérn range: 0.40 #> Spatial SD: 0.21 #> Spatiotemporal IID SD: 0.11 #> ML criterion at convergence: -162.527 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"approach described Yao et al. (2018) doi:10.1214/17-BA1091 . general method minimizes (maximizes) quantity across models. simple models normal error, may root mean squared error (RMSE), approaches include log score. adopt latter , log scores used generate stacking predictive distributions","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"","code":"sdmTMB_stacking(model_list, include_folds = NULL)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"model_list list models fit sdmTMB_cv() generate estimates predictive densities. want set seed value fitting model manually construct fold IDs across models. 
include_folds optional numeric vector specifying folds include calculations. example, 5 folds used k-fold cross validation, first 4 needed generate weights, include_folds = 1:4.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"vector model weights.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"Yao, Y., Vehtari, ., Simpson, D., Gelman, . 2018. Using Stacking Average Bayesian Predictive Distributions (Discussion). Bayesian Analysis 13(3): 917–1007. International Society Bayesian Analysis. doi:10.1214/17-BA1091","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"","code":"# \\donttest{ # Set parallel processing if desired. See 'Details' in ?sdmTMB_cv # Depth as quadratic: set.seed(1) m_cv_1 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Depth as linear: set.seed(1) m_cv_2 <- sdmTMB_cv( density ~ 0 + depth_scaled, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Only an intercept: set.seed(1) m_cv_3 <- sdmTMB_cv( density ~ 1, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. models <- list(m_cv_1, m_cv_2, m_cv_3) weights <- sdmTMB_stacking(models) weights #> [1] 0.9038042 0.0182349 0.0779609 # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":null,"dir":"Reference","previous_headings":"","what":"Optimization control options — sdmTMBcontrol","title":"Optimization control options — sdmTMBcontrol","text":"sdmTMB() stats::nlminb() control options.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Optimization control options — sdmTMBcontrol","text":"","code":"sdmTMBcontrol( eval.max = 2000L, iter.max = 1000L, normalize = FALSE, nlminb_loops = 1L, newton_loops = 1L, mgcv = deprecated(), quadratic_roots = FALSE, start = NULL, map_rf = deprecated(), map = NULL, lower = NULL, upper = NULL, censored_upper = NULL, multiphase = TRUE, profile = FALSE, get_joint_precision = TRUE, parallel = getOption(\"sdmTMB.cores\", 1L), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Optimization control options — sdmTMBcontrol","text":"eval.max Maximum number evaluations objective function allowed. iter.max Maximum number iterations allowed. normalize Logical: use TMB::normalize() normalize process likelihood using Laplace approximation? 
Can result substantial speed boost cases. used default FALSE prior May 2021. Currently working models fit REML random intercepts. nlminb_loops many times run stats::nlminb() optimization. Sometimes restarting optimizer previous best values aids convergence. maximum gradient still large, try increasing 2. newton_loops many Newton optimization steps try running stats::nlminb(). sometimes aids convergence reducing log-likelihood gradient respect fixed effects. calculates Hessian current MLE stats::optimHess() using finite-difference approach uses update fixed effect estimates. mgcv Deprecated Parse formula mgcv::gam()? quadratic_roots Experimental feature internal use right now; may moved branch. Logical: quadratic roots calculated? Note: sdmTMB side, first two coefficients used generate quadratic parameters. means want generate quadratic profile depth, depth depth^2 part formula, need make sure listed first intercept included. example, formula = cpue ~ 0 + depth + depth2 + .factor(year). start named list specifying starting values parameters. can see necessary structure fitting model inspecting your_model$tmb_obj$env$parList(). Elements start specified replace default starting values. map_rf Deprecated use spatial = '', spatiotemporal = '' sdmTMB(). map named list factor NAs specifying parameter values fixed constant value. See documentation TMB::MakeADFun(). usually used start specify fixed value. lower optional named list lower bounds within optimization. Parameter vectors name (e.g., b_j ln_kappa cases) can specified numeric vector. E.g. lower = list(b_j = c(-5, -5)). upper optional named list upper bounds within optimization. censored_upper optional vector upper bounds sdmTMBcontrol(). Values NA indicate unbounded right-censored distribution, values greater observation indicate upper bound, values equal observation indicate censoring. multiphase Logical: estimate fixed random effects phases? Phases usually faster stable. profile Logical: population-level/fixed effects profiled likelihood? appended random effects vector without Laplace approximation. See TMB::MakeADFun(). can dramatically speed model fit many fixed effects experimental stage. get_joint_precision Logical. Passed getJointPrecision TMB::sdreport(). Must TRUE use simulation-based methods predict.sdmTMB() [get_index_sims()]. needed, setting FALSE reduce object size. parallel Argument currently ignored. parallel processing 3 cores, example, use TMB::openmp(n = 3, DLL = \"sdmTMB\"). careful, always faster cores definitely upper limit. ... Anything else. See 'Control parameters' section stats::nlminb().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Optimization control options — sdmTMBcontrol","text":"list control arguments","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Optimization control options — sdmTMBcontrol","text":"Usually used within sdmTMB(). 
example:","code":"sdmTMB(..., control = sdmTMBcontrol(newton_loops = 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Optimization control options — sdmTMBcontrol","text":"","code":"sdmTMBcontrol() #> $eval.max #> [1] 2000 #> #> $iter.max #> [1] 1000 #> #> $normalize #> [1] FALSE #> #> $nlminb_loops #> [1] 1 #> #> $newton_loops #> [1] 1 #> #> $profile #> [1] FALSE #> #> $quadratic_roots #> [1] FALSE #> #> $start #> NULL #> #> $map #> NULL #> #> $lower #> NULL #> #> $upper #> NULL #> #> $censored_upper #> NULL #> #> $multiphase #> [1] TRUE #> #> $parallel #> [1] 1 #> #> $get_joint_precision #> [1] TRUE #>"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Set delta model for ggeffects::ggpredict() — set_delta_model","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"Set delta model component predict ggeffects::ggpredict().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"","code":"set_delta_model(x, model = c(NA, 1, 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"x sdmTMB() model fit delta family delta_gamma(). model delta/hurdle model component predict/plot . NA combined prediction, 1 binomial part, 2 positive part.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"fitted model new attribute named delta_model_predict. suggest use set_delta_model() pipe (examples) attribute persist. Otherwise, predict.sdmTMB() choose model component default. can also remove attribute :","code":"attr(fit, \"delta_model_predict\") <- NULL"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"complete version examples : run CRAN version ggeffects > 1.3.2 CRAN. now, can install GitHub version ggeffects. 
https://github.com/strengejacke/ggeffects.","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2), data = pcod_2011, spatial = \"off\", family = delta_gamma()) # binomial part: set_delta_model(fit, model = 1) |> ggeffects::ggpredict(\"depth_scaled [all]\") # gamma part: set_delta_model(fit, model = 2) |> ggeffects::ggpredict(\"depth_scaled [all]\") # combined: set_delta_model(fit, model = NA) |> ggeffects::ggpredict(\"depth_scaled [all]\")"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2), data = pcod_2011, spatial = \"off\", family = delta_gamma()) # binomial part: set_delta_model(fit, model = 1) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # gamma part: set_delta_model(fit, model = 2) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
# combined: set_delta_model(fit, model = NA) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"simulate.sdmTMB S3 method producing matrix simulations fitted model. similar lme4::simulate.merMod() glmmTMB::simulate.glmmTMB(). can used DHARMa package among uses.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"","code":"# S3 method for sdmTMB simulate( object, nsim = 1L, seed = sample.int(1e+06, 1L), type = c(\"mle-eb\", \"mle-mvn\"), model = c(NA, 1, 2), re_form = NULL, mcmc_samples = NULL, silent = TRUE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"object sdmTMB model nsim Number response lists simulate. Defaults 1. seed Random number seed type parameters treated. \"mle-eb\": fixed effects maximum likelihood (MLE) estimates random effects empirical Bayes (EB) estimates. \"mle-mvn\": fixed effects MLEs random effects taken single approximate sample. latter option suggested approach simulations used goodness fit testing (e.g., DHARMa package). model delta/hurdle model, model simulate ? NA = combined, 1 = first model, 2 = second mdoel. re_form NULL specify simulation conditional fitted random effects (simulates observation error). ~0 NA simulate new random affects (smoothers, internally random effects, simulated new). mcmc_samples optional matrix MCMC samples. See extract_mcmc() sdmTMBextra package. silent Logical. Silent? ... 
Extra arguments (used)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"Returns matrix; number columns nsim.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"","code":"# start with some data simulated from scratch: set.seed(1) predictor_dat <- data.frame(X = runif(300), Y = runif(300), a1 = rnorm(300)) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, mesh = mesh, family = poisson(), range = 0.5, sigma_O = 0.2, seed = 42, B = c(0.2, -0.4) # B0 = intercept, B1 = a1 slope ) fit <- sdmTMB(observed ~ 1 + a1, data = dat, family = poisson(), mesh = mesh) # simulate from the model: s1 <- simulate(fit, nsim = 300) dim(s1) #> [1] 300 300 # test whether fitted models are consistent with the observed number of zeros: sum(s1 == 0)/length(s1) #> [1] 0.3297667 sum(dat$observed == 0) / length(dat$observed) #> [1] 0.3466667 # simulate with random effects sampled from their approximate posterior s2 <- simulate(fit, nsim = 1, params = \"mle-mvn\") # these may be useful in conjunction with DHARMa simulation-based residuals # simulate with new random fields: s3 <- simulate(fit, nsim = 1, re_form = ~ 0)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":null,"dir":"Reference","previous_headings":"","what":"Example fish survey data — pcod","title":"Example fish survey data — pcod","text":"Various fish survey datasets.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Example fish survey data — pcod","text":"","code":"pcod pcod_2011 pcod_mesh_2011 qcs_grid dogfish yelloweye hbll_s_grid wcvi_grid"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Example fish survey data — pcod","text":"pcod: Trawl survey data Pacific Cod Queen Charlotte Sound. data frame. pcod_2011: version pcod years 2011 (smaller speed). data frame. pcod_mesh_2011: mesh pre-built pcod_2011 examples. list class sdmTMBmesh. qcs_grid 2x2km prediction grid Queen Charlotte Sound. data frame. dogfish: Trawl survey data Pacific Spiny Dogfish West Coast Vancouver Island. data frame. yelloweye: Survey data Yelloweye Rockfish Hard Bottom Longline Survey (South) West Coast Vancouver Island. hbll_s_grid: survey domain grid go yelloweye. data frame. wcvi_grid: survey domain grid go dogfish. 
data frame.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"Turn sdmTMB model output tidy data frame","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"","code":"# S3 method for sdmTMB tidy( x, effects = c(\"fixed\", \"ran_pars\", \"ran_vals\"), model = 1, conf.int = FALSE, conf.level = 0.95, exponentiate = FALSE, silent = FALSE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"x Output sdmTMB(). effects character value. One \"fixed\" ('fixed' main-effect parameters), \"ran_pars\" (standard deviations, spatial range, random effect dispersion-related terms), \"ran_vals\" (individual random intercepts, included; behaves like ranef()). model model tidy delta model (1 2). conf.int Include confidence interval? conf.level Confidence level CI. exponentiate Whether exponentiate fixed-effect coefficient estimates confidence intervals. silent Omit messages? ... Extra arguments (used).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"data frame","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"Follows conventions broom broom.mixed packages. Currently, effects = \"ran_pars\" also includes dispersion-related terms (e.g., phi), actually associated random effects. Standard errors spatial variance terms fit log space (e.g., variance terms, range, parameters associated observation error) omitted avoid confusion. 
Confidence intervals still available.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2, raw = TRUE), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie() ) tidy(fit) #> # A tibble: 3 × 3 #> term estimate std.error #> #> 1 (Intercept) 3.65 0.281 #> 2 poly(depth_scaled, 2, raw = TRUE)1 -1.54 0.186 #> 3 poly(depth_scaled, 2, raw = TRUE)2 -1.11 0.101 tidy(fit, conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 3.65 0.281 3.10 4.20 #> 2 poly(depth_scaled, 2, raw = TRUE)1 -1.54 0.186 -1.90 -1.17 #> 3 poly(depth_scaled, 2, raw = TRUE)2 -1.11 0.101 -1.31 -0.913 tidy(fit, \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 19.1 14.0 4.58 80.0 #> 2 phi 14.0 0.677 12.8 15.4 #> 3 sigma_O 2.14 0.941 0.906 5.07 #> 4 tweedie_p 1.58 0.0153 1.55 1.61 pcod_2011$fyear <- as.factor(pcod_2011$year) fit <- sdmTMB(density ~ poly(depth_scaled, 2, raw = TRUE) + (1 | fyear), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie() ) tidy(fit, \"ran_vals\") #> # A tibble: 4 × 3 #> term estimate std.error #> #> 1 fyear_2011 0.0163 0.187 #> 2 fyear_2013 0.177 0.188 #> 3 fyear_2015 0.232 0.189 #> 4 fyear_2017 -0.431 0.205"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot sdmTMB models with the visreg package — visreg_delta","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"sdmTMB models fit regular (non-delta) families can passed visreg::visreg() visreg::visreg2d() directly. Examples shown . Delta models can use helper functions visreg_delta() visreg2d_delta() described .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"","code":"visreg_delta(object, ..., model = c(1, 2)) visreg2d_delta(object, ..., model = c(1, 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"object Fit sdmTMB() ... arguments passed visreg::visreg() visreg::visreg2d() model 1st 2nd delta model","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"plot visreg package. Optionally, data plotted invisibly plot = FALSE. 
useful want make plot .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"Note residuals currently randomized quantile residuals, deviance residuals usual GLMs visreg.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"","code":"if (require(\"ggplot2\", quietly = TRUE) && require(\"visreg\", quietly = TRUE)) { # \\donttest{ fit <- sdmTMB( density ~ s(depth_scaled), data = pcod_2011, spatial = \"off\", family = tweedie() ) visreg::visreg(fit, xvar = \"depth_scaled\") visreg::visreg(fit, xvar = \"depth_scaled\", scale = \"response\") v <- visreg::visreg(fit, xvar = \"depth_scaled\") head(v$fit) # now use ggplot2 etc. if desired # Delta model example: fit_dg <- sdmTMB( density ~ s(depth_scaled, year, k = 8), data = pcod_2011, mesh = pcod_mesh_2011, spatial = \"off\", family = delta_gamma() ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, gg = TRUE) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, gg = TRUE) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, scale = \"response\", gg = TRUE ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, scale = \"response\" ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, scale = \"response\", gg = TRUE, rug = FALSE ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 2, scale = \"response\" ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 1, scale = \"response\", plot.type = \"persp\" ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 2, scale = \"response\", plot.type = \"gg\" ) # } } #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component."},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-development-version","dir":"Changelog","previous_headings":"","what":"sdmTMB (development version)","title":"sdmTMB (development version)","text":"Pass several arguments DHARMa::plotQQunif() set uniformity p-value argumemt TRUE default. Add silent option simulate.sdmTMB(). Setting FALSE allows monitoring simulations larger models. Fix bug est_non_rf1 est_non_rf2 columns following conditions true: predicting new data using delta model including IID random intercepts time-varying coefficients See #342. Thanks @tom-peatman issue report. Fix delta-gamma binomial link printing type = 'poisson-link' #340 Add suggestion use optimized BLAS library README. Add warning ’s detected problems reloading (e.g., readRDS()) fitted model. Simultaneously revert approach reloaded models reattached. Check random effects haven’t reverted zero readRDS() fitted model. E.g. get_pars(fit)$omega_s spatial random fields enabled. 
Move log_ratio_mix parameter 2nd phase starting value -1 instead 0 improve convergence. Fix bugs nbinom1() nbinom2_mix() simulation. Allow profile argument control list take character vector parameters. Add theoretical quantile residuals generalized gamma distribution. Thanks J.C. Dunic. #333 Add poisson-link delta-mixture lognormal. Fix bug simulation Poisson-link delta models. Simplify internal treatment extra time slices (extra_time). #329 less bug prone also fixes recently created bug. #335","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-050","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.5.0","title":"sdmTMB 0.5.0","text":"CRAN release: 2024-04-03 Overhaul residuals vignette (‘article’) https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html including brief intros randomized quantile residuals, simulation-based residuals, ‘one-sample’ residuals, uniform vs. Gaussian residuals. Add check prediction coordinates appear outside fitted coordinates. #285 Fix memory issue Tweedie family large datasets. #302 Add experimental option return standard normal residuals dharma_residuals(). Make simulate.sdmTMB() include extra_time elements. Improved re-initialization saved fitted model objects new sessions. Fix important bug simulate.sdmTMB() method delta families positive linear predictor getting simulated observations present fitted data. Add new \"mle-mvn\" type residuals.sdmTMB() make default. fast option evaluating goodness fit better previous default. See details section ?residuals.sdmTMB details. previous default now called \"mvn-eb\" recommended. Bring dharma_residuals() back sdmTMBextra sdmTMB. Add new option type argument (\"mle-mvn\") make simulation residuals consistent expected distribution. See new documentation ?residuals.sdmTMB. examples ?dharma_residuals illustrate suggested use. Fix bug sanity() gradient checks missing abs() large negative gradients weren’t getting caught. #324 Return offset vector fitted object element. Ensure extra time rows data data element fitted object include extra time slices. Add experimental residuals option “mle-mvn” single approximate posterior sample random effects drawn combined MLE fixed effects produce residuals. may become default option. Add generalized gamma distribution (thanks J.T. Thorson additional work J.C. Dunic.) See gengamma(). distribution still testing phase recommended applied use yet. #286 Detect possible issue factor(time) formula column name used time extra_time specified. #320 Improve sanity() check output NA fixed effect standard errors. Set intern = FALSE within index bias correction, seems considerably faster testing models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-043","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.3","title":"sdmTMB 0.4.3","text":"CRAN release: 2024-02-29 Fix bug likely introduced July 2023 caused issues extra_time specified. important bug models fit extra_time date (using GitHub version) v0.4.2.9004 (2024-02-24) checked current version sdmTMB (v0.4.2.9005 greater). CRAN, affected v0.4.0 (2023-10-20) v0.4.2. Details: essence bug extra_time works padding data fake row data every extra time element (using first row data template). supposed omitted likelihood impact model fitting beyond spacing time-series processes appropriately setting internal structures forecasting. Unfortunately, bug introduced caused fake data (1 per extra time element) included likelihood. 
Issue error time column NAs. #298 #299 Fix bug get_cog(..., format = \"wide\") time column hardcoded \"year\" accident. Poisson-link delta models now use type argument delta_gamma() delta_lognormal(). delta_poisson_link_gamma() delta_poisson_link_lognormal() deprecated. #290 Delta families can now pass links different default \"logit\" \"log\". #290","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-042","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.2","title":"sdmTMB 0.4.2","text":"CRAN release: 2024-01-18 Force rebuild CRAN binaries fix issue breaking Matrix ABI change causing NaN gradient errors. #288 #287 Fix crash sdmTMB(..., do_index = TRUE) extra_time supplied along predict_args = list(newdata = ...) lacked extra_time elements. Allow get_index() work missing time elements. Add ability pass custom randomized quantile function qres_func residuals.sdmTMB(). Add check factor random intercept columns newdata avoid crash. #278 #280 Improve warnings/errors around use do_index = TRUE get_index() newdata = NULL. #276 Fix prediction offset newdata NULL offset specified. #274 Fix prediction failure offset nsim provided model includes extra_time. #273","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-041","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.1","title":"sdmTMB 0.4.1","text":"CRAN release: 2023-11-03 Fix memory issues detected CRAN ‘Additional issues’ clang-UBSAN, valgrind. Fix bug predicting new data specified offset extra_time. #270 Add warning around non-factor handling spatial_varying formula. #269 Add experimental set_delta_model() plotting delta models ggeffects::ggpredict() (GitHub version next CRAN version).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-040","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.0","title":"sdmTMB 0.4.0","text":"CRAN release: 2023-10-20 Move add_barrier_mesh() sdmTMBextra avoid final INLA dependency. https://github.com/pbs-assess/sdmTMBextra Switch using new fmesher package mesh/SPDE calculations. INLA longer dependency. Switch diagonal.penalty = FALSE mgcv::smoothCon(). changes scale linear component smoother, result model. https://github.com/glmmTMB/glmmTMB/issues/928#issuecomment-1642862066 Implement cross validation delta models #239 Remove ELPD cross validation output. Use sum_loglik instead. #235 Turn Newton optimization default. #182 print() now checks sanity() issues warning may issues. #176 Poisson-link delta models censored likelihood distributions made considerably robust. #186 Standard errors now available SD parameters etc. tidy() #240 Fix bug print()/tidy() delta-model positive model component sigma_E. recently introduce bug causing sigma_E 2nd model reported 1st model component sigma_E. Add new anisotropy plotting function. Add anisotropic range printing. #149 @jdunic","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-030","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.3.0","title":"sdmTMB 0.3.0","text":"CRAN release: 2023-01-28 Create sdmTMBextra package remove rstan/tmbstan helpers, causing memory sanitizer errors CRAN. https://github.com/pbs-assess/sdmTMBextra following functions affected: predict.sdmTMB() now takes mcmc_samples, output sdmTMBextra::extract_mcmc(). simulate.sdmTMB() now takes mcmc_samples, output sdmTMBextra::extract_mcmc(). residuals.sdmTMB() now takes mcmc_samples, output sdmTMBextra::predict_mle_mcmc(). 
affects residuals(..., type = \"mle-mcmc\"). Move dharma_residuals() sdmTMBextra reduce heavy dependencies. See examples Bayesian residuals vignettes help files functions within sdmTMBextra.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-022","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.2","title":"sdmTMB 0.2.2","text":"Various fixes pass CRAN checks. #158 Fix memory issue highlighted Additional issues CRAN checks. #158 ‘offset’ argument can now character value indicating column name. preferred way using offset parallel cross validation. #165 Fix parallel cross validation using offset vector. #165 Add leave-future-cross validation functionality. #156 Example data qcs_grid longer replicated year save package space. #158 Add message tidy(fit, \"ran_pars\") SEs NA. Add anisotropy print() #157 Fix predict(..., type = \"response\", se_fit = TRUE), involves issuing warning sticking link space. #140","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-021","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.1","title":"sdmTMB 0.2.1","text":"CRAN release: 2023-01-10 Fixes resubmission CRAN.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-020","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.0","title":"sdmTMB 0.2.0","text":"Initial submission CRAN.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-014","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.4","title":"sdmTMB 0.1.4","text":"Relax range parameter sanity() check 1x 1.5x greatest distance data. Add Pearson residuals several families. residuals(fit, type = \"pearson\") Useful checking overdispersion N > 1 binomial Poisson families, among uses. See overdisp_fun() function : https://bbolker.github.io/mixedmodels-misc/glmmFAQ.html#testing--overdispersioncomputing-overdispersion-factor Fix bug using residuals() simulate() binomial families specified via cbind() weights = N. binomial sample size wasn’t passed typically resulting Inf/-Inf. Add mixture families: gamma_mix(), lognormal_mix() associated delta/hurdle families: delta_gamma_mix(), delta_lognormal_mix(). families feature mixture two distributions different means shared variance parameters. Add delta_beta() family.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-013","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.3","title":"sdmTMB 0.1.3","text":"Tweak sanity() checking standard error size. Export previously experimental plot_anisotropy() function. old function now plot_anisotropy2(). Allow passing offset data predict.sdmTMB() via offset argument.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-012","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.2","title":"sdmTMB 0.1.2","text":"Switch effects = 'ran_vals' random intercept values tidy.sdmTMB() match broom.mixed package. Make tidy.sdmTMB() return tibble tibble package installed. Note affect old code since drop = FALSE default tibbles drop = TRUE default data frames (.e., tibbles always return data frame subsetted). Fix longstanding issue predicting newdata mgcv’s t2(). Previously disabled issues. now works expected. Add knots argument sdmTMB(), passed mgcv. 
common use specify end points cyclical spline (e.g., s(x, bs = 'cc', k = 4), knots = list(x = c(1, 3, 5, 7))) data don’t extend fully boundaries match .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-011","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.1","title":"sdmTMB 0.1.1","text":"Preparing release CRAN. Add time-varying AR1 option (originally always random walk). See time_varying_type argument ?sdmTMB. Allow prediction newdata missing time elements. #130 Add check offset() (work sdmTMB, use offset argument instead). #131 Add check random slopes (sdmTMB currently random intercepts, although slopes can vary spatially). #131","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-010","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.0","title":"sdmTMB 0.1.0","text":"ADREPORT several parameters natural space. https://github.com/pbs-assess/sdmTMB/discussions/113 Improve robustness model print() esoteric mgcv smoothers. Let sims_var work multiple spatially varying slopes (zeta_s); return output named list coefficients. #107 Add threshold_coefs sdmTMB_simulate(). Don’t make fake mesh non-spatial model (faster).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00269001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.26.9001","title":"sdmTMB 0.0.26.9001","text":"Add vignettes visreg, ggeffects, delta families (thanks J. Indivero!) #83 #87 #89 Forecasting presence-vignettes merged soon. Add support emmeans package. See ?emmeans.sdmTMB examples. Add support effects package. ggeffects::ggeffect() function can used make fast marginal effects plots. ggeffects::ggpredict() works custom fork ggeffects. pull request made shortly. #101 Add vcov(), fixef(), df.residual(), formula(), terms(), model.frame() methods. Add support \"cloglog\" link. Code adapted glmmTMB robust likelihood implementation. delta models, default share anisotropy parameters VAST. Separate anisotropy (old behavior) can estimated control = sdmTMBcontrol(map = list(ln_H_input = factor(c(1, 2, 3, 4)))) Add experimental do_index, predict_args, index_args sdmTMB(). can used perform prediction index calculation time fitting. large datasets meshes can save time compared fitting, predicting, index calculation 3 separate steps since TMB AD object doesn’t rebuilt. somewhat slow initial fitting. Remove max_gradient bad_eig get_index() output. Use unique locations prediction huge speedups large newdata gridded data. Fix bug rare cases get_index() return gibberish small values. Add bayesian argument, TRUE adds Jacobian adjustments non-linear transformed parameters. TRUE model passed tmbstan, FALSE otherwise. #95 Add experimental -yet-exported sdmTMB:::plot_anisotropy2(). Add many anisotropy, delta model, index calculation unit tests.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00249001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.24.9001","title":"sdmTMB 0.0.24.9001","text":"Enable random walk random field TMB simulation sdmTMB_simulate(). Add check irregular time AR1 random walk processes. Fix bugs introduced delta model code (offsets extra_time threshold model prediction). 
Fix bug sanity() message small random field SDs.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00249000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.24.9000","title":"sdmTMB 0.0.24.9000","text":"Add support ‘delta’ (‘hurdle’) models. See examples documentation ?sdmTMB. resulted substantial restructuring internal model code. default model components (e.g., binomial & Gamma) share formula, spatial, spatiotemporal structure, can separated supplying argument values lists first element corresponds first model second element corresponds second model (limitations described ?sdmTMB documentation ‘Details’). Add support multiple spatially varying coefficients (used limited single variable). Add compatibility ‘visreg’ package visualizing conditional effects parameters. See ?visreg_delta examples. Add MCMC residual type residuals.sdmTMB(). ‘better’ residuals slower calculate. See documentation ‘Details’ ?residuals.sdmTMB. Make offset argument sdmTMB(). Using reserved word offset formula now deprecated. Add sanity() function perform basic sanity checks model fits. Make sdmTMB() model object compatible update() method. Remove several deprecated arguments. Overhaul examples ?sdmTMB. Use faster “low-rank sparse hessian bias-correction” TMB bias correction. Add parallel processing support. See parallel argument sdmTMBcontrol. default, grabs value sdmTMB.cores option. E.g. options(sdmTMB.cores = 4). currently enabled Mac/Linux. Using many cores can much slower 1 core. Use ‘cli’ package cli_abort()/cli_warn()/cli_inform() stop()/warning()/message(). Add many unit tests.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00239000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.23.9000","title":"sdmTMB 0.0.23.9000","text":"package version number used internal testing ‘delta’ branch several people.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00229001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.22.9001","title":"sdmTMB 0.0.22.9001","text":"Switch TMBad library ~3-fold speedup(!)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00229000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.22.9000","title":"sdmTMB 0.0.22.9000","text":"Fix bug predictions poly(..., raw = FALSE) newdata. #77","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00219009","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.21.9009","title":"sdmTMB 0.0.21.9009","text":"Add experimental sdmTMB_stacking() ensemble model stacking weights. Add fake mesh random fields . #59 Make predict(..., newdata = NULL) also use last.par.best instead last.par match newdata = df. Fix bug MVN fixed-effect prior indexing sims n_sims arguments deprecated standardized nsim match simulate() S3 method. Bias correction get_index() get_cog() now selective just applied necessary derived parameters. INLA projection matrix ‘’ now shared across spatial spatiotemporal fields. Add add_utm_columns() ease adding UTM columns.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00209001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.20.9001","title":"sdmTMB 0.0.20.9001","text":"Add dharma_residuals(). 
Fix bug simulate.sdmTMB() residuals.sdmTMB() binomial family.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00209000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.20.9000","title":"sdmTMB 0.0.20.9000","text":"Smoothers now appear print() output. format roughly match brms. main-effect component (e.g., sdepth s(depth)) represents linear component random effect (e.g., sds(depth)) component output corresponds standard deviation penalized weights. Add censored_poisson(link = 'log') family; implemented @joenomiddlename fields sdmTMB() now deprecated replaced spatiotemporal. include_spatial sdmTMB() now deprecated replaced spatial. spatial_only sdmTMB() now deprecated replaced spatiotemporal. E.g. spatial_only = TRUE now spatiotemporal = '' leaving time = NULL. spde sdmTMB() now deprecated replaced mesh. sdmTMB_simulate() new likely replace sdmTMB_sim() eventually. sdmTMB_simulate() set take formula data frame easier use want different spatial observations (covariates) time slice. can also take fitted model modify parts simulate. Finally, function uses TMB simulation much faster flexible can simulate (e.g., anisotropy) previous version. spatial_trend now spatial_varying accepts one-sided formula single predictor coefficient varying space random field. Note may want include fixed effect variable improve interpretability. (scaled) time column used, represent local-time-trend model . Tweedie power (p) parameter now print() tidy() output. thetaf now tweedie_p sdmTMB_sim().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199003","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9003","title":"sdmTMB 0.0.19.9003","text":"Fix bug affecting prediction se_fit = TRUE breakpoint models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199002","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9002","title":"sdmTMB 0.0.19.9002","text":"Simulation parameter covariance matrix works random effects turned . #57","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9000","title":"sdmTMB 0.0.19.9000","text":"Smoothers s() now penalized smoothers: determine degree wiggliness (mgcv) longer necessary choose appropriate k value priori. Models fit previous versions sdmTMB s(x, k = ...) match models specified way version >= 0.0.19 since basis functions now penalized. various mgcv::s() options supported t2() (ti() te()) supported.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00189001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.18.9001","title":"sdmTMB 0.0.18.9001","text":"Add ELPD (expected log predictive density) sdmTMB_cv() https://arxiv.org/abs/1507.04544 Fix bug evaluating ... sdmTMB_cv() called within function. #54","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00189000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.18.9000","title":"sdmTMB 0.0.18.9000","text":"Fix minor error PC Matern prior","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00179000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.17.9000","title":"sdmTMB 0.0.17.9000","text":"Add random walk option: fields = \"RW\". Depreciate ar1_fields argument. See new fields argument `sdmTMB(). 
Many packages moved ‘Imports’ ‘Suggests’","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00169000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.16.9000","title":"sdmTMB 0.0.16.9000","text":"Lower default nlminb() eval.max iter.max 1000 2000. Added profile option sdmTMBcontrol(). can dramatically improve model fitting speed many fixed effects. Note result likely slightly different TRUE vs. FALSE. Added simulation MVN precision matrix predict.sdmTMB(). See sims argument. Added gather_sims() spread_sims() extract parameter simulations joint precision matrix format matches tidybayes package. Added get_index_sims() population index calculated MVN simulation draws. Added extract_mcmc() extract MCMC samples model passed tmbstan. Added ability predict model fitted tmbstan. See tmbstan_model argument predict.sdmTMB(). Allowed separate random field Matern range parameters spatial spatiotemporal fields. E.g. sdmTMB(shared_range = FALSE) Bounded AR1 rho parameter -0.999 0.999 improve convergence; -1 1. Please post issue creates problems model. Added map, start, lower, upper options control model fitting. See sdmTMBcontrol(). Added priors parameters. See ?sdmTMB::priors priors argument sdmTMB(). PC priors available random fields. See ?pc_matern details . Moved many less-common arguments sdmTMB() sdmTMBcontrol(). Fix bug sdmTMB_cv() fitting testing data splits reversed. .e., small chunk fit; big chunk tested.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00159000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.15.9000","title":"sdmTMB 0.0.15.9000","text":"Added experimental penalized complexity (PC) prior used INLA. See arguments matern_prior_O matern_prior_E. Added back normalize argument sdmTMB() default FALSE. Setting TRUE can dramatically speed model fits (~4 times test models).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149003","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9003","title":"sdmTMB 0.0.14.9003","text":"Added vignette making pretty maps output","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9001","title":"sdmTMB 0.0.14.9001","text":"AR1 spatial-model Missing factor levels time Coordinate systems big","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9000","title":"sdmTMB 0.0.14.9000","text":"Add re_form_iid predict.sdmTMB(). Add map_rf option sdmTMB(). lets map (fix starting values zero) random fields produce classic GLM/GLMM.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00139000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.13.9000","title":"sdmTMB 0.0.13.9000","text":"Add IID random intercepts interface. E.g. ... 
+ (1 | g) #34","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00129000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.12.9000","title":"sdmTMB 0.0.12.9000","text":"Add epsilon_predictor argument sdmTMB() allow model spatiotemporal variance time.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00119000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.11.9000","title":"sdmTMB 0.0.11.9000","text":"Add penalties argument allow regularization.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00109001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.10.9001","title":"sdmTMB 0.0.10.9001","text":"Fix Student-t degrees freedom randomized quantile residuals","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00109000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.10.9000","title":"sdmTMB 0.0.10.9000","text":"Fixed parameter initialization inverse links #35 Switched Gamma ‘phi’ parameter representing shape instead CV match glm(), glmmTMB(), etc.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0099000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.9.9000","title":"sdmTMB 0.0.9.9000","text":"Switched density/abundance index calculation use link function opposed hardcoded log() get_generic() function can used grab things like standardized average values response across grid. used log_total raw TMB output now link_total users shouldn’t notice difference.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0089000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.8.9000","title":"sdmTMB 0.0.8.9000","text":"Overhauled simulation function. function now called sdmTMB_sim() uses INLA functions instead RandomFields functions simulating random fields. simulation function can now accommodate families links takes INLA mesh input.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0079001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.7.9001","title":"sdmTMB 0.0.7.9001","text":"Allow specifying degrees freedom Student-t family #29","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0079000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.7.9000","title":"sdmTMB 0.0.7.9000","text":"Added tidy() method (broom broom.mixed) return data frame parameter estimates. function can extract fixed effects random effect parameters (variances, AR1 correlation, spatial range). Added argument extra_time sdmTMB(). introduces additional time slices can predict want interpolate forecast. Internally, uses Eric Ward’s ‘weights hack’. also useful data unevenly spaced time want gaps evenly spaced random walk AR1 process (add missing years extra_time). make_spde() now replaced make_mesh() make_spde() soft deprecated. make_mesh() carries x y column names predict function line tidyverse style taking data frame first. make_mesh() can accept cutoff argument (INLA), likely better default way specify mesh since scales across regions better line literature INLA. make_mesh() can use binary search algorithm find cutoff best matches desired number knots (thanks Kelli Johnson idea). Barrier meshes now possible. See add_barrier_mesh() example. pkgdown website now gets auto generated GitHub actions. start model description vignette. 
much work progress.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069009","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9009","title":"sdmTMB 0.0.6.9009","text":"Fixed bug dlnorm","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069005","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9005","title":"sdmTMB 0.0.6.9005","text":"Fixed bug predictions standard errors one(?) parameter (breakpoint parameter) passed initial instead MLE value.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069004","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9004","title":"sdmTMB 0.0.6.9004","text":"Fixed bug predictions new data models break points Overhauled cross validation function. function now: uses Eric’s weights hack can also used forecasting initializes subsequent folds MLE first fold considerable speed increases works parallel future plan initialized; see examples Added threshold parameters print method Added forecasting example weights hack Fixed bug linear break point models","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069002","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9002","title":"sdmTMB 0.0.6.9002","text":"Fixed GAM predictions 0s new data. Add linear logistic threshold models. #17","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0059000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.5.9000","title":"sdmTMB 0.0.5.9000","text":"Added parsing mgcv formulas splines. #16 Added ability predict standard errors population level. helps making marginal-effect plots. #15 Added optimization options aid convergence. Also added run_extra_optimization() run already fit models. Default extra optimization. Added binomial likelihood cross validation. Git hash ee3f3ba. Started keeping track news NEWS.md.","code":""}] +[{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"other-vignettes-available","dir":"Articles","previous_headings":"","what":"Other vignettes available","title":"sdmTMB model description","text":"vignette viewed CRAN, note many vignettes describing use sdmTMB available documentation site Articles.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"notation-conventions","dir":"Articles","previous_headings":"","what":"Notation conventions","title":"sdmTMB model description","text":"appendix uses following notation conventions, generally follow guidance Edwards & Auger-Méthé (2019): Greek symbols parameters, Latin/Roman alphabet data (except \\(\\boldsymbol{Q}\\) \\(\\boldsymbol{H}\\), used convention), bold symbols vectors matrices (e.g., \\(\\boldsymbol{\\omega}\\) vector \\(\\omega_{\\boldsymbol{s}}\\) value \\(\\boldsymbol{\\omega}\\) point space \\(\\boldsymbol{s}\\)), \\(\\phi\\) distribution dispersion parameters consistency code, \\(\\mathbb{E}[y]\\) define expected value (mean) variable \\(y\\), \\(\\mathrm{Var}[y]\\) define expected variance variable \\(y\\), \\(^*\\) superscript represents interpolated projected values opposed values knot locations (e.g., \\(\\boldsymbol{\\omega}\\) vs. \\(\\boldsymbol{\\omega}^*\\)), possible, notation chosen match VAST (Thorson 2019) maintain consistency (e.g., \\(\\boldsymbol{\\omega}\\) spatial fields \\(\\boldsymbol{\\epsilon}_t\\) spatiotemporal fields). 
include tables major indices (Table 1) symbols (Table 2). Table 1: Subscript notation Table 2: Symbol notation, code representation (model output model template code), descriptions.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"sdmtmb-model-structure","dir":"Articles","previous_headings":"","what":"sdmTMB model structure","title":"sdmTMB model description","text":"complete sdmTMB model can written \\[ \\begin{aligned} \\mathbb{E}[y_{\\boldsymbol{s},t}] &= \\mu_{\\boldsymbol{s},t},\\\\ \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta} + O_{\\boldsymbol{s},t} + \\alpha_g + \\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t} \\boldsymbol{\\gamma_t} + \\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s},t} \\zeta_{\\boldsymbol{s}} + \\omega_{\\boldsymbol{s}} + \\epsilon_{\\boldsymbol{s},t} \\right), \\end{aligned} \\] \\(y_{\\boldsymbol{s},t}\\) represents response data point \\(\\boldsymbol{s}\\) time \\(t\\); \\(\\mu\\) represents mean; \\(f\\) represents link function (e.g., log logit) \\(f^{-1}\\) represents inverse; \\(\\boldsymbol{X}^{\\mathrm{main}}\\), \\(\\boldsymbol{X}^{\\mathrm{tvc}}\\), \\(\\boldsymbol{X}^{\\mathrm{svc}}\\) represent design matrices (superscript identifiers ‘main’ = main effects, ‘tvc’ = time varying coefficients, ‘svc’ = spatially varying coefficients); \\(\\boldsymbol{\\beta}\\) represents vector fixed-effect coefficients; \\(O_{\\boldsymbol{s},t}\\) represents offset: covariate (usually log transformed) coefficient fixed one; \\(\\alpha_{g}\\) represents random intercepts group \\(g\\), \\(\\alpha_{g}\\sim \\mathrm{N}(0,\\sigma^2_\\alpha)\\); \\(\\gamma_{t}\\) represents time-varying coefficients (random walk), \\(\\gamma_{t} \\sim \\mathrm{N}(\\gamma_{t-1},\\sigma^2_\\gamma)\\); \\(\\zeta_{\\boldsymbol{s}}\\) represents spatially varying coefficients (random field), \\(\\zeta_{\\boldsymbol{s}} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_\\zeta)\\); \\(\\omega_{\\boldsymbol{s}}\\) represents spatial component (random field), \\(\\omega_{\\boldsymbol{s}} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_\\omega)\\); \\(\\epsilon_{\\boldsymbol{s},t}\\) represents spatiotemporal component (random field), \\(\\epsilon_{\\boldsymbol{s},t} \\sim \\mathrm{MVN}(\\boldsymbol{0},\\boldsymbol{\\Sigma}_{\\epsilon})\\). single sdmTMB model rarely, ever, contain components. Next, split model describe various parts detail using ‘\\(\\ldots\\)’ represent optional components.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"main-effects","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Main effects","title":"sdmTMB model description","text":"\\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta} \\ldots \\right) \\end{aligned} \\] Within sdmTMB(), \\(\\boldsymbol{X}^{\\mathrm{main}}_{\\boldsymbol{s},t} \\boldsymbol{\\beta}\\) defined formula argument represents main-effect model matrix corresponding vector coefficients. 
main effect formula can contain optional penalized smoothers non-linear functions defined .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"smoothers","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Smoothers","title":"sdmTMB model description","text":"Smoothers sdmTMB implemented formula syntax familiar mgcv (Wood 2017) users fitting GAMs (generalized additive models). Smooths implemented formula using + s(x), implements smooth mgcv::s(). Within smooths, syntax commonly used mgcv::s() can applied, e.g. 2-dimensional smooths may constructed + s(x, y); smooths can specific various factor levels, + s(x, = group); smooths can vary according continuous variable, + s(x, = x2); basis function dimensions may specified, e.g. + s(x, k = 4) (see ?mgcv::choose.k); various types splines may constructed cyclic splines model seasonality, e.g. + s(month, bs = \"cc\", k = 12). mgcv can fit unpenalized (e.g., B-splines) penalized splines (P-splines), sdmTMB implements penalized splines. penalized splines constructed sdmTMB using function mgcv::smooth2random(), transforms splines random effects (associated design matrices) estimable mixed-effects modelling framework. approach implemented R packages gamm4 (Wood & Scheipl 2020) brms (Bürkner 2017).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"linear-break-point-threshold-models","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Linear break-point threshold models","title":"sdmTMB model description","text":"linear break-point “hockey stick” model can used describe threshold asymptotic responses. function consists two pieces, \\(x < b_{1}\\), \\(s(x) = x \\cdot b_{0}\\), \\(x > b_{1}\\), \\(s(x) = b_{1} \\cdot b_{0}\\). cases, \\(b_{0}\\) represents slope function threshold, product \\(b_{1} \\cdot b_{0}\\) represents value asymptote. constraints placed parameters \\(b_{0}\\) \\(b_{1}\\). models can fit including + breakpt(x) model formula, x covariate. formula can contain single break-point covariate.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"logistic-threshold-models","dir":"Articles","previous_headings":"sdmTMB model structure > Main effects","what":"Logistic threshold models","title":"sdmTMB model description","text":"Models logistic threshold relationships predictor response can fit form \\[ s(x)=\\tau + \\psi\\ { \\left[ 1+{ e }^{ -\\ln \\left(19\\right) \\cdot \\left( x-s50 \\right) / \\left(s95 - s50 \\right) } \\right] }^{-1}, \\] \\(s\\) represents logistic function, \\(\\psi\\) scaling parameter (controlling height y-axis response; unconstrained), \\(\\tau\\) intercept, \\(s50\\) parameter controlling point function reaches 50% maximum (\\(\\psi\\)), \\(s95\\) parameter controlling point function reaches 95% maximum. parameter \\(s50\\) unconstrained \\(s95\\) constrained larger \\(s50\\). models can fit including + logistic(x) model formula, x covariate. formula can contain single logistic covariate.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatial-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatial random fields","title":"sdmTMB model description","text":"Spatial random fields, \\(\\omega_{\\boldsymbol{s}}\\), included spatial = '' (TRUE) omitted spatial = '' (FALSE). 
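The smoother, break-point, and logistic terms above are all written into the model formula; a few illustrative sketches (covariate names such as depth_scaled, month, and fyear are placeholders):

density ~ s(depth_scaled)               # penalized smoother (mgcv syntax)
density ~ s(depth_scaled, by = fyear)   # separate smooths by factor level
density ~ s(month, bs = "cc", k = 12)   # cyclic spline, e.g. for seasonality
density ~ breakpt(depth_scaled)         # linear break-point ("hockey stick"); one per formula
density ~ logistic(depth_scaled)        # logistic threshold; one per formula
# Spatial random fields are toggled with spatial = "on" or spatial = "off" in sdmTMB().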
\\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\omega_{\\boldsymbol{s}} + \\ldots \\right),\\\\ \\boldsymbol{\\omega} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_\\omega \\right),\\\\ \\end{aligned} \\] marginal standard deviation \\(\\boldsymbol{\\omega}\\) indicated Spatial SD printed model output sigma_O output sdmTMB::tidy(fit, \"ran_pars\"). ‘O’ ‘omega’ (\\(\\omega\\)). Internally, random fields follow Gaussian Markov random field (GMRF) \\[ \\boldsymbol{\\omega} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\omega^2 \\boldsymbol{Q}^{-1}_\\omega\\right), \\] \\(\\boldsymbol{Q}_\\omega\\) sparse precision matrix \\(\\sigma_\\omega^2\\) marginal variance.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatiotemporal random fields","title":"sdmTMB model description","text":"Spatiotemporal random fields included default multiple time elements (time argument NULL) can set IID (independent identically distributed, 'iid'; default), AR(1) ('ar1'), random walk ('rw'), ('') via spatiotemporal argument. text values case insensitive. Spatiotemporal random fields represented \\(\\boldsymbol{\\epsilon}_t\\) within sdmTMB. chosen match representation VAST (Thorson 2019). marginal standard deviation \\(\\boldsymbol{\\epsilon}_t\\) indicated Spatiotemporal SD printed model output sigma_E output sdmTMB::tidy(fit, \"ran_pars\"). ‘E’ ‘epsilon’ (\\(\\epsilon\\)).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"iid-spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"IID spatiotemporal random fields","title":"sdmTMB model description","text":"IID spatiotemporal random fields (spatiotemporal = 'iid') can represented \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\epsilon_{\\boldsymbol{s},t} + \\ldots \\right),\\\\ \\boldsymbol{\\epsilon_{t}} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right). \\end{aligned} \\] \\(\\epsilon_{\\boldsymbol{s},t}\\) represent random field deviations point \\(\\boldsymbol{s}\\) time \\(t\\). random fields assumed independent across time steps. Similarly spatial random fields, spatiotemporal random fields (including versions described ) parameterized internally sparse precision matrix (\\(\\boldsymbol{Q}_\\epsilon\\)) \\[ \\boldsymbol{\\epsilon_{t}} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\epsilon^2 \\boldsymbol{Q}^{-1}_\\epsilon\\right). \\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"ar1-spatiotemporal-random-fields","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"AR(1) spatiotemporal random fields","title":"sdmTMB model description","text":"First-order auto regressive, AR(1), spatiotemporal random fields (spatiotemporal = 'ar1') add parameter defining correlation random field deviations one time step next. 
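A sketch of choosing the spatiotemporal field structure and reading the field standard deviations back out of a fit, again assuming the pcod data and mesh from above:

fit_st <- sdmTMB(
  density ~ 1,
  data = pcod, mesh = mesh,
  family = tweedie(link = "log"),
  time = "year",
  spatial = "on",
  spatiotemporal = "iid" # or "ar1", "rw", "off" (case insensitive)
)
tidy(fit_st, "ran_pars") # sigma_O = spatial SD, sigma_E = spatiotemporal SD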
defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\delta_{\\boldsymbol{s},t} \\ldots \\right),\\\\ \\boldsymbol{\\delta}_{t=1} &\\sim \\operatorname{MVNormal} (\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon}),\\\\ \\boldsymbol{\\delta}_{t>1} &= \\rho \\boldsymbol{\\delta}_{t-1} + \\sqrt{1 - \\rho^2} \\boldsymbol{\\epsilon_{t}}, \\: \\boldsymbol{\\epsilon_{t}} \\sim \\operatorname{MVNormal} \\left(\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right), \\end{aligned} \\] \\(\\rho\\) correlation subsequent spatiotemporal random fields. \\(\\rho \\boldsymbol{\\delta}_{t-1} + \\sqrt{1 - \\rho^2}\\) term scales spatiotemporal variance correlation represents steady-state marginal variance. correlation \\(\\rho\\) allows mean-reverting spatiotemporal fields, constrained \\(-1 < \\rho < 1\\). Internally, parameter estimated ar1_phi, unconstrained. parameter ar1_phi transformed \\(\\rho\\) \\(\\rho = 2 \\left( \\mathrm{logit}^{-1}(\\texttt{ar1\\_phi}) - 1 \\right)\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"random-walk-spatiotemporal-random-fields-rw","dir":"Articles","previous_headings":"sdmTMB model structure > Spatiotemporal random fields","what":"Random walk spatiotemporal random fields (RW)","title":"sdmTMB model description","text":"Random walk spatiotemporal random fields (spatiotemporal = 'rw') represent model difference spatiotemporal deviations one time step next IID. defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\delta_{\\boldsymbol{s},t} + \\ldots \\right),\\\\ \\boldsymbol{\\delta}_{t=1} &\\sim \\operatorname{MVNormal} (\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon}),\\\\ \\boldsymbol{\\delta}_{t>1} &= \\boldsymbol{\\delta}_{t-1} + \\boldsymbol{\\epsilon_{t-1}}, \\: \\boldsymbol{\\epsilon_{t-1}} \\sim \\operatorname{MVNormal} \\left(\\boldsymbol{0}, \\boldsymbol{\\Sigma}_{\\epsilon} \\right), \\end{aligned} \\] distribution spatiotemporal field initial time step AR(1) model, absence \\(\\rho\\) parameter allows spatiotemporal field non-stationary time. Note , contrast AR(1) parametrization, variance longer steady-state marginal variance.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"time-varying-regression-parameters","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Time-varying regression parameters","title":"sdmTMB model description","text":"Parameters can modelled time-varying according random walk first-order autoregressive, AR(1), process. time-series model defined time_varying_type. types: \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t} \\boldsymbol{\\gamma_{t}} + \\ldots \\right), \\end{aligned} \\] \\(\\boldsymbol{\\gamma_t}\\) optional vector time-varying regression parameters \\(\\boldsymbol{X}^{\\mathrm{tvc}}_{\\boldsymbol{s},t}\\) corresponding model matrix covariate values. defined via time_varying argument, assuming time argument also supplied column name. time_varying takes one-sided formula. ~ 1 implies time-varying intercept. time_varying_type = 'rw', first time step estimated independently: \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Uniform} \\left(-\\infty, \\infty \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\gamma_{t-1}, \\sigma^2_{\\gamma} \\right). \\end{aligned} \\] case, first time-step value given implicit uniform prior. 
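As a small worked check of the transform from the internal ar1_phi parameter to the field correlation: the −1 is applied outside the inverse logit so that ρ lies in (−1, 1), consistent with the stated constraint. The value of ar1_phi below is hypothetical:

ar1_phi <- 0.5                  # internal, unconstrained parameter
rho <- 2 * plogis(ar1_phi) - 1  # plogis() is the inverse logit; rho is in (-1, 1)
rho                             # approximately 0.245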
.e., variable appear fixed effect formula since initial value estimated part time-varying formula. formula time_varying = ~ 1 implicitly represents time-varying intercept (assuming time argument supplied) , case, intercept omitted main effects (formula ~ + 0 + ... formula ~ -1 + ...). time_varying_type = 'rw0', first time step estimated mean-zero prior: \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Normal} \\left(0, \\sigma^2_{\\gamma} \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\gamma_{t-1}, \\sigma^2_{\\gamma} \\right). \\end{aligned} \\] case, time-varying variable (including intercept) included main effects. suggest using formulation, leave 'rw' option legacy code works. time_varying_type = 'ar1': \\[ \\begin{aligned} \\gamma_{t=1} &\\sim \\operatorname{Normal} \\left(0, \\sigma^2_{\\gamma} \\right),\\\\ \\gamma_{t>1} &\\sim \\operatorname{Normal} \\left(\\rho_\\gamma\\gamma_{t-1}, \\sqrt{1 - \\rho_\\gamma^2} \\sigma^2_{\\gamma} \\right), \\end{aligned} \\] \\(\\rho_{\\gamma}\\) correlation subsequent time steps. first time step given mean-zero prior.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"spatially-varying-coefficients-svc","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Spatially varying coefficients (SVC)","title":"sdmTMB model description","text":"Spatially varying coefficient models defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s}, t} \\zeta_{\\boldsymbol{s}} + \\ldots \\right),\\\\ \\boldsymbol{\\zeta} &\\sim \\operatorname{MVNormal} \\left( \\boldsymbol{0}, \\boldsymbol{\\Sigma}_\\zeta \\right), \\end{aligned} \\] \\(\\boldsymbol{\\zeta}\\) random field representing spatially varying coefficient. Usually, \\(\\boldsymbol{X}^{\\mathrm{svc}}_{\\boldsymbol{s}, t}\\) represent prediction matrix constant spatially given time \\(t\\) defined one-sided formula supplied spatial_varying. example spatial_varying = ~ 0 + x, 0 omits intercept. random fields parameterized internally sparse precision matrix (\\(\\boldsymbol{Q}_\\zeta\\)) \\[ \\boldsymbol{\\zeta} \\sim \\mathrm{MVNormal}\\left(\\boldsymbol{0}, \\sigma_\\zeta^2 \\boldsymbol{Q}^{-1}_\\zeta\\right). \\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"iid-random-or-multi-level-intercepts","dir":"Articles","previous_headings":"sdmTMB model structure","what":"IID random or multi-level intercepts","title":"sdmTMB model description","text":"Multilevel/hierarchical intercepts defined \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + \\alpha_{g} + \\ldots \\right),\\\\ \\alpha_g &\\sim \\operatorname{Normal} \\left(0, \\sigma_\\alpha^2 \\right),\\\\ \\end{aligned} \\] \\(\\alpha_g\\) example optional “random” intercept—intercept mean zero varies level \\(g\\) constrained \\(\\sigma_\\alpha\\). defined formula argument via (1 | g) syntax lme4 glmmTMB. can multiple random intercepts, despite showing one . E.g., (1 | g1) + (1 | g2), case assumed independent uncorrelated .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"offset-terms","dir":"Articles","previous_headings":"sdmTMB model structure","what":"Offset terms","title":"sdmTMB model description","text":"Offset terms can included offset argument sdmTMB(). 
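Sketches of how the time-varying, spatially varying, and random-intercept components above (and the offset argument introduced here) are requested; vessel_id and log_effort are hypothetical columns, the others come from pcod:

# time-varying depth effect (random walk), as in the time-varying example later:
fit_tv <- sdmTMB(
  density ~ 0 + as.factor(year),
  time_varying = ~ 0 + depth_scaled + depth_scaled2,
  data = pcod, mesh = mesh, time = "year",
  family = tweedie(link = "log")
)
# spatially varying coefficient (zeta) on scaled depth:
fit_svc <- sdmTMB(
  density ~ depth_scaled,
  spatial_varying = ~ 0 + depth_scaled,
  data = pcod, mesh = mesh,
  family = tweedie(link = "log")
)
# IID random intercepts use lme4-style syntax, e.g. density ~ depth_scaled + (1 | vessel_id),
# and an offset is passed via the `offset` argument, e.g. offset = dat$log_effort.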
included linear predictor \\[ \\begin{aligned} \\mu_{\\boldsymbol{s},t} &= f^{-1} \\left( \\ldots + O_{\\boldsymbol{s},t} + \\ldots \\right), \\end{aligned} \\] \\(O_{\\boldsymbol{s},t}\\) offset term—log transformed variable without coefficient (assuming log link). offset included prediction. Therefore, offset represents measure effort, example, prediction one unit effort (log(1) = 0).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"observation-model-families","dir":"Articles","previous_headings":"","what":"Observation model families","title":"sdmTMB model description","text":"describe main observation families available sdmTMB comment parametrization, statistical properties, utility, code representation sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"binomial","dir":"Articles","previous_headings":"Observation model families","what":"Binomial","title":"sdmTMB model description","text":"\\[ \\operatorname{Binomial} \\left(N, \\mu \\right) \\] \\(N\\) size number trials, \\(\\mu\\) probability success trial. \\(N = 1\\), distribution becomes Bernoulli distribution. Internally, distribution parameterized robust version TMB, numerically stable probabilities approach 0 1. Following structure stats::glm(), lme4, glmmTMB, binomial family can specified one 4 ways: response may factor (model classifies first level versus others) response may binomial (0/1) response can matrix form cbind(success, failure), response may observed proportions, weights argument used specify Binomial size (\\(N\\)) parameter (probabilty ~ ..., weights = N). Code defined within TMB. Example: family = binomial(link = \"logit\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"beta","dir":"Articles","previous_headings":"Observation model families","what":"Beta","title":"sdmTMB model description","text":"\\[ \\operatorname{Beta} \\left(\\mu \\phi, (1 - \\mu) \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) precision parameter. parametrization follows Ferrari & Cribari-Neto (2004) betareg R package (Cribari-Neto & Zeileis 2010). variance \\(\\mu (1 - \\mu) / (\\phi + 1)\\). Code defined within TMB. Example: family = Beta(link = \"logit\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gamma","dir":"Articles","previous_headings":"Observation model families","what":"Gamma","title":"sdmTMB model description","text":"\\[ \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu}{\\phi} \\right) \\] \\(\\phi\\) represents Gamma shape \\(\\mu / \\phi\\) represents scale. mean \\(\\mu\\) variance \\(\\mu \\cdot \\phi^2\\). Code defined within TMB. Example: family = Gamma(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gaussian","dir":"Articles","previous_headings":"Observation model families","what":"Gaussian","title":"sdmTMB model description","text":"\\[ \\operatorname{Normal} \\left( \\mu, \\phi^2 \\right) \\] \\(\\mu\\) mean \\(\\phi\\) standard deviation. variance \\(\\phi^2\\). 
Example: family = Gaussian(link = \"identity\") Code defined within TMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"lognormal","dir":"Articles","previous_headings":"Observation model families","what":"Lognormal","title":"sdmTMB model description","text":"sdmTMB uses bias-corrected lognormal distribution \\(\\phi\\) represents standard deviation log-space: \\[ \\operatorname{Lognormal} \\left( \\log \\mu - \\frac{\\phi^2}{2}, \\phi^2 \\right). \\] bias correction, \\(\\mathbb{E}[y] = \\mu\\) \\(\\mathrm{Var}[\\log y] = \\phi^2\\). Code defined within sdmTMB based TMB dnorm() normal density. Example: family = lognormal(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-1-nb1","dir":"Articles","previous_headings":"Observation model families","what":"Negative Binomial 1 (NB1)","title":"sdmTMB model description","text":"\\[ \\operatorname{NB1} \\left( \\mu, \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) dispersion parameter. variance scales linearly mean \\(\\mathrm{Var}[y] = \\mu + \\mu / \\phi\\) (Hilbe 2011). Internally, distribution parameterized robust version TMB. Code defined within sdmTMB based NB2 borrowed glmmTMB. Example: family = nbinom1(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-2-nb2","dir":"Articles","previous_headings":"Observation model families","what":"Negative Binomial 2 (NB2)","title":"sdmTMB model description","text":"\\[ \\operatorname{NB2} \\left( \\mu, \\phi \\right) \\] \\(\\mu\\) mean \\(\\phi\\) dispersion parameter. variance scales quadratically mean \\(\\mathrm{Var}[y] = \\mu + \\mu^2 / \\phi\\) (Hilbe 2011). NB2 parametrization commonly seen ecology NB1. Internally, distribution parameterized robust version TMB. Code defined within TMB. Example: family = nbinom2(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"poisson","dir":"Articles","previous_headings":"Observation model families","what":"Poisson","title":"sdmTMB model description","text":"\\[ \\operatorname{Poisson} \\left( \\mu \\right) \\] \\(\\mu\\) represents mean \\(\\mathrm{Var}[y] = \\mu\\). Code defined within TMB. Example: family = poisson(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"student-t","dir":"Articles","previous_headings":"Observation model families","what":"Student-t","title":"sdmTMB model description","text":"\\[ \\operatorname{Student-t} \\left( \\mu, \\phi, \\nu \\right) \\] \\(\\nu\\), degrees freedom (df), user-supplied fixed parameter. Lower values \\(\\nu\\) result heavier tails compared Gaussian distribution. approximately df = 20, distribution becomes similar Gaussian. Student-t distribution low degrees freedom (e.g., \\(\\nu \\le 7\\)) can helpful modelling data otherwise suitable Gaussian needs approach robust outliers (e.g., Anderson et al. 2017). Code defined within sdmTMB based dt() distribution TMB. 
Example: family = student(link = \"log\", df = 7)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"tweedie","dir":"Articles","previous_headings":"Observation model families","what":"Tweedie","title":"sdmTMB model description","text":"\\[ \\operatorname{Tweedie} \\left(\\mu, p, \\phi \\right), \\: 1 < p < 2 \\] \\(\\mu\\) mean, \\(p\\) power parameter constrained 1 2, \\(\\phi\\) dispersion parameter. Tweedie distribution can helpful modelling data positive continuous also contain zeros. Internally, \\(p\\) transformed \\(\\mathrm{logit}^{-1} (\\texttt{thetaf}) + 1\\) constrain 1 2 estimated unconstrained variable. source code implemented cplm package (Zhang 2013) based Dunn & Smyth (2005). TMB version defined . Example: family = tweedie(link = \"log\")","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"gamma-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Gamma mixture","title":"sdmTMB model description","text":"2 component mixture extends Gamma distribution, \\[ (1 - p) \\cdot \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu_{1}}{\\phi} \\right) + p \\cdot \\operatorname{Gamma} \\left( \\phi, \\frac{\\mu_{2}}{\\phi} \\right), \\] \\(\\phi\\) represents Gamma shape, \\(\\mu_{1} / \\phi\\) represents scale first (smaller component) distribution, \\(\\mu_{2} / \\phi\\) represents scale second (larger component) distribution, \\(p\\) controls contribution component mixture (also interpreted probability larger events). mean \\((1-p) \\cdot \\mu_{1} + p \\cdot \\mu_{2}\\) variance \\((1-p) ^ 2 \\cdot \\mu_{1} \\cdot \\phi^2 + (p) ^ 2 \\cdot \\mu_{2} \\cdot \\phi^2\\). , mixture distributions, probability larger mean can obtained plogis(fit$model$par[[\"logit_p_mix\"]]) ratio larger mean smaller mean can obtained 1 + exp(fit$model$par[[\"log_ratio_mix\"]]). standard errors available TMB sdreport: fit$sd_report. wish fix probability large (.e., extreme) mean, can hard estimate, can use map list. E.g.: Example: family = gamma_mix(link = \"log\"). See also family = delta_gamma_mix() extension incorporating distribution delta models.","code":"sdmTMB(..., control = sdmTMBcontrol( start = list(logit_p_mix = qlogis(0.05)), # 5% probability of 'mu2' map = list(logit_p_mix = factor(NA)) # don't estimate ) )"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"lognormal-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Lognormal mixture","title":"sdmTMB model description","text":"2 component mixture extends lognormal distribution, \\[ (1 - p) \\cdot \\operatorname{Lognormal} \\left( \\log \\mu_{1} - \\frac{\\phi^2}{2}, \\phi^2 \\right) + p \\cdot \\operatorname{Lognormal} \\left( \\log \\mu_{2} - \\frac{\\phi^2}{2}, \\phi^2 \\right). \\] bias correction, \\(\\mathbb{E}[y] = (1-p) \\cdot \\mu_{1} + p \\cdot \\mu_{2}\\) \\(\\mathrm{Var}[\\log y] = (1-p)^2 \\cdot \\phi^2 + p^2 \\cdot \\phi^2\\). Gamma mixture, \\(p\\) controls contribution component mixture (also interpreted probability larger events). Example: family = lognormal_mix(link = \"log\"). 
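A sketch of recovering the constrained parameters described above from a fitted object called fit (thetaf applies to a Tweedie fit; the mixture parameters to a gamma_mix, lognormal_mix, or nbinom2_mix fit):

# Tweedie power parameter, constrained to (1, 2):
p_tweedie <- plogis(fit$model$par[["thetaf"]]) + 1
# mixture models: probability of the larger-mean component and the ratio of means:
p_mix <- plogis(fit$model$par[["logit_p_mix"]])
mean_ratio <- 1 + exp(fit$model$par[["log_ratio_mix"]])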
See also family = delta_lognormal_mix() extension incorporating distribution delta models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"negative-binomial-2-mixture","dir":"Articles","previous_headings":"Observation model families","what":"Negative binomial 2 mixture","title":"sdmTMB model description","text":"2 component mixture extends NB2 distribution, \\[ (1 - p) \\cdot \\operatorname{NB2} \\left( \\mu_1, \\phi \\right) + p \\cdot \\operatorname{NB2} \\left( \\mu_2, \\phi \\right) \\] \\(\\mu_1\\) mean first (smaller component) distribution, \\(\\mu_2\\) mean larger component, \\(p\\) controls contribution component mixture. Example: family = nbinom2_mix(link = \"log\")","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"matérn-parameterization","dir":"Articles","previous_headings":"Gaussian random fields","what":"Matérn parameterization","title":"sdmTMB model description","text":"Matérn defines covariance \\(\\Phi \\left( s_j, s_k \\right)\\) spatial locations \\(s_j\\) \\(s_k\\) \\[ \\Phi\\left( s_j,s_k \\right) = \\tau^2/\\Gamma(\\nu)2^{\\nu - 1} (\\kappa d_{jk})^\\nu K_\\nu \\left( \\kappa d_{jk} \\right), \\] \\(\\tau^2\\) controls spatial variance, \\(\\nu\\) controls smoothness, \\(\\Gamma\\) represents Gamma function, \\(d_{jk}\\) represents distance locations \\(s_j\\) \\(s_k\\), \\(K_\\nu\\) represents modified Bessel function second kind, \\(\\kappa\\) represents decorrelation rate. parameter \\(\\nu\\) set 1 take advantage Stochastic Partial Differential Equation (SPDE) approximation GRF greatly increase computational efficiency (Lindgren et al. 2011). Internally, parameters \\(\\kappa\\) \\(\\tau\\) converted range marginal standard deviation \\(\\sigma\\) \\(\\textrm{range} = \\sqrt{8} / \\kappa\\) \\(\\sigma = 1 / \\sqrt{4 \\pi \\exp \\left(2 \\log(\\tau) + 2 \\log(\\kappa) \\right) }\\). case spatiotemporal model spatial spatiotemporal fields, share_range = TRUE sdmTMB() (default), single \\(\\kappa\\) range estimated separate \\(\\sigma_\\omega\\) \\(\\sigma_\\epsilon\\). often makes sense since data often weakly informative \\(\\kappa\\). share_range = FALSE, separate \\(\\kappa_\\omega\\) \\(\\kappa_\\epsilon\\) estimated. spatially varying coefficient field always shares \\(\\kappa\\) spatial random field.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"projection-boldsymbola-matrix","dir":"Articles","previous_headings":"Gaussian random fields","what":"Projection \\(\\boldsymbol{A}\\) matrix","title":"sdmTMB model description","text":"values spatial variables knots multiplied projection matrix \\(\\boldsymbol{}\\) bilinearly interpolates knot locations values locations observed predicted data (Lindgren & Rue 2015) \\[ \\boldsymbol{\\omega}^* = \\boldsymbol{} \\boldsymbol{\\omega}, \\] \\(\\boldsymbol{\\omega}^*\\) represents values spatial random fields observed locations predicted data locations. matrix \\(\\boldsymbol{}\\) row data point prediction point column knot. Three non-zero elements row define weight neighbouring 3 knot locations location \\(\\boldsymbol{s}\\). bilinear interpolation happens spatiotemporal random fields \\[ \\boldsymbol{\\epsilon}_t^* = \\boldsymbol{} \\boldsymbol{\\epsilon}_t. 
\\]","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"anisotropy","dir":"Articles","previous_headings":"Gaussian random fields","what":"Anisotropy","title":"sdmTMB model description","text":"TMB allows anisotropy, spatial covariance may asymmetric respect latitude longitude (full details). Anisotropy can turned logical anisotropy argument sdmTMB(). number ways implement anisotropic covariance (Fuglstad et al. 2015), adopt 2-parameter rotation matrix \\(\\textbf{H}\\). elements \\(\\textbf{H}\\) defined parameter vector \\(\\boldsymbol{x}\\) \\(H_{1,1} = x_{1}\\), \\(H_{1,2} = H_{2,1} = x_{2}\\) \\(H_{2,2} = (1 + x_{2}^2) / x_{1}\\). model fitted sdmTMB(), anisotropy relationships may plotted using plot_anisotropy() function, takes fitted object argument. barrier mesh used, anisotropy disabled.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"incorporating-physical-barriers-into-the-spde","dir":"Articles","previous_headings":"Gaussian random fields","what":"Incorporating physical barriers into the SPDE","title":"sdmTMB model description","text":"cases spatial domain interest may complex bounded barrier land water (e.g., coastlines, islands, lakes). SPDE models allow physical barriers incorporated modelling (Bakka et al. 2019). sdmTMB() models, mesh construction occurs two steps: user (1) constructs mesh call sdmTMB::make_mesh(), (2) passes mesh sdmTMB::add_barrier_mesh(). barriers must constructed sf objects (Pebesma 2018) polygons defining barriers. See ?sdmTMB::add_barrier_mesh example. barrier implementation requires user select fraction value (range_fraction argument) defines fraction usual spatial range crossing barrier (Bakka et al. 2019). example, range estimated 10 km, range_fraction = 0.2 assume range 2 km across barrier. let spatial correlation decay 5 times faster distance. experimentation, values around 0.1 0.2 seem work well values much lower 0.1 can result convergence issues. website Francesco Serafini Haakon Bakka provides illustration INLA. implementation within TMB borrowed code written Olav Nikolai Breivik Hans Skaug TMB Case Studies Github site.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"optimization-details","dir":"Articles","previous_headings":"Optimization","what":"Optimization details","title":"sdmTMB model description","text":"sdmTMB model fit maximum marginal likelihood. Internally, TMB (Kristensen et al. 2016) model template calculates marginal log likelihood gradient, negative log likelihood minimized via non-linear optimization routine stats::nlminb() R (Gay 1990; R Core Team 2021). Random effects estimated values maximize log likelihood conditional estimated fixed effects integrated via Laplace approximation (Kristensen et al. 2016). Like AD Model Builder (Fournier et al. 2012), TMB allows parameters fit phases include multiphase argument sdmTMB::sdmTMBcontrol() allow . high-dimensional models (many fixed random effects), phased estimation may faster result stable convergence. sdmTMB, phased estimation proceeds first estimating fixed-effect parameters contributing likelihood (holding random effects constant initial values). second phase, random-effect parameters (variances) also estimated. Fixed-effect parameters also estimated second phase initialized estimates first phase. 
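A sketch of the optimization controls discussed in this section (phased estimation, plus the extra nlminb and Newton loops described next), reusing the pcod example; the specific numbers of loops are arbitrary:

fit <- sdmTMB(
  density ~ poly(log(depth), 2),
  data = pcod, mesh = mesh,
  family = tweedie(link = "log"),
  control = sdmTMBcontrol(
    multiphase = TRUE, # fixed effects first, then random-effect parameters
    nlminb_loops = 2,  # restart stats::nlminb() from the previous best values
    newton_loops = 1   # follow-up Newton steps via stats::optimHess()
  )
)
max(abs(fit$gradients)) # maximum absolute gradient; ideally small (e.g., < 0.001)
# run_extra_optimization(fit) can reduce the gradients further on an existing fit.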
cases, single call stats::nlminb() may result convergence (e.g., maximum gradient marginal likelihood respect fixed-effect parameters small enough yet), algorithm may need run multiple times. sdmTMB::sdmTMBcontrol() function, include argument nlminb_loops restart optimization previous best values. number nlminb_loops generally small (e.g., 2 3 initially), defaults 1. sdmTMB models, Hessian may also unstable need re-evaluated. optionally stats::optimHess() routine call stats::nlminb(). stats::optimHess() function implements Newton optimization routine find Hessian, include argument newton_loops sdmTMB::sdmTMBcontrol() allow multiple function evaluations (starting previous best value). default, included newton_loops set 0. model already fit, function sdmTMB::run_extra_optimization() can run additional optimization loops either routine reduce maximum gradient.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/model-description.html","id":"assessing-convergence","dir":"Articles","previous_headings":"Optimization","what":"Assessing convergence","title":"sdmTMB model description","text":"Much guidance around diagnostics glmmTMB also applies sdmTMB, e.g. glmmTMB vignette troubleshooting. Optimization stats::nlminb() involves specifying number iterations evaluations (eval.max iter.max) tolerances (abs.tol, rel.tol, x.tol, xf.tol)—greater number iterations smaller tolerance thresholds increase chance optimal solution found, evaluations translates longer computation time. Warnings non-positive-definite Hessian matrices (accompanied parameters NAs standard errors) often mean models improperly specified given data. Standard errors can observed output print.sdmTMB() checking fit$sd_report. maximum gradient marginal likelihood respect fixed-effect parameters can checked inspecting (fit$gradients). Guidance varies, maximum gradient likely least \\(< 0.001\\) assuming fitting routine consistent convergence. maximum gradients already relatively small, can often reduced additional optimization calls beginning previous best parameter vector described sdmTMB::run_extra_optimization().","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"parameter-estimates","dir":"Articles > Web_only","previous_headings":"","what":"Parameter estimates","title":"Introduction to modelling with sdmTMB","text":"can view confidence intervals fixed effects using tidy function: similarly random effect variance parameters: Note standard errors reported coefficients log space, confidence intervals reported. parameters defined follows: range: derived parameter defines distance 2 points effectively independent (actually 13% correlated). share_range argument changed FALSE spatial spatiotemporal ranges unique, otherwise default share range. phi: Observation error scale parameter (e.g., SD Gaussian). sigma_O: SD spatial process (“Omega”). sigma_E: SD spatiotemporal process (“Epsilon”). tweedie_p: Tweedie p (power) parameter; 1 2. model used AR1 spatiotemporal fields : rho: Spatiotemporal correlation years; -1 1. model includes spatial_varying predictor : sigma_Z: SD spatially varying coefficient field (“Zeta”).","code":"tidy(m3, conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 1.86 0.208 1.45 2.26 #> 2 poly(log(depth), 2)1 -65.1 6.32 -77.5 -52.8 #> 3 poly(log(depth), 2)2 -96.5 5.98 -108. 
-84.8 tidy(m3, \"ran_pars\", conf.int = TRUE) #> # A tibble: 5 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 19.8 3.03 14.6 26.7 #> 2 phi 11.0 0.377 10.3 11.8 #> 3 sigma_O 1.40 0.162 1.12 1.76 #> 4 sigma_E 1.55 0.129 1.32 1.83 #> 5 tweedie_p 1.50 0.0119 1.48 1.52"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"model-diagnostics","dir":"Articles > Web_only","previous_headings":"","what":"Model diagnostics","title":"Introduction to modelling with sdmTMB","text":"can inspect randomized quantile residuals: fast calculate can look ‘’ even model consistent data. MCMC-based residuals reliable slow. can calculate help sdmTMBextra package. https://github.com/pbs-assess/sdmTMBextra. practice like want mcmc_iter mcmc_warmup. Total samples mcmc_iter - mcmc_warmup. also just use spatial model vignette builds quickly. See ?residuals.sdmTMB().","code":"pcod$resids <- residuals(m3) # randomized quantile residuals #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. qqnorm(pcod$resids) qqline(pcod$resids) ggplot(pcod, aes(X, Y, col = resids)) + scale_colour_gradient2() + geom_point() + facet_wrap(~year) + coord_fixed() set.seed(123) samps <- sdmTMBextra::predict_mle_mcmc(m3, mcmc_warmup = 100, mcmc_iter = 101) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.010102 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 101.02 seconds. #> Chain 1: Adjust your expectations accordingly! #> Chain 1: #> Chain 1: #> Chain 1: WARNING: There aren't enough warmup iterations to fit the #> Chain 1: three stages of adaptation as currently configured. #> Chain 1: Reducing each adaptation stage to 15%/75%/10% of #> Chain 1: the given number of warmup iterations: #> Chain 1: init_buffer = 15 #> Chain 1: adapt_window = 75 #> Chain 1: term_buffer = 10 #> Chain 1: #> Chain 1: Iteration: 1 / 101 [ 0%] (Warmup) #> Chain 1: Iteration: 10 / 101 [ 9%] (Warmup) #> Chain 1: Iteration: 20 / 101 [ 19%] (Warmup) #> Chain 1: Iteration: 30 / 101 [ 29%] (Warmup) #> Chain 1: Iteration: 40 / 101 [ 39%] (Warmup) #> Chain 1: Iteration: 50 / 101 [ 49%] (Warmup) #> Chain 1: Iteration: 60 / 101 [ 59%] (Warmup) #> Chain 1: Iteration: 70 / 101 [ 69%] (Warmup) #> Chain 1: Iteration: 80 / 101 [ 79%] (Warmup) #> Chain 1: Iteration: 90 / 101 [ 89%] (Warmup) #> Chain 1: Iteration: 100 / 101 [ 99%] (Warmup) #> Chain 1: Iteration: 101 / 101 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 26.933 seconds (Warm-up) #> Chain 1: 0.198 seconds (Sampling) #> Chain 1: 27.131 seconds (Total) #> Chain 1: r <- residuals(m3, \"mle-mcmc\", mcmc_samples = samps) qqnorm(r) qqline(r)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"spatial-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Spatial predictions","title":"Introduction to modelling with sdmTMB","text":"Now, purposes example (e.g., visualization), want predict fine-scale grid entire survey domain. grid built package Queen Charlotte Sound named qcs_grid. prediction grid also needs covariates used model . can replicate grid across necessary years: Now make predictions new data: Let’s make small function help make maps. four kinds predictions get model. 
First, show predictions incorporate fixed effects random effects: can also look just fixed effects, quadratic effect depth: can look spatial random effects represent consistent deviations space time accounted fixed effects. words, deviations represent consistent biotic abiotic factors affecting biomass density accounted model. finally can look spatiotemporal random effects represent deviation fixed effect predictions spatial random effect deviations. represent biotic abiotic factors changing time accounted model. can also estimate uncertainty spatiotemporal density predictions using simulations joint precision matrix setting nsim > 0 predict function. generate 100 estimates use apply() calculate upper lower confidence intervals, standard deviation, coefficient variation (CV). Plot CV estimates:","code":"glimpse(qcs_grid) #> Rows: 7,314 #> Columns: 5 #> $ X 456, 458, 460, 462, 464, 466, 468, 470, 472, 474, 476, 4… #> $ Y 5636, 5636, 5636, 5636, 5636, 5636, 5636, 5636, 5636, 56… #> $ depth 347.08345, 223.33479, 203.74085, 183.29868, 182.99983, 1… #> $ depth_scaled 1.56081222, 0.56976988, 0.36336929, 0.12570465, 0.122036… #> $ depth_scaled2 2.436134794, 0.324637712, 0.132037240, 0.015801659, 0.01… grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) predictions <- predict(m3, newdata = grid_yrs) plot_map <- function(dat, column) { ggplot(dat, aes(X, Y, fill = {{ column }})) + geom_raster() + coord_fixed() } plot_map(predictions, exp(est)) + scale_fill_viridis_c( trans = \"sqrt\", # trim extreme high values to make spatial variation more visible na.value = \"yellow\", limits = c(0, quantile(exp(predictions$est), 0.995)) ) + facet_wrap(~year) + ggtitle(\"Prediction (fixed effects + all random effects)\", subtitle = paste(\"maximum estimated biomass density =\", round(max(exp(predictions$est)))) ) plot_map(predictions, exp(est_non_rf)) + scale_fill_viridis_c(trans = \"sqrt\") + ggtitle(\"Prediction (fixed effects only)\") plot_map(predictions, omega_s) + scale_fill_gradient2() + ggtitle(\"Spatial random effects only\") plot_map(predictions, epsilon_st) + scale_fill_gradient2() + facet_wrap(~year) + ggtitle(\"Spatiotemporal random effects only\") sim <- predict(m3, newdata = grid_yrs, nsim = 100) sim_last <- sim[grid_yrs$year == max(grid_yrs$year), ] # just plot last year pred_last <- predictions[predictions$year == max(grid_yrs$year), ] pred_last$lwr <- apply(exp(sim_last), 1, quantile, probs = 0.025) pred_last$upr <- apply(exp(sim_last), 1, quantile, probs = 0.975) pred_last$sd <- round(apply(exp(sim_last), 1, function(x) sd(x)), 2) pred_last$cv <- round(apply(exp(sim_last), 1, function(x) sd(x) / mean(x)), 2) ggplot(pred_last, aes(X, Y, fill = cv)) + geom_raster() + scale_fill_viridis_c()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"conditional-effects","dir":"Articles > Web_only","previous_headings":"","what":"Conditional effects","title":"Introduction to modelling with sdmTMB","text":"can visualize conditional effect covariates feeding simplified data frames predict function fix covariate values want fixed (e.g., means) vary parameters want visualize (across range values): also visreg package. version link space residuals partial randomized quantile residuals. See scale argument visreg response scale plots. ggeffects package marginal effects plot. 
also faster since relies already estimated coefficients variance-covariance matrix.","code":"nd <- data.frame( depth = seq(min(pcod$depth), max(pcod$depth), length.out = 100 ), year = 2015L # a chosen year ) p <- predict(m3, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se) )) + geom_line() + geom_ribbon(alpha = 0.4) + scale_x_continuous() + coord_cartesian(expand = F) + labs(x = \"Depth (m)\", y = \"Biomass density (kg/km2)\") visreg::visreg(m3, \"depth\") ggeffects::ggeffect(m3, \"depth [0:500 by=1]\") %>% plot() #> Warning: Removed 1 row containing missing values or values outside the scale range #> (`geom_line()`)."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/basic-intro.html","id":"time-varying-effects","dir":"Articles > Web_only","previous_headings":"Conditional effects","what":"Time-varying effects","title":"Introduction to modelling with sdmTMB","text":"also let effect depth vary time. example, helps give year separate mean effect (.factor(year)). ~ 0 part formula omits intercept. models like take longer fit, might want set silent = FALSE can monitor progress. plot , make data frame contains combinations time-varying covariate time. easily created using expand.grid() tidyr::expand_grid().","code":"m4 <- sdmTMB( density ~ 0 + as.factor(year), data = pcod, time_varying = ~ 0 + depth_scaled + depth_scaled2, mesh = mesh, family = tweedie(link = \"log\"), spatial = \"on\", time = \"year\", spatiotemporal = \"IID\" ) #> Detected irregular time spacing with an AR(1) or random walk process. #> Consider filling in the missing time slices with `extra_time`. #> `extra_time = c(2006, 2008, 2010, 2012, 2014, 2016)` #> Warning in sqrt(diag(cov)): NaNs produced #> Warning: The model may not have converged: non-positive-definite Hessian #> matrix. 
m4 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod #> Family: tweedie(link = 'log') #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> Warning in sqrt(diag(object$cov.fixed)): NaNs produced #> #> coef.est coef.se #> as.factor(year)2003 3.36 0.30 #> as.factor(year)2004 4.04 0.28 #> as.factor(year)2005 3.77 0.28 #> as.factor(year)2007 2.64 0.30 #> as.factor(year)2009 2.65 0.29 #> as.factor(year)2011 3.72 0.29 #> as.factor(year)2013 3.51 0.27 #> as.factor(year)2015 3.76 0.28 #> as.factor(year)2017 3.21 0.30 #> #> Time-varying parameters: #> coef.est coef.se #> depth_scaled-2003 -0.96 0.08 #> depth_scaled-2004 -0.96 0.08 #> depth_scaled-2005 -0.96 0.08 #> depth_scaled-2007 -0.96 0.08 #> depth_scaled-2009 -0.96 0.08 #> depth_scaled-2011 -0.96 0.08 #> depth_scaled-2013 -0.96 0.08 #> depth_scaled-2015 -0.96 0.08 #> depth_scaled-2017 -0.96 0.08 #> depth_scaled2-2003 -1.50 0.23 #> depth_scaled2-2004 -1.70 0.17 #> depth_scaled2-2005 -1.68 0.22 #> depth_scaled2-2007 -1.82 0.28 #> depth_scaled2-2009 -0.99 0.17 #> depth_scaled2-2011 -2.02 0.25 #> depth_scaled2-2013 -1.09 0.12 #> depth_scaled2-2015 -1.80 0.22 #> depth_scaled2-2017 -2.10 0.26 #> #> Dispersion parameter: 10.86 #> Tweedie p: 1.50 #> Matérn range: 13.58 #> Spatial SD: 1.63 #> Spatiotemporal IID SD: 1.66 #> ML criterion at convergence: 6251.165 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! 
Check output of sanity().** AIC(m4) #> [1] 12534.33 nd <- expand.grid( depth_scaled = seq(min(pcod$depth_scaled) + 0.2, max(pcod$depth_scaled) - 0.2, length.out = 50 ), year = unique(pcod$year) # all years ) nd$depth_scaled2 <- nd$depth_scaled^2 p <- predict(m4, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se), group = as.factor(year) )) + geom_line(aes(colour = year), lwd = 1) + geom_ribbon(aes(fill = year), alpha = 0.1) + scale_colour_viridis_c() + scale_fill_viridis_c() + scale_x_continuous(labels = function(x) round(exp(x * pcod$depth_sd[1] + pcod$depth_mean[1]))) + coord_cartesian(expand = F) + labs(x = \"Depth (m)\", y = \"Biomass density (kg/km2)\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"simulating-data","dir":"Articles > Web_only","previous_headings":"","what":"Simulating data","title":"Bayesian estimation with sdmTMB","text":"demonstrate using simulated dataset. Visualize simulated data:","code":"set.seed(123) predictor_dat <- data.frame( X = runif(500), Y = runif(500), a1 = rnorm(500) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) # plot(mesh) # mesh$mesh$n sim_dat <- sdmTMB_simulate( formula = ~a1, data = predictor_dat, mesh = mesh, family = gaussian(), range = 0.3, phi = 0.2, sigma_O = 0.2, seed = 123, B = c(0.8, -0.4) # B0 = intercept, B1 = a1 slope ) ggplot(sim_dat, aes(X, Y, colour = observed)) + geom_point() + scale_color_viridis_c()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"fitting-the-model-with-marginal-likelihood","dir":"Articles > Web_only","previous_headings":"","what":"Fitting the model with marginal likelihood","title":"Bayesian estimation with sdmTMB","text":"First, fit spatial random field GLMM maximum likelihood:","code":"fit <- sdmTMB( observed ~ a1, data = sim_dat, mesh = mesh, family = gaussian(), spatial = \"on\" ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.8 0.06 #> a1 -0.4 0.01 #> #> Dispersion parameter: 0.20 #> Matérn range: 0.32 #> Spatial SD: 0.17 #> ML criterion at convergence: -65.939 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"adding-priors","dir":"Articles > Web_only","previous_headings":"","what":"Adding priors","title":"Bayesian estimation with sdmTMB","text":"first model fit use priors. case, priors implied uniform internal parameter space. However, sdmTMB provides option applying priors. show example applying Normal(0, 5) (mean, SD) prior intercept Normal(0, 1) prior slope parameter. guess model matrix structure based formula, can verify looking internal model matrix previous fit (using do_fit = FALSE save time didn’t want fit first time). 
column corresponds order b priors:","code":"head(fit$tmb_data$X_ij[[1]]) #> (Intercept) a1 #> 1 1 -0.60189285 #> 2 1 -0.99369859 #> 3 1 1.02678506 #> 4 1 0.75106130 #> 5 1 -1.50916654 #> 6 1 -0.09514745 fit <- sdmTMB( observed ~ a1, data = sim_dat, mesh = mesh, family = gaussian(), spatial = \"on\", priors = sdmTMBpriors( # location = vector of means; scale = vector of standard deviations: b = normal(location = c(0, 0), scale = c(5, 2)), ) ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.8 0.06 #> a1 -0.4 0.01 #> #> Dispersion parameter: 0.20 #> Matérn range: 0.32 #> Spatial SD: 0.17 #> ML criterion at convergence: -62.846 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"fixing-a-spatial-correlation-parameter-to-improve-convergence","dir":"Articles > Web_only","previous_headings":"","what":"Fixing a spatial correlation parameter to improve convergence","title":"Bayesian estimation with sdmTMB","text":"Sometimes spatial correlation parameters can challenging estimate Stan. One option apply penalized complexity (PC) priors sdmTMBpriors() Matérn parameters. Another option, can also used conjunction priors, fix one parameters maximum likelihood estimate (MLE) values. Frequently, fixing parameter ln_kappa can help convergence (e.g., Monnahan et al. 2021). estimated parameter transformed range estimate, controls rate spatial correlation decay. Now rebuild fitted object fixed (‘mapped’) ln_kappa parameters using update() function. ’ll use do_fit = FALSE avoid actually fitting updated model since ’s necessary.","code":"# grab the internal parameter list at estimated values: pars <- sdmTMB::get_pars(fit) # create a 'map' vector for TMB # factor NA values cause TMB to fix or map the parameter at the starting value: kappa_map <- factor(rep(NA, length(pars$ln_kappa))) # rebuild model updating some elements: fit_mle <- update( fit, control = sdmTMBcontrol( start = list( ln_kappa = pars$ln_kappa #< ), map = list( ln_kappa = kappa_map #< ) ), do_fit = FALSE #< ) #> ℹ Initiating `ln_kappa` at specified starting value(s) of: #> 2.173, 2.173 #> ℹ Fixing or mirroring `ln_kappa`"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"passing-the-model-to-tmbstan","dir":"Articles > Web_only","previous_headings":"","what":"Passing the model to tmbstan","title":"Bayesian estimation with sdmTMB","text":"Now can pass $tmb_obj element model tmbstan::tmbstan(). using 1000 iterations 2 chains vignette builds quickly. practice, likely want use (e.g., 2000 iterations, 4 chains). Sometimes may need adjust sampler settings : See Details section ?rstan::stan. can also ‘thin’ samples via thin argument working model predictions becomes cumbersome given large number required samples. can look model: Rhat values look reasonable (< 1.05). n_eff (number effective samples) values mostly look reasonable (> 100) inference mean parameters except intercept (b_j[1]). Furthermore, can see correlation MCMC samples b_j[1]. try running iterations chains /placing priors parameters described (highly recommended). 
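The penalized complexity (PC) priors on the Matérn parameters mentioned in the previous section can be used alongside (or instead of) fixing ln_kappa; a sketch using the simulated data from above, with illustrative prior bounds:

fit_pc <- sdmTMB(
  observed ~ a1,
  data = sim_dat, mesh = mesh,
  family = gaussian(),
  spatial = "on",
  priors = sdmTMBpriors(
    # P(range < 0.1) = 0.05 and P(sigma_O > 0.2) = 0.05 (arbitrary choices here):
    matern_s = pc_matern(range_gt = 0.1, sigma_lt = 0.2)
  )
)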
Now can use various functions visualize posterior:","code":"fit_stan <- tmbstan::tmbstan( fit_mle$tmb_obj, iter = 1000, chains = 2, seed = 8217 # ensures repeatability ) tmbstan::tmbstan( ..., control = list(adapt_delta = 0.9, max_treedepth = 12) ) fit_stan #> Inference for Stan model: sdmTMB. #> 2 chains, each with iter=1000; warmup=500; thin=1; #> post-warmup draws per chain=500, total post-warmup draws=1000. #> #> mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat #> b_j[1] 0.80 0.01 0.06 0.68 0.76 0.80 0.84 0.92 82 1.02 #> b_j[2] -0.40 0.00 0.01 -0.41 -0.40 -0.40 -0.39 -0.38 1747 1.00 #> ln_tau_O -1.67 0.01 0.14 -1.93 -1.76 -1.67 -1.57 -1.42 169 1.01 #> ln_phi -1.63 0.00 0.03 -1.69 -1.65 -1.63 -1.61 -1.56 1360 1.00 #> omega_s[1] -0.09 0.01 0.09 -0.26 -0.14 -0.09 -0.03 0.09 162 1.01 #> omega_s[2] -0.06 0.01 0.09 -0.25 -0.12 -0.06 0.01 0.11 196 1.01 #> omega_s[3] 0.01 0.01 0.10 -0.17 -0.05 0.01 0.08 0.20 196 1.01 #> omega_s[4] -0.21 0.01 0.09 -0.40 -0.27 -0.21 -0.15 -0.02 206 1.01 #> omega_s[5] -0.34 0.01 0.10 -0.53 -0.41 -0.33 -0.27 -0.13 247 1.00 #> omega_s[6] -0.08 0.01 0.10 -0.27 -0.15 -0.08 0.00 0.13 268 1.01 #> omega_s[7] -0.02 0.01 0.09 -0.19 -0.08 -0.02 0.04 0.15 172 1.01 #> omega_s[8] -0.23 0.01 0.10 -0.42 -0.29 -0.23 -0.16 -0.05 219 1.01 #> omega_s[9] -0.32 0.01 0.09 -0.50 -0.38 -0.32 -0.26 -0.13 166 1.02 #> omega_s[10] 0.29 0.01 0.09 0.11 0.23 0.28 0.35 0.46 147 1.01 #> omega_s[11] -0.16 0.01 0.09 -0.34 -0.22 -0.16 -0.09 0.02 208 1.01 #> omega_s[12] 0.00 0.01 0.10 -0.19 -0.07 0.01 0.07 0.21 205 1.01 #> omega_s[13] 0.19 0.01 0.09 0.02 0.13 0.19 0.25 0.36 166 1.01 #> omega_s[14] -0.09 0.01 0.10 -0.29 -0.16 -0.09 -0.03 0.11 253 1.01 #> omega_s[15] 0.22 0.01 0.09 0.06 0.16 0.22 0.28 0.40 152 1.01 #> omega_s[16] -0.02 0.01 0.10 -0.21 -0.08 -0.02 0.05 0.16 198 1.01 #> omega_s[17] -0.15 0.01 0.09 -0.34 -0.21 -0.14 -0.08 0.04 198 1.01 #> omega_s[18] -0.28 0.01 0.11 -0.50 -0.35 -0.28 -0.21 -0.09 270 1.00 #> omega_s[19] 0.00 0.01 0.10 -0.18 -0.06 0.00 0.07 0.19 211 1.01 #> omega_s[20] 0.03 0.01 0.08 -0.14 -0.03 0.02 0.09 0.19 180 1.01 #> omega_s[21] 0.08 0.01 0.10 -0.11 0.02 0.08 0.15 0.26 191 1.01 #> omega_s[22] -0.01 0.01 0.10 -0.21 -0.08 -0.01 0.05 0.17 212 1.00 #> omega_s[23] 0.12 0.01 0.09 -0.04 0.06 0.12 0.18 0.30 156 1.01 #> omega_s[24] 0.20 0.01 0.10 0.00 0.13 0.19 0.26 0.40 256 1.00 #> omega_s[25] 0.08 0.01 0.09 -0.09 0.01 0.07 0.14 0.26 167 1.01 #> omega_s[26] -0.01 0.01 0.10 -0.21 -0.08 -0.01 0.06 0.20 215 1.01 #> omega_s[27] -0.11 0.01 0.09 -0.30 -0.18 -0.10 -0.05 0.06 216 1.00 #> omega_s[28] 0.12 0.01 0.10 -0.08 0.05 0.11 0.18 0.32 229 1.00 #> omega_s[29] 0.30 0.01 0.09 0.12 0.23 0.30 0.36 0.47 212 1.01 #> omega_s[30] -0.04 0.01 0.09 -0.22 -0.10 -0.04 0.02 0.14 193 1.01 #> omega_s[31] 0.09 0.01 0.09 -0.08 0.04 0.09 0.15 0.26 154 1.01 #> omega_s[32] 0.05 0.01 0.12 -0.18 -0.02 0.05 0.14 0.28 244 1.01 #> omega_s[33] 0.08 0.01 0.09 -0.10 0.01 0.07 0.14 0.26 219 1.01 #> omega_s[34] 0.04 0.01 0.09 -0.13 -0.03 0.04 0.10 0.20 163 1.01 #> omega_s[35] 0.07 0.01 0.10 -0.10 0.00 0.07 0.14 0.26 230 1.01 #> omega_s[36] 0.14 0.01 0.09 -0.04 0.08 0.15 0.20 0.33 197 1.01 #> omega_s[37] 0.16 0.01 0.11 -0.06 0.09 0.16 0.24 0.39 275 1.00 #> omega_s[38] 0.12 0.01 0.10 -0.09 0.05 0.11 0.19 0.31 217 1.01 #> omega_s[39] -0.22 0.01 0.10 -0.40 -0.28 -0.22 -0.15 -0.04 186 1.01 #> omega_s[40] -0.03 0.01 0.09 -0.21 -0.09 -0.03 0.03 0.17 197 1.01 #> omega_s[41] 0.19 0.01 0.08 0.03 0.13 0.19 0.24 0.35 143 1.01 #> omega_s[42] 0.21 0.01 0.09 0.01 0.14 0.21 0.27 0.38 173 1.01 #> omega_s[43] 0.15 0.01 0.10 
-0.04 0.07 0.15 0.21 0.35 221 1.01 #> omega_s[44] 0.14 0.01 0.10 -0.07 0.08 0.14 0.20 0.32 194 1.01 #> omega_s[45] 0.10 0.01 0.10 -0.10 0.02 0.10 0.16 0.30 248 1.00 #> omega_s[46] 0.06 0.01 0.10 -0.14 -0.01 0.06 0.12 0.26 187 1.01 #> omega_s[47] 0.31 0.01 0.09 0.13 0.24 0.31 0.37 0.50 166 1.01 #> omega_s[48] -0.24 0.01 0.10 -0.44 -0.31 -0.24 -0.18 -0.06 196 1.01 #> omega_s[49] 0.10 0.01 0.10 -0.10 0.04 0.10 0.17 0.32 255 1.01 #> omega_s[50] -0.09 0.01 0.09 -0.26 -0.14 -0.09 -0.03 0.08 178 1.01 #> omega_s[51] 0.25 0.01 0.11 0.04 0.17 0.25 0.32 0.48 247 1.01 #> omega_s[52] -0.21 0.01 0.11 -0.43 -0.29 -0.21 -0.14 -0.01 251 1.01 #> omega_s[53] 0.04 0.01 0.10 -0.16 -0.03 0.04 0.10 0.22 213 1.00 #> omega_s[54] 0.03 0.01 0.10 -0.16 -0.04 0.03 0.10 0.22 214 1.00 #> omega_s[55] -0.09 0.01 0.11 -0.30 -0.16 -0.09 -0.01 0.15 311 1.01 #> omega_s[56] -0.42 0.01 0.10 -0.62 -0.48 -0.41 -0.35 -0.24 229 1.01 #> omega_s[57] 0.01 0.01 0.11 -0.20 -0.07 0.00 0.08 0.24 274 1.00 #> omega_s[58] -0.21 0.01 0.10 -0.40 -0.28 -0.21 -0.14 -0.01 201 1.01 #> omega_s[59] 0.04 0.01 0.24 -0.43 -0.11 0.05 0.20 0.47 992 1.00 #> omega_s[60] -0.23 0.01 0.28 -0.78 -0.41 -0.23 -0.03 0.27 1087 1.00 #> omega_s[61] -0.26 0.01 0.24 -0.69 -0.41 -0.27 -0.10 0.20 662 1.00 #> omega_s[62] -0.26 0.01 0.23 -0.72 -0.42 -0.26 -0.11 0.19 693 1.00 #> omega_s[63] -0.27 0.01 0.23 -0.72 -0.43 -0.28 -0.12 0.18 807 1.00 #> omega_s[64] 0.08 0.01 0.22 -0.37 -0.07 0.09 0.23 0.51 866 1.00 #> omega_s[65] 0.17 0.01 0.22 -0.29 0.02 0.17 0.32 0.57 693 1.00 #> omega_s[66] 0.16 0.01 0.22 -0.27 0.02 0.16 0.31 0.60 525 1.00 #> omega_s[67] -0.03 0.01 0.22 -0.49 -0.17 -0.01 0.13 0.42 1154 1.00 #> omega_s[68] -0.02 0.01 0.25 -0.50 -0.17 -0.02 0.15 0.47 1152 1.00 #> omega_s[69] 0.00 0.01 0.22 -0.46 -0.15 0.00 0.15 0.43 458 1.01 #> omega_s[70] 0.01 0.01 0.21 -0.42 -0.13 0.01 0.15 0.40 431 1.00 #> omega_s[71] 0.17 0.01 0.24 -0.30 0.01 0.17 0.33 0.67 884 1.00 #> omega_s[72] -0.12 0.01 0.24 -0.62 -0.29 -0.11 0.04 0.35 1064 1.00 #> omega_s[73] -0.14 0.01 0.22 -0.58 -0.28 -0.15 0.01 0.30 600 1.00 #> omega_s[74] -0.14 0.01 0.22 -0.56 -0.29 -0.14 0.01 0.26 624 1.00 #> omega_s[75] -0.38 0.01 0.20 -0.76 -0.51 -0.38 -0.25 0.00 654 1.00 #> omega_s[76] 0.10 0.01 0.21 -0.31 -0.04 0.10 0.23 0.51 848 1.00 #> omega_s[77] 0.09 0.01 0.22 -0.34 -0.04 0.09 0.23 0.52 1053 1.00 #> omega_s[78] -0.06 0.01 0.21 -0.47 -0.19 -0.06 0.08 0.35 1693 1.00 #> omega_s[79] 0.06 0.01 0.16 -0.27 -0.05 0.06 0.17 0.37 622 1.00 #> omega_s[80] -0.18 0.01 0.23 -0.65 -0.33 -0.18 -0.02 0.27 632 1.00 #> omega_s[81] -0.07 0.01 0.20 -0.45 -0.20 -0.07 0.05 0.33 1165 1.00 #> omega_s[82] -0.14 0.01 0.21 -0.55 -0.28 -0.14 0.00 0.29 810 1.00 #> omega_s[83] -0.02 0.01 0.23 -0.47 -0.18 -0.01 0.14 0.44 1096 1.00 #> omega_s[84] 0.12 0.01 0.20 -0.26 -0.02 0.12 0.25 0.53 952 1.00 #> omega_s[85] -0.29 0.01 0.21 -0.68 -0.44 -0.30 -0.15 0.11 1111 1.00 #> lp__ 136.06 0.93 9.26 118.11 129.87 136.38 142.46 152.88 100 1.02 #> #> Samples were drawn using NUTS(diag_e) at Sun May 26 04:17:07 2024. #> For each parameter, n_eff is a crude measure of effective sample size, #> and Rhat is the potential scale reduction factor on split chains (at #> convergence, Rhat=1). plot(fit_stan) #> 'pars' not specified. Showing first 10 parameters by default. 
#> ci_level: 0.8 (80% intervals) #> outer_level: 0.95 (95% intervals) pars_plot <- c(\"b_j[1]\", \"b_j[2]\", \"ln_tau_O\", \"omega_s[1]\") bayesplot::mcmc_trace(fit_stan, pars = pars_plot) bayesplot::mcmc_pairs(fit_stan, pars = pars_plot)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"posterior-predictive-checks","dir":"Articles > Web_only","previous_headings":"","what":"Posterior predictive checks","title":"Bayesian estimation with sdmTMB","text":"can perform posterior predictive checks assess whether model can generate predictive data consistent observations. , can make use simulate.sdmTMB() passing Stan model. simulate.sdmTMB() take draws joint parameter posterior add observation error. need ensure nsim less equal total number post-warmup samples. See ?bayesplot::pp_check. solid line represents density observed data light blue lines represent density 50 posterior predictive simulations. case, simulated data seem consistent observed data.","code":"set.seed(19292) samps <- sdmTMBextra::extract_mcmc(fit_stan) s <- simulate(fit_mle, mcmc_samples = samps, nsim = 50) bayesplot::pp_check( sim_dat$observed, yrep = t(s), fun = bayesplot::ppc_dens_overlay )"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"plotting-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Plotting predictions","title":"Bayesian estimation with sdmTMB","text":"can make predictions Bayesian model supplying posterior samples mcmc_samples argument predict.sdmTMB(). output matrix row corresponds row predicted data column corresponds sample. can summarize draws various ways visualize : predict grid given value a1:","code":"pred <- predict(fit_mle, mcmc_samples = samps) dim(pred) #> [1] 500 1000 sim_dat$post_mean <- apply(pred, 1, mean) sim_dat$post_sd <- apply(pred, 1, sd) ggplot(sim_dat, aes(X, Y, colour = post_mean)) + geom_point() + scale_color_viridis_c() ggplot(sim_dat, aes(X, Y, colour = post_sd)) + geom_point() + scale_color_viridis_c() nd <- expand.grid( X = seq(0, 1, length.out = 70), Y = seq(0, 1, length.out = 70), a1 = 0 ) pred <- predict(fit_mle, newdata = nd, mcmc_samples = samps) nd$post_mean <- apply(pred, 1, mean) nd$post_sd <- apply(pred, 1, sd) ggplot(nd, aes(X, Y, fill = post_mean)) + geom_raster() + scale_fill_viridis_c() + coord_fixed() ggplot(nd, aes(X, Y, fill = post_sd)) + geom_raster() + scale_fill_viridis_c() + coord_fixed()"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"extracting-parameter-posterior-samples","dir":"Articles > Web_only","previous_headings":"","what":"Extracting parameter posterior samples","title":"Bayesian estimation with sdmTMB","text":"can extract posterior samples rstan::extract(), result list element corresponds parameter set parameters: example calculating derived parameter, calculate marginal spatial random field standard deviation:","code":"post <- rstan::extract(fit_stan) names(post) #> [1] \"b_j\" \"ln_tau_O\" \"ln_phi\" \"omega_s\" \"lp__\" hist(post$b_j[, 1]) ln_kappa <- get_pars(fit_mle)$ln_kappa[1] # 2 elements since 2nd would be for spatiotemporal ln_tau_O <- post$ln_tau_O sigma_O <- 1 / sqrt(4 * pi * exp(2 * ln_tau_O + 2 * ln_kappa)) hist(sigma_O)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/bayesian.html","id":"extracting-the-posterior-of-other-predicted-elements","dir":"Articles > Web_only","previous_headings":"","what":"Extracting the posterior of other predicted elements","title":"Bayesian estimation with 
sdmTMB","text":"default predict.sdmTMB() returns overall prediction link space tmbstan model passed . instead want element might find usual data frame returned predict.sdmTMB() applied regular sdmTMB model, can specify sims_var argument. example, let’s extract spatial random field values \"omega_s\". options documented ?predict.sdmTMB().","code":"fit_pred <- predict( fit_mle, newdata = nd, mcmc_samples = samps, sims_var = \"omega_s\" #< ) nd$spatial_rf_mean <- apply(fit_pred, 1, mean) nd$spatial_rf_sd <- apply(fit_pred, 1, sd) ggplot(nd, aes(X, Y, fill = spatial_rf_mean)) + geom_raster() + scale_fill_gradient2() + coord_fixed() ggplot(nd, aes(X, Y, fill = spatial_rf_sd)) + geom_raster() + scale_fill_viridis_c() + coord_fixed()"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"overview","dir":"Articles > Web_only","previous_headings":"","what":"Overview","title":"Cross-validation for model evaluation and comparison","text":"Cross-validation one best approaches can used quantify model performance compare sdmTMB models different structures (unlike AIC, approach also factor uncertainty random effects). Arguably challenging decision implementing cross-validation specify folds (fold representing subset data turn held used test set). Folds may vary number data partitioned, likely slightly different application. goals sdmTMB applications may focused spatial prediction; include making prediction new spatial regions (e.g. unsampled areas areas sampled every year). types models recommend exploring folds using blockCV spatialsample packages (Valavi et al. 2019; Silge 2021). general, spatial sampling approaches assign observations spatially autocorrelated fold. Accounting spatial correlation can lead better estimates covariate effects, well prediction errors. Alternatively, goals analysis sdmTMB may evaluate predictive accuracy model time (e.g. missing survey year, prediction future years). retrospective analyses, points within year may assigned fold (groups years fold). contrast, models forward looking use Leave Future Cross-Validation (LFOCV). LFOCV, data year \\(t\\) used predict observations \\(t+1\\), etc.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"cross-validation-in-sdmtmb","dir":"Articles > Web_only","previous_headings":"","what":"Cross validation in sdmTMB","title":"Cross-validation for model evaluation and comparison","text":"Cross validation sdmTMB implemented using sdmTMB_cv() function, k_folds argument specifying number folds (defaults 8). function uses parallelization default future::plan() set, can turned parallel argument. example, folds assigned randomly—can modified specific spatial temporal applications. Without getting complexities blockCV spatialsample packages, simply use kmeans generate spatial clusters, e.g. similarly, clusters assigned time—, year unique fold. Note year included factor spatiotemporal fields turned estimated missing years.","code":"data(pcod) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) pcod$fyear <- as.factor(pcod$year) # Set parallel processing if desired: library(future) plan(multisession, workers = 2) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 4 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. 
clust <- kmeans(pcod[, c(\"X\", \"Y\")], 20)$cluster m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. clust <- as.numeric(as.factor(pcod$year)) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled), data = pcod, mesh = mesh, fold_ids = clust, spatiotemporal = \"off\", family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"measuring-model-performance","dir":"Articles > Web_only","previous_headings":"","what":"Measuring model performance","title":"Cross-validation for model evaluation and comparison","text":"Lots measures predictive accuracy can used evaluate model performance. default, sdmTMB_cv() returns list contains sum log likelihoods left-fold total summed across left-folds. roughly equivalent expected log predictive density (ELPD) Bayesian literature can interpreted predictive ability model new observations. can accessed , inspecting quantities across folds may help elucidate whether particular folds difficult predict.","code":"m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 4 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. m_cv$fold_loglik # fold log-likelihood #> [1] -1720.122 -1756.138 -1502.946 -1661.257 m_cv$sum_loglik # total log-likelihood #> [1] -6640.463"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"single-splits","dir":"Articles > Web_only","previous_headings":"","what":"Single splits","title":"Cross-validation for model evaluation and comparison","text":"cases single test set evaluated (e.g., 10% data), using sdmTMB_cv() function may overkill two sdmTMB() models fit, using function may worthwhile reduce coding errors (log-likelihood calculations). example, assign two folds, randomly holding 10% observations test set (test set given ID = 1, training set given ID = 2). can ignore total log-likelihood, just focus first element list list:","code":"clust <- sample(1:2, size = nrow(pcod), replace = TRUE, prob = c(0.1, 0.9)) m_cv <- sdmTMB_cv( density ~ 0 + s(depth_scaled) + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\"), k_folds = length(unique(clust)) ) m_cv$fold_loglik[[1]] #> [1] -608.5838"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"comparing-two-or-more-models","dir":"Articles > Web_only","previous_headings":"","what":"Comparing two or more models","title":"Cross-validation for model evaluation and comparison","text":"can use output sdmTMB_cv() compare two models. example, wanted evaluate support depth effect , 10-fold cross validation (’s important folds across two models). example, using either predictive log-likelihood ELPD lead one conclude including depth improves predictive accuracy model.","code":"clust <- sample(seq_len(10), size = nrow(pcod), replace = TRUE) m1 <- sdmTMB_cv( density ~ 0 + fyear, data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. 
m2 <- sdmTMB_cv( density ~ 0 + fyear + s(depth_scaled), data = pcod, mesh = mesh, fold_ids = clust, family = tweedie(link = \"log\") ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Compare log-likelihoods -- higher is better! m1$sum_loglik #> [1] -6748.929 m2$sum_loglik #> [1] -6596.481"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/cross-validation.html","id":"model-ensembling","dir":"Articles > Web_only","previous_headings":"","what":"Model ensembling","title":"Cross-validation for model evaluation and comparison","text":"Finally, instead identifying single “best” models, may interested model averaging. sdmTMB package, ’ve implemented model stacking procedure described (Yao et al. 2018) sdmTMB_stacking() function. procedure uses optimization find normalized weights maximize total log-likelihood across models (metrics may also used). Inputs function list models (fictitious model_list), list element output call sdmTMB_cv(): default calculation uses data fold. instead, split data 10/90 split (example ), wouldn’t want use 2nd model fit generate weights. just wanted use predictions first fold onto 10% test set, specify using include_folds argument.","code":"weights <- sdmTMB_stacking(model_list) weights <- sdmTMB_stacking(model_list, include_folds = 1)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/delta-models.html","id":"example-with-built-in-delta-model","dir":"Articles > Web_only","previous_headings":"","what":"Example with built-in delta model","title":"Fitting delta (hurdle) models with sdmTMB","text":", show example fitting using built-delta functionality, well build model component separately combine. built-approach convenient, allows parameters shared across components, allows calculation derived quantities standardized indexes (get_index()) internally calculated standard errors. use dataset built sdmTMB package: trawl survey data Pacific Cod Queen Charlotte Sound, British Columbia, Canada. density units kg/km2. , X Y coordinates UTM zone 9. first create mesh use models. can fit model Pacific cod density using delta-gamma model, including smoothed effect depth. default built-delta models formula, spatial spatiotemporal structure, anisotropy shared two model components. However, elements (formula, spatial, spatiotemporal, share_range) can also specified independently model using list format within function argument (see examples ). first element list binomial component second element positive component (e.g., Gamma). elements must shared now (e.g., smoothers, spatially varying coefficients, time-varying coefficients). specify settings spatial spatiotemporal effects model component, create list settings within spatial spatiotemporal arguments. example, spatial = list(\"\", \"\"), spatiotemporal = list(\"\", \"rw\"). similarly specify different formula component model, using list(y ~ x1, y ~ x2) instance, include effect depth positive model, remove presence-absence model. However, currently limitations specifying separate formulas model component. two formulas : smoothers threshold effects random intercepts now, must specified single formula shared across two models. model component can similarly separate settings share_range, determines whether shared spatial spatiotemporal range parameter (TRUE) independent range parameters (FALSE), using list. Lastly, whether anisotropy included model determined logical argument anisotropy (.e., TRUE FALSE), separately specified model. 
anisotropy included, default shared across two model components. However can made unique model component using sdmTMBcontrol(map = ...) adding argument control fitting model. ‘maps’ anisotropy parameters unique across model components. fit delta model, can evaluate plot output, diagnostics, predictions similar sdmTMB models. printed model output show estimates standard errors parameters model separately. Using tidy() function turn sdmTMB model output data frame, argument model=1 model=2 specify model component extract dataframe. See tidy.sdmTMB() additional arguments options. built-delta models, default function return estimated response parameters grid cell model separately, notated 1 (presence/absence model) 2 (positive catch model) column name. See predict.sdmTMB() description values data frame. can use predictions built-delta model (making sure return_tmb_object=TRUE) get index values using get_index() function. can used predictions include first second models (.e., using default specifying model argument) predictions generated using model=NA. get_index() function automatically combine predictions first second model calculating index values. modelling purposes creating index see vignette Index standardization sdmTMB. can plot conditional effects covariates (depth example model) using package visreg specifying model component model=1 presence-absence model model=2 positive catch model. Currently, plotting effects built-delta models ggeffects supported. See vignette using visreg sdmTMB information. built-delta models can also evaluated residuals() functions sdmTMB. Similarly generating predictions, can specify model components want return residuals using model argument specifying =1 =2. See residuals.sdmTMB() additional options evaluating residuals sdmTMB models. can also simulate new observations fitted delta model. functions, can specify model simulate using argument model=1 presence/absence, model=2 positive catches, model=NA combined predictions. See simulate.sdmTMB() details simulation options.","code":"pcod_mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 15) fit_dg <- sdmTMB(density ~ 1 + s(depth), data = pcod, mesh = pcod_mesh, time = \"year\", family = delta_gamma() ) print(fit_dg) #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 1 + s(depth) #> Mesh: pcod_mesh (isotropic covariance) #> Time column: year #> Data: pcod #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.34 0.61 #> sdepth 1.28 2.91 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 14.38 #> #> Matérn range: 61.41 #> Spatial SD: 1.71 #> Spatiotemporal IID SD: 0.81 #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 3.67 0.12 #> sdepth 0.31 1.29 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 5.41 #> #> Dispersion parameter: 1.03 #> Matérn range: 14.80 #> Spatial SD: 0.69 #> Spatiotemporal IID SD: 1.45 #> #> ML criterion at convergence: 6126.400 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
tidy(fit_dg) # model = 1 is default #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.343 0.614 tidy(fit_dg, model = 1) #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.343 0.614 tidy(fit_dg, model = 1, \"ran_pars\", conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 61.4 14.1 39.2 96.3 #> 2 sigma_O 1.71 0.265 1.26 2.32 #> 3 sigma_E 0.806 0.142 0.570 1.14 tidy(fit_dg, model = 2) #> # A tibble: 1 × 3 #> term estimate std.error #> #> 1 (Intercept) 3.67 0.120 tidy(fit_dg, model = 2, \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 14.8 5.02 7.62 28.8 #> 2 phi 1.03 0.0502 0.939 1.14 #> 3 sigma_O 0.691 0.228 0.361 1.32 #> 4 sigma_E 1.45 0.336 0.919 2.28 grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) p <- predict(fit_dg, newdata = grid_yrs) str(p) #> 'data.frame': 65826 obs. of 16 variables: #> $ X : num 456 458 460 462 464 466 468 470 472 474 ... #> $ Y : num 5636 5636 5636 5636 5636 ... #> $ depth : num 347 223 204 183 183 ... #> $ depth_scaled : num 1.561 0.57 0.363 0.126 0.122 ... #> $ depth_scaled2: num 2.4361 0.3246 0.132 0.0158 0.0149 ... #> $ year : int 2003 2003 2003 2003 2003 2003 2003 2003 2003 2003 ... #> $ est1 : num -5.804 -0.796 0.021 0.87 0.98 ... #> $ est2 : num 3.2 3.71 4.02 4.39 4.43 ... #> $ est_non_rf1 : num -5.508 -0.598 0.12 0.871 0.882 ... #> $ est_non_rf2 : num 2.83 3.31 3.59 3.94 3.94 ... #> $ est_rf1 : num -0.296138 -0.197484 -0.09883 -0.000177 0.098477 ... #> $ est_rf2 : num 0.368 0.398 0.428 0.459 0.489 ... #> $ omega_s1 : num -0.00543 0.09317 0.19178 0.29039 0.38899 ... #> $ omega_s2 : num 0.108 0.119 0.129 0.139 0.15 ... #> $ epsilon_st1 : num -0.291 -0.291 -0.291 -0.291 -0.291 ... #> $ epsilon_st2 : num 0.26 0.28 0.299 0.319 0.339 ... p2 <- predict(fit_dg, newdata = grid_yrs, return_tmb_object = TRUE) ind_dg <- get_index(p2, bias_correct = FALSE) visreg_delta(fit_dg, xvar = \"depth\", model = 1, gg = TRUE) #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. visreg_delta(fit_dg, xvar = \"depth\", model = 2, gg = TRUE) simulations <- simulate(fit_dg, nsim = 5, seed = 5090, model = NA)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/delta-models.html","id":"delta-models-by-fitting-two-components-separately-and-combining-predictions","dir":"Articles > Web_only","previous_headings":"","what":"Delta models by fitting two components separately and combining predictions","title":"Fitting delta (hurdle) models with sdmTMB","text":"Next, show example implement delta-gamma model sdmTMB, component fit separately combined. approach gives maximum flexibility model lets develop time. limitations calculating index abundance want share parameters. necessary use mesh models, one can updating first mesh match reduced data frame shown : delta-gamma model similar Tweedie model Intro modelling sdmTMB vignette, except use s() depth effect. One can use different covariates model, case just let depth effect wiggly specifying k = 3. Next, need way combining predictions across two models. need point predictions, can just multiply predictions two models applying inverse link: measure uncertainty required, can simulate joint parameter precision matrix using predict() function number simulations selected (e.g., sims = 500). predictions come simulated draws parameter covariance matrix, predictions become consistent larger number draws. 
However, greater number draws takes longer calculate use memory (larger matrix), fewer draws (~100) may fine experimentation. larger number (say ~1000) may appropriate final model runs. p_combined_sim matrix row row data predicted width nsim. can process matrix however like. can save median predictions upper lower 95% confidence intervals: can calculate spatial uncertainty:","code":"glimpse(pcod) #> Rows: 2,143 #> Columns: 12 #> $ year 2003, 2003, 2003, 2003, 2003, 2003, 2003, 2003, 2003, 20… #> $ X 446.4752, 446.4594, 448.5987, 436.9157, 420.6101, 417.71… #> $ Y 5793.426, 5800.136, 5801.687, 5802.305, 5771.055, 5772.2… #> $ depth 201, 212, 220, 197, 256, 293, 410, 387, 285, 270, 381, 1… #> $ density 113.138476, 41.704922, 0.000000, 15.706138, 0.000000, 0.… #> $ present 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0,… #> $ lat 52.28858, 52.34890, 52.36305, 52.36738, 52.08437, 52.094… #> $ lon -129.7847, -129.7860, -129.7549, -129.9265, -130.1586, -… #> $ depth_mean 5.155194, 5.155194, 5.155194, 5.155194, 5.155194, 5.1551… #> $ depth_sd 0.4448783, 0.4448783, 0.4448783, 0.4448783, 0.4448783, 0… #> $ depth_scaled 0.3329252, 0.4526914, 0.5359529, 0.2877417, 0.8766077, 1… #> $ depth_scaled2 0.11083919, 0.20492947, 0.28724555, 0.08279527, 0.768440… mesh1 <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 20) # coarse for vignette speed dat2 <- subset(pcod, density > 0) mesh2 <- make_mesh(dat2, xy_cols = c(\"X\", \"Y\"), mesh = mesh1$mesh ) m1 <- sdmTMB( formula = present ~ 0 + as.factor(year) + s(depth, k = 3), data = pcod, mesh = mesh1, time = \"year\", family = binomial(link = \"logit\"), spatiotemporal = \"iid\", spatial = \"on\" ) m1 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: present ~ 0 + as.factor(year) + s(depth, k = 3) #> Mesh: mesh1 (isotropic covariance) #> Time column: year #> Data: pcod #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> as.factor(year)2003 -0.76 0.42 #> as.factor(year)2004 -0.40 0.42 #> as.factor(year)2005 -0.42 0.42 #> as.factor(year)2007 -1.40 0.42 #> as.factor(year)2009 -1.16 0.42 #> as.factor(year)2011 -1.56 0.42 #> as.factor(year)2013 -0.38 0.42 #> as.factor(year)2015 -0.65 0.42 #> as.factor(year)2017 -1.56 0.42 #> sdepth -5.66 0.50 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 85.9 #> #> Matérn range: 28.70 #> Spatial SD: 1.89 #> Spatiotemporal IID SD: 0.90 #> ML criterion at convergence: 1054.414 #> #> See ?tidy.sdmTMB to extract these values as a data frame. m2 <- sdmTMB( formula = density ~ 0 + as.factor(year) + s(depth), data = dat2, mesh = mesh2, time = \"year\", family = Gamma(link = \"log\"), spatiotemporal = \"iid\", spatial = \"on\" ) m2 #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) + s(depth) #> Mesh: mesh2 (isotropic covariance) #> Time column: year #> Data: dat2 #> Family: Gamma(link = 'log') #> #> coef.est coef.se #> as.factor(year)2003 4.06 0.20 #> as.factor(year)2004 4.22 0.20 #> as.factor(year)2005 4.19 0.20 #> as.factor(year)2007 3.38 0.20 #> as.factor(year)2009 3.71 0.21 #> as.factor(year)2011 4.51 0.21 #> as.factor(year)2013 4.02 0.19 #> as.factor(year)2015 4.13 0.20 #> as.factor(year)2017 3.84 0.22 #> sdepth -0.28 0.36 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 1.89 #> #> Dispersion parameter: 0.94 #> Matérn range: 0.01 #> Spatial SD: 727.31 #> Spatiotemporal IID SD: 2064.97 #> ML criterion at convergence: 5102.136 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! 
Check output of sanity().** pred <- grid_yrs # use the grid as template for saving our predictions p_bin <- predict(m1, newdata = grid_yrs) p_pos <- predict(m2, newdata = grid_yrs) p_bin_prob <- m1$family$linkinv(p_bin$est) p_pos_exp <- m2$family$linkinv(p_pos$est) pred$est_exp <- p_bin_prob * p_pos_exp set.seed(28239) p_bin_sim <- predict(m1, newdata = grid_yrs, nsim = 100) p_pos_sim <- predict(m2, newdata = grid_yrs, nsim = 100) p_bin_prob_sim <- m1$family$linkinv(p_bin_sim) p_pos_exp_sim <- m2$family$linkinv(p_pos_sim) p_combined_sim <- p_bin_prob_sim * p_pos_exp_sim pred$median <- apply(p_combined_sim, 1, median) plot(pred$est_exp, pred$median) ggplot(subset(pred, year == 2017), aes(X, Y, fill = median)) + geom_raster() + coord_fixed() + scale_fill_viridis_c(trans = \"sqrt\") pred$cv <- apply(p_combined_sim, 1, function(x) sd(x) / mean(x)) ggplot(subset(pred, year == 2017), aes(X, Y, fill = cv)) + # 2017 as an example geom_raster() + coord_fixed() + scale_fill_viridis_c(trans = \"log10\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/ggeffects.html","id":"example-with-pacific-cod-presence","dir":"Articles > Web_only","previous_headings":"","what":"Example with Pacific cod presence","title":"Visualizing marginal effects in sdmTMB models with ggeffects","text":"start, use Pacific cod example data. fit model fish presence/absence covariates depth fixed effect year using Tweedie distribution. can use ggeffects::ggeffect() see effect depth probability Pacific cod present. can control range interval depths predicted within function (e.g. [0:500 =1]). can also plot effects year. can add data points can also use ggeffect plot multiple variables listing terms = c(), first term listed indicating variable plotted x-axis, remaining listed terms (four total) indicating groups. Adding facet = TRUE show year separate plot, instead overlain one plot. Adding facet = TRUE show year separate plot, instead overlain one plot. can also use make ggplot plot calling ggeffects object dat data frame. Plotting using continuous response (density) rather presence-similar. instance: can fit model interaction two continuous variables: plotting two continuous variables, ggeffect() make non-target (2nd) variable discrete selecting different levels. specify levels rather letting ggeffect() choose , use brackets selected values within term list, instance","code":"pcod$fyear <- as.factor(pcod$year) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 20) fit <- sdmTMB(present ~ poly(depth, 2) + fyear, data = pcod, mesh = mesh, spatial = \"on\", family = binomial() ) g <- ggeffect(fit, \"depth [0:500 by=1]\") plot(g) g2 <- ggeffect(fit, \"fyear\") plot(g2) plot(g, add.data = TRUE) #> Warning: Argument `add.data` is deprecated and will be removed in the future. #> Please use `show_data` instead. #> Data points may overlap. Use the `jitter` argument to add some amount of #> random variation to the location of data points and avoid overplotting. 
dat <- ggeffect(fit, terms = c(\"depth [0:350 by=5]\", \"fyear\")) plot(dat) plot(dat, facet = TRUE) ggplot(dat, aes(x, predicted, colour = group)) + geom_line() g5 <- ggeffect(fit3, terms = c(\"depth_scaled [-3:2.7 by=0.01]\", \"numeric_year\")) plot(g5) plot(g5, facet = TRUE) g6 <- ggeffect(fit3, terms = c(\"depth_scaled [-3:2.7 by=0.01]\", \"numeric_year [1,7,15]\")) plot(g6)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"introduction","dir":"Articles > Web_only","previous_headings":"","what":"Introduction","title":"Poisson-link delta models","text":"Jim Thorson introduced Poisson-link delta model CJFAS paper (Thorson 2018). ’s useful model family dealing positive continuous data zeroes. can used circumstances use Tweedie traditional delta-gamma delta-lognormal model logit log links. Poisson-link delta formulation describes go linear predictors link space expected values entered data likelihood. likelihood first component Bernoulli (.e., binomial single trials) likelihood second component can gamma lognormal. sdmTMB, families specified delta_gamma(type = \"poisson-link\") delta_lognormal(type = \"poisson-link\").","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"the-model","dir":"Articles > Web_only","previous_headings":"","what":"The model","title":"Poisson-link delta models","text":"Thorson (2018) describes process model : \\[ \\begin{aligned} p &= 1 - \\exp(-n),\\\\ r &= \\frac{n w}{p}, \\end{aligned} \\] \\(p\\) represents encounter probability, \\(n\\) represents group numbers density, \\(r\\) represents positive catch rate (.e., catch rate given encounter), \\(w\\) represents weight per group. linear predictors represent log group numbers density (\\(\\log (n)\\)) log weight per group (\\(\\log (w)\\)): \\[ \\begin{aligned} \\log (n) &= \\boldsymbol{X_1} \\boldsymbol{\\beta_1} + \\ldots,\\\\ \\log (w) &= \\boldsymbol{X_2} \\boldsymbol{\\beta_2} + \\ldots, \\end{aligned} \\] \\(\\boldsymbol{X}\\)s represent design matrices, \\(\\boldsymbol{\\beta}\\)s represent coefficient vectors, \\(\\ldots\\) represents possible model components linear link space including random fields. transformed process equations generate \\(p\\) \\(r\\). model evaluates following likelihood components. encounter probability \\(p\\) \\[ \\operatorname{Bernoulli} (p), \\] equivalently \\[ \\operatorname{Binomial} (1, p), \\] positive rate \\(r\\), either gamma lognormal likelihood \\[ \\operatorname{Gamma} \\left( \\phi, \\frac{r}{\\phi} \\right), \\] \\(\\phi\\) represents gamma shape \\(r / \\phi\\) represents scale, \\[ \\operatorname{Lognormal} \\left( \\log (r) - \\frac{\\phi^2}{2}, \\phi^2 \\right), \\] \\(\\log (r) - \\frac{\\phi^2}{2}\\) represents bias-corrected log mean \\(\\phi\\) represents standard deviation log-space.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"data","dir":"Articles > Web_only","previous_headings":"An example","what":"Data","title":"Poisson-link delta models","text":"work Pacific Spiny Dogfish observations trawl survey west coast Vancouver Island, Canada. dataset dogfish contained within sdmTMB package.","code":"library(ggplot2) theme_set(theme_light()) library(dplyr) library(sdmTMB) dogfish$log_depth <- log(dogfish$depth) head(dogfish) #> # A tibble: 6 × 10 #> year longitude latitude X Y present catch_weight area_swept depth #> #> 1 2004 -125. 48.7 780. 5399. 1 22.7 0.103 73 #> 2 2004 -126. 48.2 735. 5346. 
0 0 0.103 455 #> 3 2004 -126. 48.3 738. 5355. 0 0 0.116 171 #> 4 2004 -126. 48.3 749. 5354. 1 221. 0.122 137 #> 5 2004 -126. 48.4 744. 5362. 1 440. 0.0964 140 #> 6 2004 -126. 48.4 737. 5362. 1 48.2 0.122 145 #> # ℹ 1 more variable: log_depth mesh <- make_mesh(dogfish, c(\"X\", \"Y\"), cutoff = 10)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"a-conventional-delta-gamma-model","dir":"Articles > Web_only","previous_headings":"An example","what":"A conventional delta-gamma model:","title":"Poisson-link delta models","text":"First, lets fit conventional delta-gamma model logit log links illustrate limitation. can show effect depth catch weight? one curve, two components use different links (logit + log), different catch-weight depth relationship depending expected catch weight coefficients (year also given point space spatial random field). example, curves look different years: one curve! depends value year every point space different random field value.","code":"fit_dg <- sdmTMB(catch_weight ~ 0 + as.factor(year) + s(log_depth), family = delta_gamma(), spatial = \"on\", mesh = mesh, data = dogfish, anisotropy = FALSE, reml = TRUE, offset = log(dogfish$area_swept), silent = FALSE ) sanity(fit_dg) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameters don't look unreasonably large nd <- expand.grid( log_depth = seq(min(dogfish$log_depth), max(dogfish$log_depth), length.out = 200), year = as.factor(unique(dogfish$year)) ) p <- predict(fit_dg, newdata = nd, re_form = NA) ggplot(p, aes(log_depth, log(plogis(est1) * exp(est2)), colour = year)) + geom_line() + ylab(\"log expected catch weight\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"a-poisson-link-delta-gamma-alternative","dir":"Articles > Web_only","previous_headings":"An example","what":"A Poisson-link-delta-gamma alternative","title":"Poisson-link delta models","text":"Instead, let’s fit Poisson-link delta-gamma model: make plot: Note lines now parallel. 
predictors shift curve affect shape combined prediction use log link.","code":"fit_dpg <- sdmTMB(catch_weight ~ 0 + as.factor(year) + s(log_depth), family = delta_gamma(type = \"poisson-link\"), spatial = \"on\", mesh = mesh, data = dogfish, anisotropy = TRUE, reml = TRUE, offset = log(dogfish$area_swept), silent = FALSE ) sanity(fit_dpg) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameters don't look unreasonably large summary(fit_dpg) #> Spatial model fit by REML ['sdmTMB'] #> Formula: catch_weight ~ 0 + as.factor(year) + s(log_depth) #> Mesh: mesh (anisotropic covariance) #> Data: dogfish #> Family: delta_gamma(link1 = 'log', link2 = 'log', type = 'poisson-link') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'log') #> coef.est coef.se #> as.factor(year)2004 1.96 0.77 #> as.factor(year)2006 2.89 0.77 #> as.factor(year)2008 2.57 0.76 #> as.factor(year)2010 2.43 0.77 #> as.factor(year)2012 1.94 0.76 #> as.factor(year)2014 1.93 0.76 #> as.factor(year)2016 2.20 0.76 #> as.factor(year)2018 2.03 0.76 #> as.factor(year)2021 1.18 0.76 #> as.factor(year)2022 1.75 0.76 #> slog_depth 0.15 0.54 #> #> Smooth terms: #> Std. Dev. #> sds(log_depth) 2.38 #> #> Matérn anisotropic range (spatial): 26.1 to 381.1 at 142 deg. #> Spatial SD: 1.18 #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> as.factor(year)2004 3.59 0.37 #> as.factor(year)2006 2.78 0.33 #> as.factor(year)2008 2.47 0.32 #> as.factor(year)2010 3.50 0.34 #> as.factor(year)2012 2.77 0.33 #> as.factor(year)2014 2.45 0.33 #> as.factor(year)2016 2.09 0.32 #> as.factor(year)2018 2.60 0.32 #> as.factor(year)2021 1.69 0.33 #> as.factor(year)2022 1.98 0.33 #> slog_depth -0.20 0.82 #> #> Smooth terms: #> Std. Dev. #> sds(log_depth) 5.09 #> #> Dispersion parameter: 0.59 #> Matérn anisotropic range (spatial): 4.2 to 60.8 at 142 deg. #> Spatial SD: 1.97 #> #> REML criterion at convergence: 5851.613 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> See ?plot_anisotropy to plot the anisotropic range. p_dpg <- predict(fit_dpg, newdata = nd, re_form = NA) ggplot(p_dpg, aes(log_depth, est1 + est2, colour = year)) + geom_line() + ylab(\"log expected catch weight\")"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"examining-the-model-components-and-how-they-combine","dir":"Articles > Web_only","previous_headings":"An example","what":"Examining the model components and how they combine","title":"Poisson-link delta models","text":"’ll make predictions across depths single year simplicity: can extract components theoretical deconstruction catch group numbers density, weight per group, encounter probability, positive rate: can come overall predictions two ways: n * w: group numbers \\(\\times\\) weight per group. equivalently exp(est1 + est2), est1 est2 linear predictors link space. p * r: encounter probability \\(\\times\\) positive catch rate. 
give identical answers: Let’s plot components possible combinations:","code":"nd2010 <- dplyr::filter(nd, year == 2010) p_pdg <- predict(fit_dpg, newdata = nd2010, re_form = NA) n <- exp(p_pdg$est1) w <- exp(p_pdg$est2) p <- 1 - exp(-n) r <- (n * w) / p lims <- c(0, max(p * r)) plot(n * w, p * r, xlim = lims, ylim = lims) abline(0, 1) g1 <- ggplot(p_pdg, aes(log_depth, n)) + geom_line() + ggtitle(\"Expected group density\") g2 <- ggplot(p_pdg, aes(log_depth, w)) + geom_line() + ggtitle(\"Expected weight per group\") g3 <- ggplot(p_pdg, aes(log_depth, p)) + geom_line() + ylim(0, 1) + ggtitle(\"Expected encounter probability\") g4 <- ggplot(p_pdg, aes(log_depth, r)) + geom_line() + ggtitle(\"Expected catch given encounter\") g5 <- ggplot(p_pdg, aes(log_depth, n * w)) + geom_line() + ggtitle(\"Expected catch\") g6 <- ggplot(p_pdg, aes(log_depth, p * r)) + geom_line() + ggtitle(\"Expected catch\") cowplot::plot_grid(g1, g2, g3, g4, g5, g6, ncol = 2)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-is-a-group-and-how-does-the-model-know-about-numbers","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What is a ‘group’ and how does the model know about numbers!?","title":"Poisson-link delta models","text":"model represents process groups fish (observations, course) encountered certain numbers (“group numbers density”) group certain weight (“weight per group”). theoretical construct reflecting model can best parse two components according properties observed data. Since predicted response value \\(n \\cdot w\\), can get value multiplying \\(n\\) dividing \\(w\\) amount. numbers less weight per group fewer numbers weight per group get place. However, balance two, given covariates random effects, best fits data entered likelihood \\(p\\) \\(r\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-is-the-connection-to-the-complementary-log-log-link-cloglog-and-the-poisson","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What is the connection to the complementary log-log link (‘cloglog’) and the Poisson?","title":"Poisson-link delta models","text":"Wikipedia: “cloglog model corresponds applications observe either zero events (e.g., defects) one , number events assumed follow Poisson distribution.” following based around entry. Poisson assumes probability observing zero given expected number \\(n\\) \\(\\exp(-n)\\). .e., \\[ \\mathrm{Pr}(0) = \\exp(-n) \\] \\(p\\) probability observing non-zero (.e., encounter), \\[ \\exp(-n) = 1 - p = \\mathrm{Pr}(0). \\] can re-arrange \\[ n = -\\log(1 - p). \\] since want linear predictor take values negative positive infinity (therefore keep predicted number densities positive exponentiating ), work log space: \\[ \\log (n) = \\log(-\\log(1 - p)). \\] right side known ‘cloglog’ link function. link following inverse link function: \\[ p = 1 - \\exp(-\\exp(\\log n)). \\] can check R: can see cloglog inverse link first part Poisson-link delta model. However, ‘trick’ group density \\(n\\) affects encounter probability \\(p\\) (shown ) positive catch rates \\(r\\): \\[ r = \\frac{n w}{p}. 
\\] , going linear predictors (log links) expected values going data likelihood (binomial + lognormal/gamma), first linear predictor plays double duty appears delta model expected values.","code":"p <- 0.78 log_n <- log(-log(1 - p)) log_n #> [1] 0.4148395 1 - exp(-exp(log_n)) #> [1] 0.78"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"what-are-the-advantages-of-using-such-a-model","dir":"Articles > Web_only","previous_headings":"FAQ","what":"What are the advantages of using such a model?","title":"Poisson-link delta models","text":"poisson-link families flexibility delta model, practice often helpful better predictions. checked AIC, better yet, cross validation. Like delta models, decomposition process two theoretical parts. can help interpretation (can make things complicated interpret!). , parts can rearranged represent two ways thinking two components (\\(n\\) \\(w\\) \\(p\\) \\(r\\)). Compared traditional delta-gamma model logit log link, two linear predictors log links, coefficients interpretable multiplicative effects, effects can combined. E.g., coefficients can added log space predictions can added log space generate single response curve given predictor.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"where-does-the-offset-come-in","dir":"Articles > Web_only","previous_headings":"FAQ","what":"Where does the offset come in?","title":"Poisson-link delta models","text":"offset added linear predictor \\(\\log n\\), .e., exp(offset) * n. example \\(\\) represents area swept (measure effort) log area swept entered model ‘offset’, , \\[ p = 1 - \\exp(-\\cdot n). \\] Therefore, encounter probability goes 1 area swept (effort) underlying group numbers density increase towards infinity. gets carried positive rate \\(r\\) via \\(p\\) \\(r = nw/p\\).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"why-call-it-a-poisson-link","dir":"Articles > Web_only","previous_headings":"FAQ","what":"Why call it a Poisson-link?","title":"Poisson-link delta models","text":"can guess. cloglog function derived theory Poisson although cloglog function appears ‘Poisson-link’ model, interaction two model components makes different pure cloglog assume needed another name.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/poisson-link.html","id":"these-equations-dont-look-anything-like-the-sdmtmb-source-code","dir":"Articles > Web_only","previous_headings":"FAQ","what":"These equations don’t look anything like the sdmTMB source code!","title":"Poisson-link delta models","text":"’s calculations done log space computational stability calculations fastest stable done slightly differently easily described equations .","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"residual-checking-with-worked-examples","dir":"Articles > Web_only","previous_headings":"","what":"Residual checking with worked examples","title":"Residual checking with sdmTMB","text":"start data simulated scratch. simulate NB2 negative binomial observation model, spatial random field, intercept, one predictor named ‘a1’ linear effect observed data. Next, fit model configurations various families predictors. first model use Poisson instead NB2. 2nd model match simulated data. third model missing ‘a1’ predictor. ’ll use penalized complexity (PC) prior Matérn parameters aid estimation. 
can see just looking fits Poisson model inflates spatial random field standard deviation (SD) compared truth. model missing ‘a1’ predictor lesser degree.","code":"library(sdmTMB) set.seed(1) predictor_dat <- data.frame(X = runif(1000), Y = runif(1000), a1 = rnorm(1000)) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, mesh = mesh, family = nbinom2(link = \"log\"), phi = 0.4, range = 0.4, sigma_O = 0.4, seed = 1, B = c(0.2, 0.8) # B0 = intercept, B1 = a1 slope ) pc <- pc_matern(range_gt = 0.1, sigma_lt = 1) fit_pois <- sdmTMB(observed ~ 1 + a1, data = dat, family = poisson(), mesh = mesh, priors = sdmTMBpriors(matern_s = pc)) fit_pois #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 + a1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: poisson(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.27 0.17 #> a1 0.82 0.02 #> #> Matérn range: 0.12 #> Spatial SD: 1.20 #> ML criterion at convergence: 2887.957 #> #> See ?tidy.sdmTMB to extract these values as a data frame. fit_nb2 <- update(fit_pois, family = nbinom2()) fit_nb2 #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 + a1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.52 0.14 #> a1 0.75 0.06 #> #> Dispersion parameter: 0.41 #> Matérn range: 0.29 #> Spatial SD: 0.42 #> ML criterion at convergence: 1735.452 #> #> See ?tidy.sdmTMB to extract these values as a data frame. fit_nb2_miss <- update(fit_nb2, formula. = observed ~ 1) fit_nb2_miss #> Spatial model fit by ML ['sdmTMB'] #> Formula: observed ~ 1 #> Mesh: mesh (isotropic covariance) #> Data: dat #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 0.77 0.14 #> #> Dispersion parameter: 0.30 #> Matérn range: 0.21 #> Spatial SD: 0.56 #> ML criterion at convergence: 1817.332 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"analytical-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"Analytical randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"randomized quantile residuals fixed effect MLEs (Maximum Likelihood Estimates) random effects taken single sample approximate distribution (details ): use randomized quantile approach Dunn Smyth (1996). also known PIT (probability-integral-transform) residuals. apply randomization integer response values, transform residuals using distribution function (e.g., pnorm()) reflect uniform(0, 1) distribution, transform values normal(0, 1) consistent model. 
can see source code https://github.com/pbs-assess/sdmTMB/blob/master/R/residuals.R can see likely issues Poisson model tails.","code":"set.seed(123) rq_res <- residuals(fit_pois, type = \"mle-mvn\") rq_res <- rq_res[is.finite(rq_res)] # some Inf qqnorm(rq_res);abline(0, 1) set.seed(123) rq_res <- residuals(fit_nb2, type = \"mle-mvn\") qqnorm(rq_res);abline(0, 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"mcmc-based-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"MCMC-based randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"approach assumes observation random effects can approximated multivariate normal distribution. want relax assumption, can sample random effects MCMC fixed effects held MLEs. sdmTMBextra::predict_mle_mcmc() function sdmTMBextra.","code":"set.seed(123) samps <- sdmTMBextra::predict_mle_mcmc(fit_nb2, mcmc_iter = 800, mcmc_warmup = 400) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.000971 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 9.71 seconds. #> Chain 1: Adjust your expectations accordingly! #> Chain 1: #> Chain 1: #> Chain 1: Iteration: 1 / 800 [ 0%] (Warmup) #> Chain 1: Iteration: 80 / 800 [ 10%] (Warmup) #> Chain 1: Iteration: 160 / 800 [ 20%] (Warmup) #> Chain 1: Iteration: 240 / 800 [ 30%] (Warmup) #> Chain 1: Iteration: 320 / 800 [ 40%] (Warmup) #> Chain 1: Iteration: 400 / 800 [ 50%] (Warmup) #> Chain 1: Iteration: 401 / 800 [ 50%] (Sampling) #> Chain 1: Iteration: 480 / 800 [ 60%] (Sampling) #> Chain 1: Iteration: 560 / 800 [ 70%] (Sampling) #> Chain 1: Iteration: 640 / 800 [ 80%] (Sampling) #> Chain 1: Iteration: 720 / 800 [ 90%] (Sampling) #> Chain 1: Iteration: 800 / 800 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 6.755 seconds (Warm-up) #> Chain 1: 4.818 seconds (Sampling) #> Chain 1: 11.573 seconds (Total) #> Chain 1: mcmc_res <- residuals(fit_nb2, type = \"mle-mcmc\", mcmc_samples = samps) qqnorm(mcmc_res) abline(0, 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"simulation-based-randomized-quantile-residuals","dir":"Articles > Web_only","previous_headings":"Residual checking with worked examples","what":"Simulation-based randomized-quantile residuals","title":"Residual checking with sdmTMB","text":"can also take simulations fitted model use simulation-based randomized quantile residuals: return matrix row represents row data column simulation draw: can look whether fitted models consistent observed number zeros: obviously zeros data simulated Poisson model NB2 model seems reasonable. Plot DHARMa residuals: also return DHARMa object, lets us use DHARMa tools: QQ residual plots clearly see evidence overdispersion compared Poisson. Note values clumping near 1.0 observed axis deviating downwards towards 0.0 observed. indicative many zeros variance scaling rapidly mean (resulting large outlying value) Poisson distribution. Lets try correct model: Everything looks fine. model missing predictor? plot right represents simulated residuals prediction without random effects, just intercept. Lets try plotting residuals missing predictor: can see trend residuals ‘a1’ since missed including model. can also see difference log likelihood AIC: AIC also supports including ‘a1’ predictor. 
help interpreting DHARMa residual plots, see vignette(\"DHARMa\", package=\"DHARMa\").","code":"s_pois <- simulate(fit_pois, nsim = 500, type = \"mle-mvn\") s_nb2_miss <- simulate(fit_nb2_miss, nsim = 500, type = \"mle-mvn\") s_nb2 <- simulate(fit_nb2, nsim = 500, type = \"mle-mvn\") dim(s_pois) #> [1] 1000 500 sum(dat$observed == 0) / length(dat$observed) #> [1] 0.527 sum(s_pois == 0)/length(s_pois) #> [1] 0.292788 sum(s_nb2 == 0)/length(s_nb2) #> [1] 0.524644 dharma_residuals(s_pois, fit_pois) r_pois <- dharma_residuals(s_pois, fit_pois, return_DHARMa = TRUE) plot(r_pois) DHARMa::testResiduals(r_pois) #> $uniformity #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: simulationOutput$scaledResiduals #> D = 0.23338, p-value < 2.2e-16 #> alternative hypothesis: two-sided #> #> #> $dispersion #> #> DHARMa nonparametric dispersion test via sd of residuals fitted vs. #> simulated #> #> data: simulationOutput #> dispersion = 8.6989, p-value < 2.2e-16 #> alternative hypothesis: two.sided #> #> #> $outliers #> #> DHARMa outlier test based on exact binomial test with approximate #> expectations #> #> data: simulationOutput #> outliers at both margin(s) = 111, observations = 1000, p-value < #> 2.2e-16 #> alternative hypothesis: true probability of success is not equal to 0.003992016 #> 95 percent confidence interval: #> 0.09219791 0.13212606 #> sample estimates: #> frequency of outliers (expected: 0.00399201596806387 ) #> 0.111 #> $uniformity #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: simulationOutput$scaledResiduals #> D = 0.23338, p-value < 2.2e-16 #> alternative hypothesis: two-sided #> #> #> $dispersion #> #> DHARMa nonparametric dispersion test via sd of residuals fitted vs. #> simulated #> #> data: simulationOutput #> dispersion = 8.6989, p-value < 2.2e-16 #> alternative hypothesis: two.sided #> #> #> $outliers #> #> DHARMa outlier test based on exact binomial test with approximate #> expectations #> #> data: simulationOutput #> outliers at both margin(s) = 111, observations = 1000, p-value < #> 2.2e-16 #> alternative hypothesis: true probability of success is not equal to 0.003992016 #> 95 percent confidence interval: #> 0.09219791 0.13212606 #> sample estimates: #> frequency of outliers (expected: 0.00399201596806387 ) #> 0.111 DHARMa::testSpatialAutocorrelation(r_pois, x = dat$X, y = dat$Y) #> #> DHARMa Moran's I test for distance-based autocorrelation #> #> data: r_pois #> observed = -0.0022978, expected = -0.0010010, sd = 0.0026264, p-value = #> 0.6215 #> alternative hypothesis: Distance-based autocorrelation DHARMa::testZeroInflation(r_pois) #> #> DHARMa zero-inflation test via comparison to expected zeros with #> simulation under H0 = fitted model #> #> data: simulationOutput #> ratioObsSim = 1.7999, p-value < 2.2e-16 #> alternative hypothesis: two.sided r_nb2 <- dharma_residuals(s_nb2, fit_nb2, return_DHARMa = TRUE) plot(r_nb2) DHARMa::testZeroInflation(r_nb2) #> #> DHARMa zero-inflation test via comparison to expected zeros with #> simulation under H0 = fitted model #> #> data: simulationOutput #> ratioObsSim = 1.0045, p-value = 0.908 #> alternative hypothesis: two.sided r_nb2_miss <- dharma_residuals(s_nb2_miss, fit_nb2_miss, return_DHARMa = TRUE) plot(r_nb2_miss) DHARMa::plotResiduals(r_nb2_miss, form = dat$a1) AIC(fit_nb2_miss, fit_nb2) #> df AIC #> fit_nb2_miss 4 3642.665 #> fit_nb2 5 3480.904"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"the-need-for-one-sample-residuals","dir":"Articles > 
Web_only","previous_headings":"","what":"The need for one-sample residuals","title":"Residual checking with sdmTMB","text":"used random effects drawn observed . approach described Waagepetersen (2006) summarized nicely unexported function oneSamplePosterior within TMB. Thygesen et al. (2017) also describes context one-sample MCMC residuals. show necessary. ’ll start simulating data Gaussian observation error spatial spatiotemporal random effects. use empirical Bayes (EB) random effect values (values random effects maximize log likelihood conditional estimated fixed effects), residuals look even though model perfectly matched simulated data: Indeed, test (incorrectly) rejects null hypothesis \\(r_1 \\sim \\operatorname{N}(0, 1)\\), calculated correctly know come \\(\\operatorname{N}(0, 1)\\) instead returned single sample assumed MVN random effect distribution, get ‘correct’ residuals: , (correctly) fail reject hypothesis \\(r_2 \\sim \\operatorname{N}(0, 1)\\). also sample observation random effects using MCMC (fixed effects still held MLEs), relaxes assumptions, much time intensive large models. gets us something similar (correctly) fail reject hypothesis \\(r_3 \\sim \\operatorname{N}(0, 1)\\). similar issue applies simulation-based quantile residuals, implemented DHARMa package. Instead can use draw random effects ‘posterior’ assuming MVN distribution. now looks correct. However, happens sample random effects simulation? get back something incorrect distribution comparison! , need single random effects sample per set simulations.","code":"set.seed(123) predictor_dat <- data.frame( X = runif(1000), Y = runif(1000), year = rep(1:5, each = 200) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) sim_dat <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.3, sigma_E = 0.3, phi = 0.1, sigma_O = 0.4, seed = 1, B = 0.2 # intercept ) fit <- sdmTMB(observed ~ 1, data = sim_dat, time = \"year\", mesh = mesh) set.seed(1) r1 <- residuals(fit, type = \"mle-eb\") qqnorm(r1);abline(0, 1) ks.test(r1, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r1 #> D = 0.058629, p-value = 0.002067 #> alternative hypothesis: two-sided set.seed(1) r2 <- residuals(fit, type = \"mle-mvn\") qqnorm(r2);abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.020639, p-value = 0.7879 #> alternative hypothesis: two-sided samp <- sdmTMBextra::predict_mle_mcmc(fit, mcmc_iter = 400, mcmc_warmup = 200) #> #> SAMPLING FOR MODEL 'tmb_generic' NOW (CHAIN 1). #> Chain 1: #> Chain 1: Gradient evaluation took 0.002067 seconds #> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 20.67 seconds. #> Chain 1: Adjust your expectations accordingly! 
#> Chain 1: #> Chain 1: #> Chain 1: Iteration: 1 / 400 [ 0%] (Warmup) #> Chain 1: Iteration: 40 / 400 [ 10%] (Warmup) #> Chain 1: Iteration: 80 / 400 [ 20%] (Warmup) #> Chain 1: Iteration: 120 / 400 [ 30%] (Warmup) #> Chain 1: Iteration: 160 / 400 [ 40%] (Warmup) #> Chain 1: Iteration: 200 / 400 [ 50%] (Warmup) #> Chain 1: Iteration: 201 / 400 [ 50%] (Sampling) #> Chain 1: Iteration: 240 / 400 [ 60%] (Sampling) #> Chain 1: Iteration: 280 / 400 [ 70%] (Sampling) #> Chain 1: Iteration: 320 / 400 [ 80%] (Sampling) #> Chain 1: Iteration: 360 / 400 [ 90%] (Sampling) #> Chain 1: Iteration: 400 / 400 [100%] (Sampling) #> Chain 1: #> Chain 1: Elapsed Time: 30.753 seconds (Warm-up) #> Chain 1: 30.123 seconds (Sampling) #> Chain 1: 60.876 seconds (Total) #> Chain 1: #> Warning: The largest R-hat is 1.08, indicating chains have not mixed. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#r-hat #> Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#bulk-ess #> Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable. #> Running the chains for more iterations may help. See #> https://mc-stan.org/misc/warnings.html#tail-ess r3 <- residuals(fit, type = \"mle-mcmc\", mcmc_samples = samp) qqnorm(r3);abline(0, 1) ks.test(r3, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r3 #> D = 0.024522, p-value = 0.5845 #> alternative hypothesis: two-sided set.seed(1) simulate(fit, nsim = 500, type = \"mle-eb\") |> dharma_residuals(fit) #> Warning: It is recommended to use `simulate.sdmTMB(fit, type = 'mle-mvn')` if simulating #> for DHARMa residuals. See the description in ?residuals.sdmTMB under the types #> of residuals section. set.seed(1) simulate(fit, nsim = 500, type = \"mle-mvn\") |> dharma_residuals(fit) set.seed(1) s <- replicate(200, simulate(fit, nsim = 1, type = \"mle-mvn\"), simplify = \"matrix\") attr(s, \"type\") <- \"mle-mvn\" dharma_residuals(s, fit)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-much-this-matters-depends-on-the-ratio-of-observation-error-variance-vs--random-effect-variance","dir":"Articles > Web_only","previous_headings":"","what":"How much this matters depends on the ratio of observation error variance vs. random effect variance","title":"Residual checking with sdmTMB","text":"Lets simulate data large observation error (phi , Gaussian error SD case) smaller levels random field variance (sigma_E sigma_O): Now, doesn’t really matter since ‘incorrect’ random effect distribution swamped observation error effect distribution. 
Technically, the first set is 'wrong' and the second set is 'right', but functionally we would come to a similar conclusion in this case.","code":"set.seed(123) sim_dat2 <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.3, sigma_E = 0.1, # smaller than before sigma_O = 0.1, # smaller than before phi = 0.5, # bigger than before seed = 1, B = 0.2 ) fit2 <- sdmTMB(observed ~ 1, data = sim_dat2, time = \"year\", mesh = mesh) sanity(fit2) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large set.seed(1) r1 <- residuals(fit2, type = \"mle-eb\") qqnorm(r1);abline(0, 1) ks.test(r1, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r1 #> D = 0.021085, p-value = 0.7656 #> alternative hypothesis: two-sided r2 <- residuals(fit2, type = \"mle-mvn\") qqnorm(r2);abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.016869, p-value = 0.9385 #> alternative hypothesis: two-sided"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"notes-on-uniform-vs--normal-quantile-residuals","dir":"Articles > Web_only","previous_headings":"","what":"Notes on uniform vs. normal quantile residuals","title":"Residual checking with sdmTMB","text":"The randomized quantile residuals from residuals.sdmTMB() are returned as normal(0, 1) if the model is consistent with the data. The DHARMa residuals, however, are returned as uniform(0, 1) under the same circumstances. Both are valid and which you use is a matter of preference, but it is important to appreciate how this changes the expected appearance of the residuals. Analytical normal(0, 1): Analytical uniform(0, 1): Simulation-based uniform(0, 1): Simulation-based normal(0, 1): Conclusions: The normal(0, 1) residuals are probably more familiar to most people. The normal(0, 1) residuals put more emphasis on the tails. This is both good and bad: it is easier to examine tail behaviour, but the tails can often look 'off' even when the model is fine (as in this example) because observations in the tails of the distribution are, by definition, rarely observed. Uniform(0, 1) residuals give all data points equal visual weight and emphasize consistency with the overall distribution rather than the tails. Either is valid, but you need to switch your mindset about what you expect to see accordingly. For example, poor tail behaviour may look like a minor issue with uniform residuals; conversely, the tails of normal(0, 1) residuals are unlikely to ever look 'perfect' without large sample sizes.
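To make the relationship between the two scales concrete, here is a minimal stand-alone sketch (an editorial addition using only base R, not code from the vignette):

set.seed(42)
r <- rnorm(300)          # stand-in for normal(0, 1) randomized quantile residuals
u <- pnorm(r)            # the same residuals on the uniform(0, 1) scale, as DHARMa reports them
all.equal(qnorm(u), r)   # TRUE: the two scales carry identical information
hist(r); hist(u)         # bell-shaped vs. flat histograms for a well-fitting model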
","code":"r2 <- residuals(fit2, type = \"mle-mvn\") hist(r2) qqnorm(r2) abline(0, 1) ks.test(r2, pnorm) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: r2 #> D = 0.025614, p-value = 0.528 #> alternative hypothesis: two-sided u <- pnorm(r2) hist(u) n <- length(u) m <- seq_len(n) / (n + 1) qqplot(m, u) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.025614, p-value = 0.528 #> alternative hypothesis: two-sided abline(0, 1) set.seed(1) s <- simulate(fit2, nsim = 500, type = \"mle-mvn\") |> dharma_residuals(fit2, return_DHARMa = TRUE) hist(s$scaledResiduals) u <- s$scaledResiduals m <- seq_len(length(u)) / (length(u)+ 1) qqplot(m, u) abline(0, 1) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.026, p-value = 0.5085 #> alternative hypothesis: two-sided set.seed(1) s <- simulate(fit2, nsim = 1500, type = \"mle-mvn\") |> dharma_residuals(fit2, return_DHARMa = TRUE) u <- s$scaledResiduals r <- qnorm(u) qqnorm(r) abline(0, 1) ks.test(u, punif) #> #> Asymptotic one-sample Kolmogorov-Smirnov test #> #> data: u #> D = 0.028, p-value = 0.4131 #> alternative hypothesis: two-sided"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-do-those-randomized-quantile-residuals-work","dir":"Articles > Web_only","previous_headings":"","what":"How do those randomized-quantile residuals work?","title":"Residual checking with sdmTMB","text":"Let's work through a simple example with the gamma distribution: To get quantile residuals, we transform those values to uniform(0, 1) using pgamma() and then (optionally) convert them to normal(0, 1) with qnorm(): This works for any distribution for which we can define the quantile function. For integer values, however, we need to add randomization in an additional step. Let's work with a Poisson sample: The gamma example illustrated how we can use the distribution function (pgamma()) to take a gamma-distributed variable and turn it into a uniform-distributed variable. Now we need the Poisson equivalent: ppois(). The ppois(y) function gives us the cumulative probability density up to the value y. Say we have a Poisson variable with a mean (lambda) of 5 and an observed value of 3. We can calculate the density up to the value 3 as: i.e., this is the same as: So, if we naively apply ppois() to our observed values we will end up with discrete cumulative probabilities that are not useful for comparison with a continuous uniform distribution. Instead, we need to get the probability density up to the value one below the observed value and 'fill in' the values up to the observed value to get the desired uniformly distributed samples.
First, we get the cumulative probability density up to one less than the observed value: Then we can add randomization with runif(), since the expectation is a uniform distribution at this stage, to fill in the values up to the observed value: And we can optionally apply qnorm():","code":"set.seed(1) mu <- rep(2, 500) phi <- 0.5 y <- rgamma(length(mu), shape = phi, scale = mu / phi) u <- pgamma(q = y, shape = phi, scale = mu / phi) hist(u) r <- qnorm(u) hist(r) set.seed(1) mu <- rep(2, 500) y <- rpois(length(mu), mu) lambda <- 5 ppois(3, lambda) #> [1] 0.2650259 dpois(0, lambda) + dpois(1, lambda) + dpois(2, lambda) + dpois(3, lambda) #> [1] 0.2650259 hist(ppois(y, mu)) a <- ppois(y - 1, mu) hist(a) b <- ppois(y, mu) hist(b) u <- runif(n = length(y), min = a, max = b) hist(u) r <- qnorm(u) hist(r)"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"how-do-those-simulation-based-residuals-work","dir":"Articles > Web_only","previous_headings":"","what":"How do those simulation-based residuals work?","title":"Residual checking with sdmTMB","text":"DHARMa uses simulation-based quantile residuals. As a result, we do not need to define the quantile function analytically. So, instead of a line like this: we can simulate from our model repeatedly, see where each observation falls within the simulated values, and get the quantile that way. For example, instead of this: we can do this:","code":"u <- pgamma(q = y, shape = phi, scale = mu / phi) pnorm(2.2, mean = 0.5, sd = 1) #> [1] 0.9554345 set.seed(1) s <- rnorm(1e6, 0.5, 1) # equivalent of simulate()ing from our model mean(s < 2.2) # equivalent of pnorm() #> [1] 0.955308"},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html","id":"references","dir":"Articles > Web_only","previous_headings":"","what":"References","title":"Residual checking with sdmTMB","text":"Dunn, P.K., and Smyth, G.K. 1996. Randomized Quantile Residuals. Journal of Computational and Graphical Statistics 5(3): 236–244. Waagepetersen, R. 2006. A Simulation-Based Goodness-of-Fit Test for Random Effects in Generalized Linear Mixed Models. Scandinavian Journal of Statistics 33(4): 721–731. Thygesen, U.H., Albertsen, C.M., Berg, C.W., Kristensen, K., and Nielsen, A. 2017. Validation of ecological state space models using the Laplace approximation. Environ Ecol Stat 24(2): 317–339.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/articles/web_only/visreg.html","id":"delta-models","dir":"Articles > Web_only","previous_headings":"","what":"Delta models","title":"Visualizing sdmTMB conditional effects using visreg","text":"visreg() can also be used to plot the output of delta models fit with sdmTMB() using code similar to the previous plots, but using the sdmTMB wrapper function visreg_delta() and specifying model = 1 for the encounter (0 vs. non-zero) model or model = 2 for the positive component model (e.g., Gamma or lognormal). For example: Note that for plotting with visreg_delta(), categorical variables like year need to be designated as a factor in the data frame, for example as fyear, rather than in the model formula (e.g., + as.factor(year)).","code":"fit_dg <- sdmTMB( density ~ s(depth_scaled, year, k = 8), data = pcod_2011, mesh = pcod_mesh_2011, spatial = \"off\", # for vignette speed family = delta_gamma() ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, gg = TRUE) #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, gg = TRUE) #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component."},
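To make that note about factors concrete, a minimal sketch (the column name fyear is just an example, not required by the package):

pcod_2011$fyear <- as.factor(pcod_2011$year)  # pre-compute the factor column in the data frame
# then use `fyear` (not as.factor(year)) in the formula passed to sdmTMB()
# before plotting with visreg_delta()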
{"path":"https://pbs-assess.github.io/sdmTMB/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Sean C. Anderson. Author, maintainer. Eric J. Ward. Author. Philina A. English. Author. Lewis A. K. Barnett. Author. James T. Thorson. Author, copyright holder. VAST author. Joe Watson. Contributor. Censored Poisson. Julia Indivero. Contributor. Vignette writing. Jillian C. Dunic. Contributor. Cole C. Monnahan. Contributor, copyright holder. VAST contributor. Mollie Brooks. Contributor, copyright holder. glmmTMB author. Ben Bolker. Contributor, copyright holder. glmmTMB author. Kasper Kristensen. Contributor, copyright holder. TMB/glmmTMB author. Martin Maechler. Contributor, copyright holder. glmmTMB author. Arni Magnusson. Contributor, copyright holder. glmmTMB author. Hans J. Skaug. Contributor, copyright holder. glmmTMB author, SPDE barrier. Anders Nielsen. Contributor, copyright holder. glmmTMB author. Casper Berg. Contributor, copyright holder. glmmTMB author. Koen van Bentham. Contributor, copyright holder. glmmTMB author. Olav Nikolai Breivik. Contributor, copyright holder. SPDE barrier. Simon Wood. Contributor, copyright holder. mgcv: smoother prediction. Paul-Christian Bürkner. Contributor, copyright holder. brms: smoother matrix parsing. His Majesty the King in Right of Canada, as represented by the Minister of the Department of Fisheries and Oceans. Copyright holder.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Anderson SC, Ward EJ, English PA, Barnett LAK, Thorson JT (2024). “sdmTMB: an R package for fast, flexible, and user-friendly generalized linear mixed effects models with spatial and spatiotemporal random fields.” bioRxiv, 2022.03.24.485545. doi:10.1101/2022.03.24.485545.","code":"@Article{, title = {sdmTMB: an R package for fast, flexible, and user-friendly generalized linear mixed effects models with spatial and spatiotemporal random fields}, author = {Sean C. Anderson and Eric J. Ward and Philina A. English and Lewis A. K. Barnett and James T. Thorson}, year = {2024}, journal = {bioRxiv}, volume = {2022.03.24.485545}, doi = {10.1101/2022.03.24.485545}, }"},{"path":"https://pbs-assess.github.io/sdmTMB/header.html","id":null,"dir":"","previous_headings":"","what":"sdmTMB ","title":"sdmTMB ","text":"Spatial and spatiotemporal GLMMs with TMB. sdmTMB is an R package that fits spatial and spatiotemporal GLMMs (Generalized Linear Mixed Effects Models) using Template Model Builder (TMB), R-INLA, and Gaussian Markov random fields. One common application is species distribution models (SDMs). See the documentation site and the preprint: Anderson, S.C., E.J. Ward, P.A. English, L.A.K. Barnett, J.T. Thorson. 2024. sdmTMB: an R package for fast, flexible, and user-friendly generalized linear mixed effects models with spatial and spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"sdmtmb-","dir":"","previous_headings":"","what":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Spatial and spatiotemporal GLMMs with TMB. sdmTMB is an R package that fits spatial and spatiotemporal GLMMs (Generalized Linear Mixed Effects Models) using Template Model Builder (TMB), R-INLA, and Gaussian Markov random fields. One common application is species distribution models (SDMs). See the documentation site and the preprint: Anderson, S.C., E.J. Ward, P.A. English, L.A.K. Barnett, J.T. Thorson. 2024. sdmTMB: an R package for fast, flexible, and user-friendly generalized linear mixed effects models with spatial and spatiotemporal random fields.
bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"table-of-contents","dir":"","previous_headings":"","what":"Table of contents","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Installation Overview Getting help Citation Related software Basic use Time-varying coefficients Spatially varying coefficients (SVC) Random intercepts Breakpoint threshold effects Simulating data Sampling joint precision matrix Calculating uncertainty spatial predictions Cross validation Priors Bayesian MCMC sampling Stan Turning random fields Using custom fmesher mesh Barrier meshes","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB can installed CRAN: Assuming C++ compiler installed, development version can installed: extra utilities sdmTMBextra package. Importantly, recommended use optimized BLAS library, result major speed improvements TMB () models R (e.g., often 8-fold speed increases sdmTMB models). Suggested installation instructions Mac users, Linux users, Windows users. check ’ve successfully linked optimized BLAS, start new session run: result take fraction second (e.g., 0.03 s), multiple seconds.","code":"install.packages(\"sdmTMB\", dependencies = TRUE) # install.packages(\"remotes\") remotes::install_github(\"pbs-assess/sdmTMB\", dependencies = TRUE) m <- 1e4; n <- 1e3; k <- 3e2 X <- matrix(rnorm(m*k), nrow=m); Y <- matrix(rnorm(n*k), ncol=n) system.time(X %*% Y)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"overview","dir":"","previous_headings":"","what":"Overview","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Analyzing geostatistical data (coordinate-referenced observations underlying spatial process) becoming increasingly common ecology. sdmTMB implements geostatistical spatial spatiotemporal GLMMs using TMB model fitting R-INLA set SPDE (stochastic partial differential equation) matrices. One common application species distribution models (SDMs), hence package name. goal sdmTMB provide fast, flexible, user-friendly interface—similar popular R package glmmTMB—focus spatial spatiotemporal models SPDE approach. extend generalized linear mixed models (GLMMs) familiar ecologists include following optional features: spatial random fields spatiotemporal random fields may independent year modelled random walks autoregressive processes smooth terms covariates, using familiar s() notation mgcv breakpoint (hockey-stick) logistic covariates time-varying covariates (coefficients modelled random walks) spatially varying coefficient models (SVCs) interpolation forecasting missing future time slices wide range families: standard R families plus tweedie(), nbinom1(), nbinom2(), lognormal(), student(), plus truncated censored families delta/hurdle models including delta_gamma(), delta_lognormal(), delta_truncated_nbinom2() Estimation performed sdmTMB via maximum marginal likelihood objective function calculated TMB minimized R via stats::nlminb() random effects integrated via Laplace approximation. sdmTMB package also allows models passed Stan via tmbstan, allowing Bayesian model estimation. See ?sdmTMB ?predict.sdmTMB complete examples. 
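As a generic aside on the estimation approach described above (an editorial illustration; the objective here is a toy function, not an sdmTMB likelihood), stats::nlminb() minimizes an objective function starting from a set of parameter values:

obj_fn <- function(par) sum((par - c(1, 2))^2)          # toy objective with its minimum at (1, 2)
stats::nlminb(start = c(0, 0), objective = obj_fn)$par  # returns values close to 1 and 2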
Also see vignettes (‘Articles’) documentation site preprint appendices linked .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"getting-help","dir":"","previous_headings":"","what":"Getting help","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"questions use sdmTMB interpret models, please post discussion board. email question, likely respond discussion board anonymized version question (without data) think helpful others. Please let us know don’t want us . bugs feature requests, please post issue tracker. Slides recordings workshop sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"cite sdmTMB publications use: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett., J.T. Thorson. 2024. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi: https://doi.org/10.1101/2022.03.24.485545 list (known) publications use sdmTMB can found . Please use citation can track publications.","code":"citation(\"sdmTMB\")"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"related-software","dir":"","previous_headings":"","what":"Related software","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB heavily inspired VAST R package: Thorson, J.T. 2019. Guidance decisions using Vector Autoregressive Spatio-Temporal (VAST) package stock, ecosystem, habitat climate assessments. Fisheries Research 210: 143–161. https://doi.org/10.1016/j.fishres.2018.10.013. glmmTMB R package: Brooks, M.E., Kristensen, K., van Benthem, K.J., Magnusson, ., Berg, C.W., Nielsen, ., Skaug, H.J., Maechler, M., Bolker, B.M. 2017. glmmTMB balances speed flexibility among packages zero-inflated generalized linear mixed modeling. R Journal 9(2): 378–400. https://doi.org/10.32614/rj-2017-066. INLA inlabru can fit many models sdmTMB (many ) approximate Bayesian inference framework. mgcv can fit similar SPDE-based Gaussian random field models code included Miller et al. (2019). table sdmTMB preprint describes functionality timing comparisons sdmTMB, VAST, INLA/inlabru, mgcv discussion makes suggestions might choose one package another.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"basic-use","dir":"","previous_headings":"","what":"Basic use","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB model requires data frame contains response column, columns predictors, columns spatial coordinates. usually makes sense convert spatial coordinates equidistant projection UTMs distance remains constant throughout study region [e.g., using sf::st_transform()]. , illustrate spatial model fit Pacific cod (Gadus macrocephalus) trawl survey data Queen Charlotte Sound, BC, Canada. model contains main effect depth penalized smoother, spatial random field, Tweedie observation error. data frame pcod (built package) column year year survey, density density Pacific cod given survey tow, present whether density > 0, depth depth meters tow, spatial coordinates X Y, UTM coordinates kilometres. start creating mesh object contains matrices apply SPDE approach. , cutoff defines minimum allowed distance points units X Y (km). Alternatively, created mesh via fmesher INLA packages supplied make_mesh(). can inspect mesh object associated plotting method plot(mesh). 
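For instance, a hypothetical variation on the mesh above (a smaller cutoff gives a finer mesh, and therefore slower fits):

mesh_fine <- make_mesh(pcod, xy_cols = c("X", "Y"), cutoff = 5)  # finer than cutoff = 10
plot(mesh_fine)  # inspect the triangulation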
Fit spatial model smoother depth: Print model fit: output indicates model fit maximum (marginal) likelihood (ML). also see formula, mesh, fitted data, family. Next see estimated main effects including linear component smoother (sdepth), standard deviation smoother weights (sds(depth)), Tweedie dispersion power parameters, Matérn range distance (distance points effectively independent), marginal spatial field standard deviation, negative log likelihood convergence. can extract parameters data frame: Run basic sanity checks model: Use ggeffects package plot smoother effect: depth effect parametric penalized smoother, alternatively used ggeffects::ggeffect() fast marginal effect plot. Predict new data: switch presence-absence model changing response column family: hurdle/delta model changing family: instead fit spatiotemporal model specifying time column spatiotemporal structure: wanted create area-weighted standardized population index, predict grid covering entire survey (qcs_grid) grid cell area 4 (2 x 2 km) pass predictions get_index(): center gravity: basic features, see vignettes Intro modelling sdmTMB Index standardization sdmTMB.","code":"library(dplyr) library(ggplot2) library(sdmTMB) head(pcod) #> # A tibble: 3 × 6 #> year density present depth X Y #> #> 1 2003 113. 1 201 446. 5793. #> 2 2003 41.7 1 212 446. 5800. #> 3 2003 0 0 220 449. 5802. mesh <- make_mesh(pcod, xy_cols = c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( density ~ s(depth), data = pcod, mesh = mesh, family = tweedie(link = \"log\"), spatial = \"on\" ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.37 0.21 #> sdepth 0.62 2.53 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.93 #> #> Dispersion parameter: 12.69 #> Tweedie p: 1.58 #> Matérn range: 16.39 #> Spatial SD: 1.86 #> ML criterion at convergence: 6402.136 #> #> See ?tidy.sdmTMB to extract these values as a data frame. tidy(fit, conf.int = TRUE) #> # A tibble: 1 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 2.37 0.215 1.95 2.79 tidy(fit, effects = \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 16.4 4.47 9.60 28.0 #> 2 phi 12.7 0.406 11.9 13.5 #> 3 sigma_O 1.86 0.218 1.48 2.34 #> 4 tweedie_p 1.58 0.00998 1.56 1.60 sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large ggeffects::ggpredict(fit, \"depth [50:400, by=2]\") |> plot() p <- predict(fit, newdata = qcs_grid) head(p) #> # A tibble: 3 × 7 #> X Y depth est est_non_rf est_rf omega_s #> #> 1 456 5636 347. -3.06 -3.08 0.0172 0.0172 #> 2 458 5636 223. 2.03 1.99 0.0460 0.0460 #> 3 460 5636 204. 
2.89 2.82 0.0747 0.0747 ggplot(p, aes(X, Y, fill = exp(est))) + geom_raster() + scale_fill_viridis_c(trans = \"sqrt\") fit <- sdmTMB( present ~ s(depth), data = pcod, mesh = mesh, family = binomial(link = \"logit\") ) fit <- sdmTMB( density ~ s(depth), data = pcod, mesh = mesh, family = delta_gamma(link1 = \"logit\", link2 = \"log\"), ) fit_spatiotemporal <- sdmTMB( density ~ s(depth, k = 5), data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatial = \"off\", spatiotemporal = \"ar1\" ) grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) p_st <- predict(fit_spatiotemporal, newdata = grid_yrs, return_tmb_object = TRUE) index <- get_index(p_st, area = rep(4, nrow(grid_yrs))) ggplot(index, aes(year, est)) + geom_ribbon(aes(ymin = lwr, ymax = upr), fill = \"grey90\") + geom_line(lwd = 1, colour = \"grey30\") + labs(x = \"Year\", y = \"Biomass (kg)\") cog <- get_cog(p_st, format = \"wide\") ggplot(cog, aes(est_x, est_y, colour = year)) + geom_pointrange(aes(xmin = lwr_x, xmax = upr_x)) + geom_pointrange(aes(ymin = lwr_y, ymax = upr_y)) + scale_colour_viridis_c()"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"time-varying-coefficients","dir":"","previous_headings":"Advanced functionality","what":"Time-varying coefficients","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Time-varying intercept: Time-varying (random walk) effect depth: See vignette Intro modelling sdmTMB details.","code":"fit <- sdmTMB( density ~ 0 + s(depth, k = 5), time_varying = ~ 1, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), silent = FALSE # see progress ) fit <- sdmTMB( density ~ 1, time_varying = ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatial = \"off\", spatiotemporal = \"ar1\", silent = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"spatially-varying-coefficients-svc","dir":"","previous_headings":"Advanced functionality","what":"Spatially varying coefficients (SVC)","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Spatially varying effect time: See zeta_s output, represents coefficient varying space. ’ll want ensure set model ballpark mean 0 (e.g., including formula ). 
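A hedged sketch of how the pieces combine (see the spatial trend vignette for the canonical code; `fit` and `p` refer to the model and predictions created in the code below): the local slope on the link scale is approximately the fixed-effect coefficient for year_scaled plus the spatially varying deviation zeta_s.

b <- tidy(fit)                                 # fixed-effect estimates
b_year <- b$estimate[b$term == "year_scaled"]  # main-effect slope
ggplot(p, aes(X, Y, fill = b_year + zeta_s_year_scaled)) +
  geom_raster() + scale_fill_gradient2()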
See vignette Fitting spatial trend models sdmTMB details.","code":"pcod$year_scaled <- as.numeric(scale(pcod$year)) fit <- sdmTMB( density ~ s(depth, k = 5) + year_scaled, spatial_varying = ~ year_scaled, data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\"), spatiotemporal = \"off\" ) grid_yrs <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) grid_yrs$year_scaled <- (grid_yrs$year - mean(pcod$year)) / sd(pcod$year) p <- predict(fit, newdata = grid_yrs) %>% subset(year == 2011) # any year ggplot(p, aes(X, Y, fill = zeta_s_year_scaled)) + geom_raster() + scale_fill_gradient2()"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"random-intercepts","dir":"","previous_headings":"Advanced functionality","what":"Random intercepts","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can use syntax (1 | group) lme4 glmmTMB fit random intercepts:","code":"pcod$year_factor <- as.factor(pcod$year) fit <- sdmTMB( density ~ s(depth, k = 5) + (1 | year_factor), data = pcod, mesh = mesh, time = \"year\", family = tweedie(link = \"log\") )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"breakpoint-and-threshold-effects","dir":"","previous_headings":"Advanced functionality","what":"Breakpoint and threshold effects","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"See vignette Threshold modeling sdmTMB details.","code":"fit <- sdmTMB( present ~ 1 + breakpt(depth_scaled), data = pcod, mesh = mesh, family = binomial(link = \"logit\") ) fit <- sdmTMB( present ~ 1 + logistic(depth_scaled), data = pcod, mesh = mesh, family = binomial(link = \"logit\") )"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"simulating-data-from-scratch","dir":"","previous_headings":"Advanced functionality > Simulating data","what":"Simulating data from scratch","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Fit simulated data: See ?sdmTMB_simulate details.","code":"predictor_dat <- expand.grid( X = seq(0, 1, length.out = 100), Y = seq(0, 1, length.out = 100) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.05) sim_dat <- sdmTMB_simulate( formula = ~ 1, data = predictor_dat, mesh = mesh, family = poisson(link = \"log\"), range = 0.3, sigma_O = 0.4, seed = 1, B = 1 # B0 = intercept ) head(sim_dat) #> # A tibble: 6 × 7 #> X Y omega_s mu eta observed `(Intercept)` #> #> 1 0 0 -0.154 2.33 0.846 1 1 #> 2 0.0101 0 -0.197 2.23 0.803 0 1 #> 3 0.0202 0 -0.240 2.14 0.760 2 1 #> 4 0.0303 0 -0.282 2.05 0.718 2 1 #> 5 0.0404 0 -0.325 1.96 0.675 3 1 #> 6 0.0505 0 -0.367 1.88 0.633 2 1 # sample 200 points for fitting: set.seed(1) sim_dat_obs <- sim_dat[sample(seq_len(nrow(sim_dat)), 200), ] ggplot(sim_dat, aes(X, Y)) + geom_raster(aes(fill = exp(eta))) + # mean without observation error geom_point(aes(size = observed), data = sim_dat_obs, pch = 21) + scale_fill_viridis_c() + scale_size_area() + coord_cartesian(expand = FALSE) mesh <- make_mesh(sim_dat_obs, xy_cols = c(\"X\", \"Y\"), cutoff = 0.05) fit <- sdmTMB( observed ~ 1, data = sim_dat_obs, mesh = mesh, family = poisson() )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"simulating-from-an-existing-fit","dir":"","previous_headings":"Advanced functionality > Simulating data","what":"Simulating from an existing fit","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"See vignette Residual checking sdmTMB, ?simulate.sdmTMB, ?dharma_residuals details.","code":"s <- simulate(fit, nsim = 500) 
dim(s) #> [1] 969 500 s[1:3,1:4] #> [,1] [,2] [,3] [,4] #> [1,] 0 59.40310 83.20888 0.00000 #> [2,] 0 34.56408 0.00000 19.99839 #> [3,] 0 0.00000 0.00000 0.00000"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"sampling-from-the-joint-precision-matrix","dir":"","previous_headings":"Advanced functionality","what":"Sampling from the joint precision matrix","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can take samples implied parameter distribution assuming MVN covariance matrix internal parameterization: See ?gather_sims ?get_index_sims details.","code":"samps <- gather_sims(fit, nsim = 1000) ggplot(samps, aes(.value)) + geom_histogram() + facet_wrap(~.variable, scales = \"free_x\") #> `stat_bin()` using `bins = 30`. Pick better value with `binwidth`."},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"calculating-uncertainty-on-spatial-predictions","dir":"","previous_headings":"Advanced functionality","what":"Calculating uncertainty on spatial predictions","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"fastest way get point-wise prediction uncertainty use MVN samples:","code":"p <- predict(fit, newdata = predictor_dat, nsim = 500) predictor_dat$se <- apply(p, 1, sd) ggplot(predictor_dat, aes(X, Y, fill = se)) + geom_raster() + scale_fill_viridis_c(option = \"A\") + coord_cartesian(expand = FALSE)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"cross-validation","dir":"","previous_headings":"Advanced functionality","what":"Cross validation","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"sdmTMB built-functionality cross-validation. set future::plan(), folds fit parallel: See ?sdmTMB_cv details.","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 10) ## Set parallel processing if desired: # library(future) # plan(multisession) m_cv <- sdmTMB_cv( density ~ s(depth, k = 5), data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Sum of log likelihoods of left-out data: m_cv$sum_loglik #> [1] -6756.28"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"priors","dir":"","previous_headings":"Advanced functionality","what":"Priors","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Priors/penalties can placed parameters. example, place PC (penalized complexity) prior Matérn random field parameters, standard normal prior effect depth, Normal(0, 10^2) prior intercept, half-normal prior Tweedie dispersion parameter (phi): can visualize PC Matérn prior: See ?sdmTMBpriors details.","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( density ~ depth_scaled, data = pcod, mesh = mesh, family = tweedie(), priors = sdmTMBpriors( matern_s = pc_matern(range_gt = 10, sigma_lt = 5), b = normal(c(0, 0), c(1, 10)), phi = halfnormal(0, 15) ) ) plot_pc_matern(range_gt = 10, sigma_lt = 5)"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"bayesian-mcmc-sampling-with-stan","dir":"","previous_headings":"Advanced functionality","what":"Bayesian MCMC sampling with Stan","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"fitted model can passed tmbstan package sample posterior Stan. 
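A rough sketch of that hand-off (see the Bayesian vignette for the full workflow; the `tmb_obj` slot name, the `bayesian = TRUE` refit, and the sampler settings here are assumptions, not verbatim vignette code):

# refit the model with bayesian = TRUE (and ideally priors) so it is set up for MCMC,
# then pass the internal TMB object to tmbstan:
library(tmbstan)
fit_stan <- tmbstan(fit$tmb_obj, iter = 1000, chains = 4)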
See Bayesian vignette.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"turning-off-random-fields","dir":"","previous_headings":"Advanced functionality","what":"Turning off random fields","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"can turn random fields model comparison:","code":"fit_sdmTMB <- sdmTMB( present ~ poly(depth_scaled, 2), data = pcod, mesh = mesh, spatial = \"off\", family = binomial() ) fit_glm <- glm( present ~ poly(depth_scaled, 2), data = pcod, family = binomial() ) tidy(fit_sdmTMB) #> # A tibble: 3 × 3 #> term estimate std.error #> #> 1 (Intercept) -0.426 0.0573 #> 2 poly(depth_scaled, 2)1 -31.7 3.03 #> 3 poly(depth_scaled, 2)2 -66.9 4.09 broom::tidy(fit_glm) #> # A tibble: 3 × 5 #> term estimate std.error statistic p.value #> #> 1 (Intercept) -0.426 0.0573 -7.44 1.03e-13 #> 2 poly(depth_scaled, 2)1 -31.7 3.03 -10.5 1.20e-25 #> 3 poly(depth_scaled, 2)2 -66.9 4.09 -16.4 3.50e-60"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"using-a-custom-fmesher-mesh","dir":"","previous_headings":"Advanced functionality","what":"Using a custom fmesher mesh","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"Defining mesh directly INLA:","code":"bnd <- INLA::inla.nonconvex.hull(cbind(pcod$X, pcod$Y), convex = -0.1) mesh_inla <- INLA::inla.mesh.2d( boundary = bnd, max.edge = c(25, 50) ) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), mesh = mesh_inla) plot(mesh) fit <- sdmTMB( density ~ s(depth, k = 5), data = pcod, mesh = mesh, family = tweedie(link = \"log\") )"},{"path":"https://pbs-assess.github.io/sdmTMB/index.html","id":"barrier-meshes","dir":"","previous_headings":"Advanced functionality","what":"Barrier meshes","title":"Spatial and Spatiotemporal SPDE-Based GLMMs with TMB","text":"barrier mesh limits correlation across barriers (e.g., land water). See add_barrier_mesh() sdmTMBextra.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate effects — Effect.sdmTMB","title":"Calculate effects — Effect.sdmTMB","text":"Used effects package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate effects — Effect.sdmTMB","text":"","code":"Effect.sdmTMB(focal.predictors, mod, ...)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate effects — Effect.sdmTMB","text":"focal.predictors character vector one predictors model order. mod regression model object. specific method exists class mod, Effect.default called. ... arguments passed .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate effects — Effect.sdmTMB","text":"Output effects::effect(). 
Can plotted associated plot() method.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/Effect.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate effects — Effect.sdmTMB","text":"","code":"fit <- sdmTMB(present ~ depth_scaled, data = pcod_2011, family = binomial(), spatial = \"off\") effects::effect(\"depth_scaled\", fit) #> #> depth_scaled effect #> depth_scaled #> -3 -2 -0.1 1 3 #> 0.7511661 0.6634557 0.4673280 0.3544365 0.1897228 plot(effects::effect(\"depth_scaled\", fit))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":null,"dir":"Reference","previous_headings":"","what":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"Moved sdmTMBextra package. Make sure load sdmTMBextra sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"","code":"add_barrier_mesh( spde_obj = deprecated(), barrier_sf = deprecated(), range_fraction = 0.2, proj_scaling = 1, plot = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"spde_obj Output make_mesh(). barrier_sf sf object polygons defining barriers. example, coastline dataset ocean data. Note object must projection data used generate x y columns spde_obj. range_fraction fraction spatial range barrier triangles . proj_scaling spde_obj created scaling coordinates projection (e.g., dividing UTMs 1000 spatial range reasonable scale) x y values spde_obj multiplied scaling factor applying projection barrier_sf. plot Logical.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"Deprecated. See sdmTMBextra package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_barrier_mesh.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Transform a mesh object into a mesh with correlation barriers — add_barrier_mesh","text":"","code":"if (FALSE) { add_barrier_mesh() }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Add UTM coordinates to a data frame — add_utm_columns","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"Add UTM (Universal Transverse Mercator) coordinates data frame. useful since geostatistical modeling generally performed equal-distance projection. 
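For comparison, the manual sf route mentioned next looks roughly like this (the coordinates and the UTM zone 9N EPSG code are illustrative):

library(sf)
d <- data.frame(lon = c(-130.0, -131.4), lat = c(52.1, 53.4))
pts <- st_as_sf(d, coords = c("lon", "lat"), crs = 4326)  # WGS84 longitude/latitude
utm <- st_transform(pts, crs = 32609)                     # UTM zone 9N, metres
cbind(d, st_coordinates(utm) / 1000)                      # X and Y in kilometres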
can separately sf::st_as_sf(), sf::st_transform(), sf::st_coordinates() functions sf package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"","code":"add_utm_columns( dat, ll_names = c(\"longitude\", \"latitude\"), ll_crs = 4326, utm_names = c(\"X\", \"Y\"), utm_crs = get_crs(dat, ll_names), units = c(\"km\", \"m\") ) get_crs(dat, ll_names = c(\"longitude\", \"latitude\"))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"dat Data frame contains longitude latitude columns. ll_names Longitude latitude column names. Note order. ll_crs Input CRS value ll_names. utm_names Output column names UTM columns. utm_crs Output CRS value UTM zone; tries detect get_crs() can specified manually. units UTM units.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"copy input data frame new columns UTM coordinates.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"Note longitudes west prime meridian encoded running -180 0 degrees. may wish work km's rather standard UTM meters range parameter estimate small, can cause computational issues. depends scale data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/add_utm_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Add UTM coordinates to a data frame — add_utm_columns","text":"","code":"d <- data.frame(lat = c(52.1, 53.4), lon = c(-130.0, -131.4)) get_crs(d, c(\"lon\", \"lat\")) #> Detected UTM zone 9N; CRS = 32609. #> Visit https://epsg.io/32609 to verify. #> [1] 32609 add_utm_columns(d, c(\"lon\", \"lat\")) #> Detected UTM zone 9N; CRS = 32609. #> Visit https://epsg.io/32609 to verify. #> lat lon X Y #> 1 52.1 -130.0 431.5034 5772.632 #> 2 53.4 -131.4 340.4411 5919.452"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":null,"dir":"Reference","previous_headings":"","what":"DHARMa residuals — dharma_residuals","title":"DHARMa residuals — dharma_residuals","text":"Plot (possibly return) DHARMa residuals. wrapper function around DHARMa::createDHARMa() facilitate use sdmTMB() models. Note: recommended set type = \"mle-mvn\" simulate.sdmTMB() resulting residuals expected distribution. default.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"DHARMa residuals — dharma_residuals","text":"","code":"dharma_residuals( simulated_response, object, plot = TRUE, return_DHARMa = FALSE, test_uniformity = TRUE, test_outliers = FALSE, test_dispersion = FALSE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"DHARMa residuals — dharma_residuals","text":"simulated_response Output simulate.sdmTMB(). 
recommended set type = \"mle-mvn\" call simulate.sdmTMB() residuals expected distribution. object Output sdmTMB(). plot Logical. Calls DHARMa::plotQQunif(). return_DHARMa Logical. Return object DHARMa::createDHARMa()? test_uniformity Passed testUniformity DHARMa::plotQQunif(). test_outliers Passed testOutliers DHARMa::plotQQunif(). test_dispersion Passed testDispersion DHARMa::plotQQunif(). ... arguments pass DHARMa::createDHARMa().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"DHARMa residuals — dharma_residuals","text":"data frame observed expected values invisibly returned can assign output object plot residuals . See examples. return_DHARMa = TRUE, object DHARMa::createDHARMa() returned subsequent DHARMa functions can applied.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"DHARMa residuals — dharma_residuals","text":"See residuals vignette. Advantages residuals ones residuals.sdmTMB() method (1) work delta/hurdle models combined predictions, just two parts separately, (2) work families, just families worked analytical quantile function, (3) can used various diagnostic tools plots DHARMa package. Disadvantages (1) slower calculate since one must first simulate model, (2) stability distribution residuals depends sufficient number simulation draws, (3) uniformly distributed residuals put less emphasis tails visually (may desired). Note DHARMa returns residuals uniform(0, 1) data consistent model whereas randomized quantile residuals residuals.sdmTMB() expected normal(0, 1). experimental option expected_distribution included transform distributions normal(0, 1) expectation.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/dharma_residuals.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"DHARMa residuals — dharma_residuals","text":"","code":"# Try Tweedie family: fit <- sdmTMB(density ~ as.factor(year) + s(depth, k = 3), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), spatial = \"on\") # The `simulated_response` argument is first so the output from # simulate() can be piped to `dharma_residuals()`. # We will work with 100 simulations for fast examples, but you'll # likely want to work with more than this (enough that the results # are stable from run to run). 
# not great: set.seed(123) simulate(fit, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit) # \\donttest{ # delta-lognormal looks better: set.seed(123) fit_dl <- update(fit, family = delta_lognormal()) simulate(fit_dl, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit) # or skip the pipe: set.seed(123) s <- simulate(fit_dl, nsim = 100, type = \"mle-mvn\") # and manually plot it: r <- dharma_residuals(s, fit_dl, plot = FALSE) head(r) #> observed expected #> 1 0.0002512885 0.001030928 #> 2 0.0003998549 0.002061856 #> 3 0.0017916820 0.003092784 #> 4 0.0030243763 0.004123711 #> 5 0.0032777465 0.005154639 #> 6 0.0060709249 0.006185567 plot(r$expected, r$observed) abline(0, 1) # return the DHARMa object and work with the DHARMa methods ret <- simulate(fit_dl, nsim = 100, type = \"mle-mvn\") |> dharma_residuals(fit, return_DHARMa = TRUE) plot(ret) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"Methods using emmeans package sdmTMB. emmeans package computes estimated marginal means fixed effects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"https://aosmith.rbind.io/2019/03/25/getting-started--emmeans/","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/emmeans.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Estimated marginal means with the emmeans package with sdmTMB — emmeans.sdmTMB","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 20) fit <- sdmTMB( present ~ as.factor(year), data = pcod_2011, mesh = mesh, family = binomial() ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: present ~ as.factor(year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.61 0.55 #> as.factor(year)2013 0.99 0.22 #> as.factor(year)2015 0.75 0.22 #> as.factor(year)2017 0.01 0.22 #> #> Matérn range: 48.45 #> Spatial SD: 1.84 #> ML criterion at convergence: 564.495 #> #> See ?tidy.sdmTMB to extract these values as a data frame. emmeans::emmeans(fit, ~ year) #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 emmeans::emmeans(fit, pairwise ~ year) #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. 
#> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -4.467 0.0001 #> year2011 - year2015 -0.7496 0.220 963 -3.404 0.0039 #> year2011 - year2017 -0.0127 0.221 963 -0.057 0.9999 #> year2013 - year2015 0.2405 0.217 963 1.110 0.6837 #> year2013 - year2017 0.9774 0.223 963 4.390 0.0001 #> year2015 - year2017 0.7369 0.222 963 3.320 0.0052 #> #> Results are given on the log odds ratio (not the response) scale. #> P value adjustment: tukey method for comparing a family of 4 estimates #> emmeans::emmeans(fit, pairwise ~ year, type = \"response\") #> $emmeans #> year prob SE df lower.CL upper.CL #> 2011 0.353 0.126 963 0.156 0.617 #> 2013 0.595 0.133 963 0.333 0.812 #> 2015 0.536 0.137 963 0.281 0.773 #> 2017 0.356 0.126 963 0.158 0.619 #> #> Confidence level used: 0.95 #> Intervals are back-transformed from the logit scale #> #> $contrasts #> contrast odds.ratio SE df null t.ratio p.value #> year2011 / year2013 0.372 0.0823 963 1 -4.467 0.0001 #> year2011 / year2015 0.473 0.1041 963 1 -3.404 0.0039 #> year2011 / year2017 0.987 0.2182 963 1 -0.057 0.9999 #> year2013 / year2015 1.272 0.2757 963 1 1.110 0.6837 #> year2013 / year2017 2.658 0.5917 963 1 4.390 0.0001 #> year2015 / year2017 2.089 0.4638 963 1 3.320 0.0052 #> #> P value adjustment: tukey method for comparing a family of 4 estimates #> Tests are performed on the log odds ratio scale #> emmeans::emmeans(fit, pairwise ~ year, adjust = \"none\") #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -4.467 <.0001 #> year2011 - year2015 -0.7496 0.220 963 -3.404 0.0007 #> year2011 - year2017 -0.0127 0.221 963 -0.057 0.9543 #> year2013 - year2015 0.2405 0.217 963 1.110 0.2675 #> year2013 - year2017 0.9774 0.223 963 4.390 <.0001 #> year2015 - year2017 0.7369 0.222 963 3.320 0.0009 #> #> Results are given on the log odds ratio (not the response) scale. #> e <- emmeans::emmeans(fit, ~ year) plot(e) e <- emmeans::emmeans(fit, pairwise ~ year) confint(e) #> $emmeans #> year emmean SE df lower.CL upper.CL #> 2011 -0.606 0.551 963 -1.688 0.475 #> 2013 0.384 0.550 963 -0.695 1.463 #> 2015 0.143 0.551 963 -0.937 1.224 #> 2017 -0.594 0.551 963 -1.674 0.487 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df lower.CL upper.CL #> year2011 - year2013 -0.9901 0.222 963 -1.560 -0.420 #> year2011 - year2015 -0.7496 0.220 963 -1.316 -0.183 #> year2011 - year2017 -0.0127 0.221 963 -0.581 0.556 #> year2013 - year2015 0.2405 0.217 963 -0.317 0.798 #> year2013 - year2017 0.9774 0.223 963 0.404 1.550 #> year2015 - year2017 0.7369 0.222 963 0.166 1.308 #> #> Results are given on the log odds ratio (not the response) scale. 
#> Confidence level used: 0.95 #> Conf-level adjustment: tukey method for comparing a family of 4 estimates #> summary(e, infer = TRUE) #> $emmeans #> year emmean SE df lower.CL upper.CL t.ratio p.value #> 2011 -0.606 0.551 963 -1.688 0.475 -1.100 0.2715 #> 2013 0.384 0.550 963 -0.695 1.463 0.698 0.4854 #> 2015 0.143 0.551 963 -0.937 1.224 0.260 0.7948 #> 2017 -0.594 0.551 963 -1.674 0.487 -1.078 0.2812 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df lower.CL upper.CL t.ratio p.value #> year2011 - year2013 -0.9901 0.222 963 -1.560 -0.420 -4.467 0.0001 #> year2011 - year2015 -0.7496 0.220 963 -1.316 -0.183 -3.404 0.0039 #> year2011 - year2017 -0.0127 0.221 963 -0.581 0.556 -0.057 0.9999 #> year2013 - year2015 0.2405 0.217 963 -0.317 0.798 1.110 0.6837 #> year2013 - year2017 0.9774 0.223 963 0.404 1.550 4.390 0.0001 #> year2015 - year2017 0.7369 0.222 963 0.166 1.308 3.320 0.0052 #> #> Results are given on the log odds ratio (not the response) scale. #> Confidence level used: 0.95 #> Conf-level adjustment: tukey method for comparing a family of 4 estimates #> P value adjustment: tukey method for comparing a family of 4 estimates #> as.data.frame(e) #> Warning: Note: 'as.data.frame' has combined your 2 sets of results into one object, #> and this affects things like adjusted P values. Refer to the annotations. #> year contrast emmean SE df lower.CL upper.CL #> 2011 . -0.6063952 0.5511170 963 -2.1569729 0.9441824 #> 2013 . 0.3836816 0.5497815 963 -1.1631386 1.9305018 #> 2015 . 0.1431957 0.5505275 963 -1.4057236 1.6921150 #> 2017 . -0.5937169 0.5506319 963 -2.1429298 0.9554960 #> . year2011 - year2013 -0.9900768 0.2216228 963 -1.6136166 -0.3665369 #> . year2011 - year2015 -0.7495909 0.2201979 963 -1.3691217 -0.1300601 #> . year2011 - year2017 -0.0126783 0.2209364 963 -0.6342868 0.6089302 #> . year2013 - year2015 0.2404859 0.2167459 963 -0.3693327 0.8503044 #> . year2013 - year2017 0.9773985 0.2226493 963 0.3509706 1.6038264 #> . year2015 - year2017 0.7369126 0.2219749 963 0.1123823 1.3614430 #> #> Results are given on the logit (not the response) scale. #> Confidence level used: 0.95 #> Conf-level adjustment: bonferroni method for 10 estimates # interaction of factor with continuous predictor: fit2 <- sdmTMB( present ~ depth_scaled * as.factor(year), data = pcod_2011, mesh = mesh, family = binomial() ) fit2 #> Spatial model fit by ML ['sdmTMB'] #> Formula: present ~ depth_scaled * as.factor(year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.75 0.48 #> depth_scaled -0.98 0.25 #> as.factor(year)2013 1.03 0.23 #> as.factor(year)2015 0.79 0.23 #> as.factor(year)2017 0.01 0.23 #> depth_scaled:as.factor(year)2013 -0.16 0.26 #> depth_scaled:as.factor(year)2015 0.03 0.26 #> depth_scaled:as.factor(year)2017 -0.01 0.26 #> #> Matérn range: 33.38 #> Spatial SD: 2.19 #> ML criterion at convergence: 546.074 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
# slopes for each level: emmeans::emtrends(fit2, ~ year, var = \"depth_scaled\") #> year depth_scaled.trend SE df lower.CL upper.CL #> 2011 -0.980 0.250 959 -1.47 -0.490 #> 2013 -1.140 0.247 959 -1.63 -0.655 #> 2015 -0.950 0.238 959 -1.42 -0.483 #> 2017 -0.987 0.244 959 -1.47 -0.507 #> #> Confidence level used: 0.95 # test difference in slopes: emmeans::emtrends(fit2, pairwise ~ year, var = \"depth_scaled\") #> $emtrends #> year depth_scaled.trend SE df lower.CL upper.CL #> 2011 -0.980 0.250 959 -1.47 -0.490 #> 2013 -1.140 0.247 959 -1.63 -0.655 #> 2015 -0.950 0.238 959 -1.42 -0.483 #> 2017 -0.987 0.244 959 -1.47 -0.507 #> #> Confidence level used: 0.95 #> #> $contrasts #> contrast estimate SE df t.ratio p.value #> year2011 - year2013 0.16009 0.261 959 0.613 0.9281 #> year2011 - year2015 -0.03052 0.265 959 -0.115 0.9995 #> year2011 - year2017 0.00651 0.261 959 0.025 1.0000 #> year2013 - year2015 -0.19061 0.258 959 -0.738 0.8817 #> year2013 - year2017 -0.15358 0.259 959 -0.593 0.9342 #> year2015 - year2017 0.03703 0.263 959 0.141 0.9990 #> #> P value adjustment: tukey method for comparing a family of 4 estimates #> emmeans::emmip(fit2, year ~ depth_scaled, at = list(depth_scaled = seq(-2.5, 2.5, length.out = 50)), CIs = TRUE)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"Moved sdmTMBextra package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"","code":"extract_mcmc(object = deprecated())"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"object Deprecated See sdmTMBextra package. Make sure load sdmTMBextra sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract MCMC samples from a model fit with tmbstan. — extract_mcmc","text":"Deprecated. See sdmTMBextra package.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/extract_mcmc.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract MCMC samples from a model fit with tmbstan. 
— extract_mcmc","text":"","code":"if (FALSE) { extract_mcmc() }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":null,"dir":"Reference","previous_headings":"","what":"Additional families — Families","title":"Additional families — Families","text":"Additional families compatible sdmTMB().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Additional families — Families","text":"","code":"Beta(link = \"logit\") lognormal(link = \"log\") gengamma(link = \"log\") gamma_mix(link = \"log\") lognormal_mix(link = \"log\") nbinom2_mix(link = \"log\") nbinom2(link = \"log\") nbinom1(link = \"log\") truncated_nbinom2(link = \"log\") truncated_nbinom1(link = \"log\") student(link = \"identity\", df = 3) tweedie(link = \"log\") censored_poisson(link = \"log\") delta_gamma(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_gamma_mix(link1 = \"logit\", link2 = \"log\") delta_gengamma(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_lognormal(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_lognormal_mix(link1, link2 = \"log\", type = c(\"standard\", \"poisson-link\")) delta_truncated_nbinom2(link1 = \"logit\", link2 = \"log\") delta_truncated_nbinom1(link1 = \"logit\", link2 = \"log\") delta_poisson_link_gamma(link1 = \"log\", link2 = \"log\") delta_poisson_link_lognormal(link1 = \"log\", link2 = \"log\") delta_beta(link1 = \"logit\", link2 = \"logit\")"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Additional families — Families","text":"link Link. df Student-t degrees freedom fixed value parameter. link1 Link first part delta/hurdle model. Defaults \"logit\" type = \"standard\" \"log\" type = \"poisson-link\". link2 Link second part delta/hurdle model. type Delta/hurdle family type. \"standard\" classic hurdle model. \"poisson-link\" Poisson-link delta model (Thorson 2018).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Additional families — Families","text":"list elements common standard R family objects including family, link, linkfun, linkinv. Delta/hurdle model families also elements delta (logical) type (standard vs. Poisson-link).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Additional families — Families","text":"delta_poisson_link_gamma() delta_poisson_link_lognormal() deprecated favour delta_gamma(type = \"poisson-link\") delta_lognormal(type = \"poisson-link\"). gengamma() family implemented J.T. Thorson uses Prentice (1974) parameterization lognormal occurs internal parameter gengamma_Q (reported print() \"Generalized gamma lambda\") approaches 0. families ending _mix() 2-component mixtures distribution mean shared scale parameter. (Thorson et al. 2011). See model-description vignette details. parameter plogis(log_p_mix) probability extreme (larger) mean exp(log_ratio_mix) + 1 ratio larger extreme mean \"regular\" mean. can see parameters model$sd_report. nbinom2 negative binomial parameterization NB2 variance grows quadratically mean (Hilbe 2011). nbinom1 negative binomial parameterization lets variance grow linearly mean (Hilbe 2011). 
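As a quick numeric illustration of those two variance functions (phi here is an arbitrary dispersion value chosen for the example, not an estimate):

mu <- c(1, 5, 25); phi <- 2
mu + mu^2 / phi  # NB2: variance grows quadratically with the mean
mu * (1 + phi)   # NB1: variance grows linearly with the mean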
student(), degrees freedom parameter currently estimated fixed df.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Additional families — Families","text":"Generalized gamma family: Prentice, R.L. 1974. log gamma model maximum likelihood estimation. Biometrika 61(3): 539–544. doi:10.1093/biomet/61.3.539 Stacy, E.W. 1962. Generalization Gamma Distribution. Annals Mathematical Statistics 33(3): 1187–1192. Institute Mathematical Statistics. Families ending _mix(): Thorson, J.T., Stewart, .J., Punt, .E. 2011. Accounting fish shoals single- multi-species survey data using mixture distribution models. Can. J. Fish. Aquat. Sci. 68(9): 1681–1693. doi:10.1139/f2011-086 . Negative binomial families: Hilbe, J. M. 2011. Negative binomial regression. Cambridge University Press. Poisson-link delta families: Thorson, J.T. 2018. Three problems conventional delta-model biomass sampling data, computationally efficient alternative. Canadian Journal Fisheries Aquatic Sciences, 75(9), 1369-1382. doi:10.1139/cjfas-2017-0266","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/families.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Additional families — Families","text":"","code":"Beta(link = \"logit\") #> #> Family: Beta #> Link function: logit #> lognormal(link = \"log\") #> #> Family: lognormal #> Link function: log #> gengamma(link = \"log\") #> #> Family: gengamma #> Link function: log #> gamma_mix(link = \"log\") #> #> Family: gamma_mix #> Link function: log #> lognormal_mix(link = \"log\") #> #> Family: lognormal_mix #> Link function: log #> nbinom2_mix(link = \"log\") #> #> Family: nbinom2_mix #> Link function: log #> nbinom2(link = \"log\") #> #> Family: nbinom2 #> Link function: log #> nbinom1(link = \"log\") #> #> Family: nbinom1 #> Link function: log #> truncated_nbinom2(link = \"log\") #> #> Family: truncated_nbinom2 #> Link function: log #> truncated_nbinom1(link = \"log\") #> #> Family: truncated_nbinom1 #> Link function: log #> student(link = \"identity\") #> #> Family: student #> Link function: identity #> tweedie(link = \"log\") #> #> Family: tweedie #> Link function: log #> censored_poisson(link = \"log\") #> #> Family: censored_poisson #> Link function: log #> delta_gamma() #> #> Family: binomial Gamma #> Link function: logit log #> delta_gamma_mix() #> #> Family: binomial gamma_mix #> Link function: logit log #> delta_gengamma() #> #> Family: binomial gengamma #> Link function: logit log #> delta_lognormal() #> #> Family: binomial lognormal #> Link function: logit log #> delta_lognormal_mix() #> #> Family: binomial lognormal_mix #> Link function: logit log #> delta_truncated_nbinom2() #> #> Family: binomial truncated_nbinom2 #> Link function: logit log #> delta_truncated_nbinom1() #> #> Family: binomial truncated_nbinom1 #> Link function: logit log #> delta_beta() #> #> Family: binomial Beta #> Link function: logit logit #>"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract parameter simulations from the joint precision matrix — spread_sims","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"spread_sims() returns wide-format data frame. gather_sims() returns long-format data frame. 
format matches format tidybayes spread_draws() gather_draws() functions.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"","code":"spread_sims(object, nsim = 200, n_sims = deprecated()) gather_sims(object, nsim = 200, n_sims = deprecated())"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"object Output sdmTMB(). nsim number simulation draws. n_sims Deprecated: please use nsim.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"data frame. gather_sims() returns long-format data frame: .iteration: sample ID .variable: parameter name .value: parameter sample value spread_sims() returns wide-format data frame: .iteration: sample ID columns parameter sample per row","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/gather_sims.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract parameter simulations from the joint precision matrix — spread_sims","text":"","code":"m <- sdmTMB(density ~ depth_scaled, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie()) head(spread_sims(m, nsim = 10)) #> .iteration X.Intercept. depth_scaled range phi tweedie_p sigma_O #> 1 1 2.947281 -0.6113464 21.28292 14.96395 1.579234 2.649385 #> 2 2 2.867277 -0.7282022 50.56038 15.63864 1.583225 1.871312 #> 3 3 2.570547 -0.5759993 81.02526 13.86308 1.582175 1.204293 #> 4 4 3.305925 -0.5416168 26.01008 16.21788 1.588699 1.674452 #> 5 5 2.870912 -0.7513359 35.35486 15.06749 1.595577 2.000633 #> 6 6 2.901625 -0.7516155 25.05987 14.38858 1.614982 2.741510 head(gather_sims(m, nsim = 10)) #> .iteration .variable .value #> 1 1 X.Intercept. 3.120602 #> 2 2 X.Intercept. 3.072890 #> 3 3 X.Intercept. 2.688049 #> 4 4 X.Intercept. 3.006871 #> 5 5 X.Intercept. 3.120740 #> 6 6 X.Intercept. 2.509646 samps <- gather_sims(m, nsim = 1000) if (require(\"ggplot2\", quietly = TRUE)) { ggplot(samps, aes(.value)) + geom_histogram() + facet_wrap(~.variable, scales = \"free_x\") } #> `stat_bin()` using `bins = 30`. Pick better value with `binwidth`."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":null,"dir":"Reference","previous_headings":"","what":"Extract a relative biomass/abundance index or a center of gravity — get_index","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"Extract relative biomass/abundance index center gravity","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"","code":"get_index( obj, bias_correct = FALSE, level = 0.95, area = 1, silent = TRUE, ... ) get_cog( obj, bias_correct = FALSE, level = 0.95, format = c(\"long\", \"wide\"), area = 1, silent = TRUE, ... 
)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"obj Output predict.sdmTMB() return_tmb_object = TRUE. bias_correct bias correction implemented TMB::sdreport()? level confidence level. area Grid cell area. vector length newdata predict.sdmTMB() value length 1, repeated internally match. silent Silent? ... Passed TMB::sdreport(). format Long wide.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"get_index(): data frame columns time, estimate, lower upper confidence intervals, log estimate, standard error log estimate. get_cog(): data frame columns time, estimate (center gravity x y coordinates), lower upper confidence intervals, standard error center gravity coordinates.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"Geostatistical random-field model-based indices abundance (along many newer papers): Shelton, .O., Thorson, J.T., Ward, E.J., Feist, B.E. 2014. Spatial semiparametric models improve estimates species abundance distribution. Canadian Journal Fisheries Aquatic Sciences 71(11): 1655--1666. doi:10.1139/cjfas-2013-0508 Thorson, J.T., Shelton, .O., Ward, E.J., Skaug, H.J. 2015. Geostatistical delta-generalized linear mixed models improve precision estimated abundance indices West Coast groundfishes. ICES J. Mar. Sci. 72(5): 1297–1310. doi:10.1093/icesjms/fsu243 Geostatistical model-based centre gravity: Thorson, J.T., Pinsky, M.L., Ward, E.J. 2016. Model-based inference estimating shifts species distribution, area occupied centre gravity. Methods Ecol Evol 7(8): 990–1002. doi:10.1111/2041-210X.12567 Bias correction: Thorson, J.T., Kristensen, K. 2016. Implementing generic method bias correction statistical models using random effects, spatial population dynamics examples. Fisheries Research 175: 66–74. doi:10.1016/j.fishres.2015.11.016","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Extract a relative biomass/abundance index or a center of gravity — get_index","text":"","code":"# \\donttest{ # Use a small number of knots for this example to make it fast: pcod_spde <- make_mesh(pcod, c(\"X\", \"Y\"), n_knots = 60, type = \"kmeans\") m <- sdmTMB( data = pcod, formula = density ~ 0 + as.factor(year), time = \"year\", mesh = pcod_spde, family = tweedie(link = \"log\") ) # make prediction grid: nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) # Note `return_tmb_object = TRUE` and the prediction grid: predictions <- predict(m, newdata = nd, return_tmb_object = TRUE) ind <- get_index(predictions) #> Bias correction is turned off. #> It is recommended to turn this on for final inference. 
if (require(\"ggplot2\", quietly = TRUE)) { ggplot(ind, aes(year, est)) + geom_line() + geom_ribbon(aes(ymin = lwr, ymax = upr), alpha = 0.4) } cog <- get_cog(predictions) cog #> year est lwr upr se coord #> 1 2003 463.5260 446.4142 480.6378 8.730670 X #> 2 2004 476.7402 466.4506 487.0297 5.249871 X #> 3 2005 470.6887 457.7494 483.6280 6.601796 X #> 4 2007 480.8948 464.5560 497.2336 8.336280 X #> 5 2009 477.2029 457.9185 496.4872 9.839144 X #> 6 2011 470.5112 457.6004 483.4221 6.587284 X #> 7 2013 471.9877 455.6078 488.3676 8.357252 X #> 8 2015 463.0289 449.6443 476.4136 6.829028 X #> 9 2017 470.5219 455.4189 485.6249 7.705734 X #> 10 2003 5757.8611 5739.8547 5775.8675 9.187109 Y #> 11 2004 5732.5037 5720.8788 5744.1285 5.931164 Y #> 12 2005 5763.0315 5750.1526 5775.9104 6.570990 Y #> 13 2007 5738.2313 5716.8428 5759.6199 10.912733 Y #> 14 2009 5734.0288 5713.3608 5754.6968 10.545101 Y #> 15 2011 5747.1037 5733.6282 5760.5792 6.875389 Y #> 16 2013 5747.6446 5728.9691 5766.3201 9.528488 Y #> 17 2015 5753.9697 5736.8439 5771.0956 8.737855 Y #> 18 2017 5755.9728 5739.6443 5772.3014 8.331045 Y # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"Calculate population index via simulation joint precision matrix. Compared get_index(), version can faster bias correction turned get_index() approximately equivalent. experimental function. function usually works reasonably well, make guarantees. recommended use get_index() bias_correct = TRUE final inference.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"","code":"get_index_sims( obj, level = 0.95, return_sims = FALSE, area = rep(1, nrow(obj)), est_function = stats::median, area_function = function(x, area) x + log(area), agg_function = function(x) sum(exp(x)) )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"obj predict.sdmTMB() output nsim > 0. level confidence level. return_sims Logical. Return simulation draws? default (FALSE) quantile summary simulation draws. area vector grid cell/polyon areas year-grid cell (row data) obj. Adjust cells unit area area (e.g., cells partially land/water). Note area vector added log(area) raw values obj. words, function assumes log link, typically makes sense. est_function Function summarize estimate (expected value). mean() alternative median(). area_function Function apply area weighting. Assuming log link, function(x, area) x + log(area) default makes sense. natural space, function(x, area) x * area makes sense. agg_function Function aggregate samples within time slice. Assuming log link, function(x) sum(exp(x)) default makes sense. 
natural space, function(x) sum(x) makes sense.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"data frame. return_sims = FALSE: name column (e.g. year) supplied sdmTMB() time argument est: estimate lwr: lower confidence interval value upr: upper confidence interval value log_est: log estimate se: standard error log estimate return_sims = TRUE, samples index values long-format data frame: name column (e.g. year) supplied sdmTMB() time argument .value: sample value .iteration: sample number","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"Can also used produce index model fit tmbstan. function nothing summarize reshape matrix simulation draws data frame.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_index_sims.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate a population index via simulation from the joint precision matrix — get_index_sims","text":"","code":"# \\donttest{ m <- sdmTMB(density ~ 0 + as.factor(year), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), time = \"year\" ) qcs_grid_2011 <- replicate_df(qcs_grid, \"year\", unique(pcod_2011$year)) p <- predict(m, newdata = qcs_grid_2011, nsim = 100) x <- get_index_sims(p) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. x_sims <- get_index_sims(p, return_sims = TRUE) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. if (require(\"ggplot2\", quietly = TRUE)) { ggplot(x, aes(year, est, ymin = lwr, ymax = upr)) + geom_line() + geom_ribbon(alpha = 0.4) ggplot(x_sims, aes(as.factor(year), .value)) + geom_violin() } # Demo custom functions if working in natural space: ind <- get_index_sims( exp(p), agg_function = function(x) sum(x), area_function = function(x, area) x * area ) #> We generally recommend using `get_index(..., bias_correct = TRUE)` #> rather than `get_index_sims()`. 
# }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":null,"dir":"Reference","previous_headings":"","what":"Get TMB parameter list — get_pars","title":"Get TMB parameter list — get_pars","text":"Get TMB parameter list","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get TMB parameter list — get_pars","text":"","code":"get_pars(object)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get TMB parameter list — get_pars","text":"object Fit sdmTMB()","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get TMB parameter list — get_pars","text":"named list parameter values","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/get_pars.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get TMB parameter list — get_pars","text":"","code":"fit <- sdmTMB(present ~ 1, data = pcod_2011, family = binomial(), spatial = \"off\") pars <- get_pars(fit) names(pars) #> [1] \"ln_H_input\" \"b_j\" \"b_j2\" #> [4] \"bs\" \"ln_tau_O\" \"ln_tau_Z\" #> [7] \"ln_tau_E\" \"ln_kappa\" \"thetaf\" #> [10] \"gengamma_Q\" \"logit_p_mix\" \"log_ratio_mix\" #> [13] \"ln_phi\" \"ln_tau_V\" \"rho_time_unscaled\" #> [16] \"ar1_phi\" \"ln_tau_G\" \"RE\" #> [19] \"b_rw_t\" \"omega_s\" \"zeta_s\" #> [22] \"epsilon_st\" \"b_threshold\" \"b_epsilon\" #> [25] \"ln_epsilon_re_sigma\" \"epsilon_re\" \"b_smooth\" #> [28] \"ln_smooth_sigma\""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":null,"dir":"Reference","previous_headings":"","what":"Check if ggplot2 installed — ggplot2_installed","title":"Check if ggplot2 installed — ggplot2_installed","text":"Check ggplot2 installed","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Check if ggplot2 installed — ggplot2_installed","text":"","code":"ggplot2_installed()"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/ggplot2_installed.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Check if ggplot2 installed — ggplot2_installed","text":"Returns TRUE FALSE.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":null,"dir":"Reference","previous_headings":"","what":"Construct an SPDE mesh for sdmTMB — make_mesh","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"Construct SPDE mesh use sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"","code":"make_mesh( data, xy_cols, type = c(\"kmeans\", \"cutoff\", \"cutoff_search\"), cutoff, n_knots, seed = 42, mesh = NULL, fmesher_func = fmesher::fm_rcdt_2d_inla, convex = NULL, concave = convex, ... ) # S3 method for sdmTMBmesh plot(x, ...)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"data data frame. xy_cols character vector x y column names contained data. 
likely equal distance projection. helper function convert UTMs, see add_utm_columns(). type Method create mesh. Also see mesh argument supply mesh. cutoff optional cutoff type \"cutoff\". minimum allowed triangle edge length. n_knots number desired knots type \"cutoff\". seed Random seed. Affects stats::kmeans() determination knot locations type = \"kmeans\". mesh optional mesh created via fmesher instead using convenience options. fmesher_func fmesher function use. Options include fmesher::fm_rcdt_2d_inla() fmesher::fm_mesh_2d_inla() along version without _inla end. convex specified, passed fmesher::fm_nonconvex_hull(). Distance extend non-convex hull data. concave specified, passed fmesher::fm_nonconvex_hull(). \"Minimum allowed reentrant curvature\". Defaults convex. ... Passed graphics::plot(). x Output make_mesh().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"make_mesh(): list class sdmTMBmesh. element mesh output fmesher_func (default fmesher::fm_mesh_2d_inla()). See mesh$mesh$n number vertices. plot.sdmTMBmesh(): plot mesh data points. ggplot2 installed, ggplot2 object returned, otherwise base graphics R plot returned. make , pass your_mesh$mesh inlabru::gg().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/make_mesh.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Construct an SPDE mesh for sdmTMB — make_mesh","text":"","code":"# Extremely simple cutoff: mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 5, type = \"cutoff\") plot(mesh) # Using a k-means algorithm to assign vertices: mesh <- make_mesh(pcod, c(\"X\", \"Y\"), n_knots = 50, type = \"kmeans\") plot(mesh) # \\donttest{ # But, it's better to develop more tailored meshes: # Pass arguments via '...' to fmesher::fm_mesh_2d_inla(): mesh <- make_mesh( pcod, c(\"X\", \"Y\"), fmesher_func = fmesher::fm_mesh_2d_inla, cutoff = 8, # minimum triangle edge length max.edge = c(20, 40), # inner and outer max triangle lengths offset = c(5, 40) # inner and outer border widths ) plot(mesh) # Or define a mesh directly with fmesher (formerly in INLA): inla_mesh <- fmesher::fm_mesh_2d_inla( loc = cbind(pcod$X, pcod$Y), # coordinates max.edge = c(25, 50), # max triangle edge length; inner and outer meshes offset = c(5, 25), # inner and outer border widths cutoff = 5 # minimum triangle edge length ) mesh <- make_mesh(pcod, c(\"X\", \"Y\"), mesh = inla_mesh) plot(mesh) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot anisotropy from an sdmTMB model — plot_anisotropy","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"Anisotropy spatial correlation directionally dependent. sdmTMB(), default spatial correlation isotropic, anisotropy can enabled anisotropy = TRUE. 
plotting functions help visualize estimated anisotropy.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"","code":"plot_anisotropy(object, return_data = FALSE) plot_anisotropy2(object, model = 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"object object sdmTMB(). return_data Logical. Return data frame? plot_anisotropy() . model model delta model (plot_anisotropy2(); plot_anisotropy() always plots ).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"plot_anisotropy(): One ellipses illustrating estimated anisotropy. ellipses centered coordinates zero space X-Y coordinates modeled. ellipses show spatial /spatiotemporal range (distance correlation effectively independent) direction zero. Uses ggplot2. anisotropy turned fitting model, NULL returned instead ggplot2 object. plot_anisotropy2(): plot eigenvectors illustrating estimated anisotropy. list plotted data invisibly returned. Uses base graphics. anisotropy turned fitting model, NULL returned instead plot object.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"Code adapted VAST R package","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_anisotropy.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot anisotropy from an sdmTMB model — plot_anisotropy","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), n_knots = 80, type = \"kmeans\") fit <- sdmTMB( data = pcod_2011, formula = density ~ 1, mesh = mesh, family = tweedie(), share_range = FALSE, time = \"year\", anisotropy = TRUE #< ) plot_anisotropy(fit) plot_anisotropy2(fit)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot PC Matérn priors — plot_pc_matern","title":"Plot PC Matérn priors — plot_pc_matern","text":"Plot PC Matérn priors","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot PC Matérn priors — plot_pc_matern","text":"","code":"plot_pc_matern( range_gt, sigma_lt, range_prob = 0.05, sigma_prob = 0.05, range_lims = c(range_gt * 0.1, range_gt * 10), sigma_lims = c(0, sigma_lt * 2), plot = TRUE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot PC Matérn priors — plot_pc_matern","text":"range_gt value one expects spatial spatiotemporal range greater 1 - range_prob probability. sigma_lt value one expects spatial spatiotemporal marginal standard deviation (sigma_O sigma_E internally) less 1 - sigma_prob probability. range_prob Probability. See description range_gt. sigma_prob Probability. See description sigma_lt. 
range_lims Plot range variable limits. sigma_lims Plot sigma variable limits. plot Logical controlling whether plot drawn (defaults TRUE).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot PC Matérn priors — plot_pc_matern","text":"plot image(). Invisibly returns underlying matrix data. rows sigmas. columns ranges. Column row names provided.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_pc_matern.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot PC Matérn priors — plot_pc_matern","text":"","code":"plot_pc_matern(range_gt = 5, sigma_lt = 1) plot_pc_matern(range_gt = 5, sigma_lt = 10) plot_pc_matern(range_gt = 5, sigma_lt = 1, sigma_prob = 0.2) plot_pc_matern(range_gt = 5, sigma_lt = 1, range_prob = 0.2)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot a smooth term from an sdmTMB model — plot_smooth","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"Deprecated: use visreg::visreg(). See visreg_delta() examples.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"","code":"plot_smooth( object, select = 1, n = 100, level = 0.95, ggplot = FALSE, rug = TRUE, return_data = FALSE )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"object sdmTMB() model. select smoother term plot. n number equally spaced points evaluate smoother along. level confidence level. ggplot Logical: use ggplot2 package? rug Logical: add rug lines along lower axis? return_data Logical: return predicted data instead making plot?","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"plot smoother term.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"Note: numeric predictor set mean factor predictor set first-level value time element (present) set minimum value x y coordinates set mean values","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/plot_smooth.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot a smooth term from an sdmTMB model — plot_smooth","text":"","code":"d <- subset(pcod, year >= 2000 & density > 0) pcod_spde <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) m <- sdmTMB( data = d, formula = log(density) ~ s(depth_scaled) + s(year, k = 5), mesh = pcod_spde ) plot_smooth(m) #> This function may be deprecated. #> Consider using `visreg::visreg()` or `visreg_delta()`. 
#> See ?visreg_delta() for examples."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Predict from an sdmTMB model — predict.sdmTMB","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"Make predictions sdmTMB model; can predict original new data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"","code":"# S3 method for sdmTMB predict( object, newdata = NULL, type = c(\"link\", \"response\"), se_fit = FALSE, re_form = NULL, re_form_iid = NULL, nsim = 0, sims_var = \"est\", model = c(NA, 1, 2), offset = NULL, mcmc_samples = NULL, return_tmb_object = FALSE, return_tmb_report = FALSE, return_tmb_data = FALSE, tmbstan_model = deprecated(), sims = deprecated(), area = deprecated(), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"object model fitted sdmTMB(). newdata data frame make predictions . data frame predictor columns fitted data time column (spatiotemporal model) name fitted data. type est column link (default) response space? se_fit standard errors predictions new locations given newdata calculated? Warning: current implementation can slow large data sets high-resolution projections unless re_form = NA (omitting random fields). faster option approximate point-wise uncertainty use nsim argument. re_form NULL specify including spatial/spatiotemporal random effects predictions. ~0 NA population-level predictions. Likely used conjunction se_fit = TRUE. affect get_index() calculations. re_form_iid NULL specify including random intercepts predictions. ~0 NA population-level predictions. options (e.g., random intercepts) implemented yet. affects predictions newdata. affects get_index(). nsim > 0, simulate joint precision matrix nsim draws. Returns matrix nrow(data) nsim representing estimates linear predictor (.e., link space). Can useful deriving uncertainty predictions (e.g., apply(x, 1, sd)) propagating uncertainty. currently fastest way characterize uncertainty predictions space sdmTMB. sims_var Experimental: TMB reported variable model extracted joint precision matrix simulation draws? Defaults link-space predictions. Options include: \"omega_s\", \"zeta_s\", \"epsilon_st\", \"est_rf\" (described ). options passed verbatim. model Type prediction delta/hurdle model nsim > 0 mcmc_samples supplied: NA returns combined prediction components link scale positive component; 1 2 return first second model component link response scale depending argument type. regular prediction delta models, sets predictions returned. offset numeric vector optional offset values. left default NULL, offset implicitly left 0. mcmc_samples See extract_mcmc() sdmTMBextra package details Bayesian vignette. specified, predict function return matrix similar form nsim > 0 representing Bayesian posterior samples Stan model. return_tmb_object Logical. TRUE, include TMB object list format output. Necessary get_index() get_cog() functions. return_tmb_report Logical: return output TMB report? regular prediction, reported variables MLE parameter values. nsim > 0 mcmc_samples supplied, list element sample contents element output report sample. 
return_tmb_data Logical: return formatted data TMB? Used internally. tmbstan_model Deprecated. See mcmc_samples. sims Deprecated. Please use nsim instead. area Deprecated. Please use area get_index(). ... implemented.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"return_tmb_object = FALSE (nsim = 0 mcmc_samples = NULL): data frame: est: Estimate link space (everything link space) est_non_rf: Estimate everything random field est_rf: Estimate random fields combined omega_s: Spatial (intercept) random field constant time zeta_s: Spatial slope random field epsilon_st: Spatiotemporal (intercept) random fields, (zero), IID, AR1, random walk return_tmb_object = TRUE (nsim = 0 mcmc_samples = NULL): list: data: data frame described report: TMB report parameter values obj: TMB object returned prediction run fit_obj: original TMB model object case, likely need data element end user. elements included functions. nsim > 0 mcmc_samples NULL: matrix: Columns represent samples Rows represent predictions one row per row newdata","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/predict.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Predict from an sdmTMB model — predict.sdmTMB","text":"","code":"d <- pcod_2011 mesh <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) # a coarse mesh for example speed m <- sdmTMB( data = d, formula = density ~ 0 + as.factor(year) + depth_scaled + depth_scaled2, time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) # Predictions at original data locations ------------------------------- predictions <- predict(m) head(predictions) #> # A tibble: 6 × 17 #> year X Y depth density present lat lon depth_mean depth_sd #> #> 1 2011 435. 5718. 241 245. 1 51.6 -130. 5.16 0.445 #> 2 2011 487. 5719. 52 0 0 51.6 -129. 5.16 0.445 #> 3 2011 490. 5717. 47 0 0 51.6 -129. 5.16 0.445 #> 4 2011 545. 5717. 157 0 0 51.6 -128. 5.16 0.445 #> 5 2011 404. 5720. 398 0 0 51.6 -130. 5.16 0.445 #> 6 2011 420. 5721. 486 0 0 51.6 -130. 5.16 0.445 #> # ℹ 7 more variables: depth_scaled , depth_scaled2 , est , #> # est_non_rf , est_rf , omega_s , epsilon_st predictions$resids <- residuals(m) # randomized quantile residuals #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. 
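# (Added sketch, not in the original example.) As noted for the `nsim`
# argument, draws from the joint precision matrix are a fast way to
# approximate point-wise uncertainty in link space:
p_sims <- predict(m, nsim = 200)      # matrix: one row per data row, 200 draws
pointwise_se <- apply(p_sims, 1, sd)  # per-row standard deviation across draws
summary(pointwise_se)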
library(ggplot2) ggplot(predictions, aes(X, Y, col = resids)) + scale_colour_gradient2() + geom_point() + facet_wrap(~year) hist(predictions$resids) qqnorm(predictions$resids);abline(a = 0, b = 1) # Predictions onto new data -------------------------------------------- qcs_grid_2011 <- replicate_df(qcs_grid, \"year\", unique(pcod_2011$year)) predictions <- predict(m, newdata = qcs_grid_2011) # \\donttest{ # A short function for plotting our predictions: plot_map <- function(dat, column = est) { ggplot(dat, aes(X, Y, fill = {{ column }})) + geom_raster() + facet_wrap(~year) + coord_fixed() } plot_map(predictions, exp(est)) + scale_fill_viridis_c(trans = \"sqrt\") + ggtitle(\"Prediction (fixed effects + all random effects)\") plot_map(predictions, exp(est_non_rf)) + ggtitle(\"Prediction (fixed effects and any time-varying effects)\") + scale_fill_viridis_c(trans = \"sqrt\") plot_map(predictions, est_rf) + ggtitle(\"All random field estimates\") + scale_fill_gradient2() plot_map(predictions, omega_s) + ggtitle(\"Spatial random effects only\") + scale_fill_gradient2() plot_map(predictions, epsilon_st) + ggtitle(\"Spatiotemporal random effects only\") + scale_fill_gradient2() # Visualizing a marginal effect ---------------------------------------- # See the visreg package or the ggeffects::ggeffect() or # ggeffects::ggpredict() functions # To do this manually: nd <- data.frame(depth_scaled = seq(min(d$depth_scaled), max(d$depth_scaled), length.out = 100)) nd$depth_scaled2 <- nd$depth_scaled^2 # Because this is a spatiotemporal model, you'll need at least one time # element. If time isn't also a fixed effect then it doesn't matter what you pick: nd$year <- 2011L # L: integer to match original data p <- predict(m, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se))) + geom_line() + geom_ribbon(alpha = 0.4) # Plotting marginal effect of a spline --------------------------------- m_gam <- sdmTMB( data = d, formula = density ~ 0 + as.factor(year) + s(depth_scaled, k = 5), time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) if (require(\"visreg\", quietly = TRUE)) { visreg::visreg(m_gam, \"depth_scaled\") } # or manually: nd <- data.frame(depth_scaled = seq(min(d$depth_scaled), max(d$depth_scaled), length.out = 100)) nd$year <- 2011L p <- predict(m_gam, newdata = nd, se_fit = TRUE, re_form = NA) ggplot(p, aes(depth_scaled, exp(est), ymin = exp(est - 1.96 * est_se), ymax = exp(est + 1.96 * est_se))) + geom_line() + geom_ribbon(alpha = 0.4) # Forecasting ---------------------------------------------------------- mesh <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 15) unique(d$year) #> [1] 2011 2013 2015 2017 m <- sdmTMB( data = d, formula = density ~ 1, spatiotemporal = \"AR1\", # using an AR1 to have something to forecast with extra_time = 2019L, # `L` for integer to match our data spatial = \"off\", time = \"year\", mesh = mesh, family = tweedie(link = \"log\") ) # Add a year to our grid: grid2019 <- qcs_grid_2011[qcs_grid_2011$year == max(qcs_grid_2011$year), ] grid2019$year <- 2019L # `L` because `year` is an integer in the data qcsgrid_forecast <- rbind(qcs_grid_2011, grid2019) predictions <- predict(m, newdata = qcsgrid_forecast) plot_map(predictions, exp(est)) + scale_fill_viridis_c(trans = \"log10\") plot_map(predictions, epsilon_st) + scale_fill_gradient2() # Estimating local trends ---------------------------------------------- d <- pcod d$year_scaled <- as.numeric(scale(d$year)) mesh <- 
make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) m <- sdmTMB(data = d, formula = density ~ depth_scaled + depth_scaled2, mesh = mesh, family = tweedie(link = \"log\"), spatial_varying = ~ 0 + year_scaled, time = \"year\", spatiotemporal = \"off\") nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) nd$year_scaled <- (nd$year - mean(d$year)) / sd(d$year) p <- predict(m, newdata = nd) plot_map(subset(p, year == 2003), zeta_s_year_scaled) + # pick any year ggtitle(\"Spatial slopes\") + scale_fill_gradient2() plot_map(p, est_rf) + ggtitle(\"Random field estimates\") + scale_fill_gradient2() plot_map(p, exp(est_non_rf)) + ggtitle(\"Prediction (fixed effects only)\") + scale_fill_viridis_c(trans = \"sqrt\") plot_map(p, exp(est)) + ggtitle(\"Prediction (fixed effects + all random effects)\") + scale_fill_viridis_c(trans = \"sqrt\") # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":null,"dir":"Reference","previous_headings":"","what":"Prior distributions — sdmTMBpriors","title":"Prior distributions — sdmTMBpriors","text":"Optional priors/penalties model parameters. results penalized likelihood within TMB can used priors model passed tmbstan (see Bayesian vignette). Note Jacobian adjustments made bayesian = TRUE sdmTMB() model fit. .e., final model fit tmbstan priors specified bayesian set TRUE. Otherwise, leave bayesian = FALSE. pc_matern() Penalized Complexity prior Matern covariance function.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Prior distributions — sdmTMBpriors","text":"","code":"sdmTMBpriors( matern_s = pc_matern(range_gt = NA, sigma_lt = NA), matern_st = pc_matern(range_gt = NA, sigma_lt = NA), phi = halfnormal(NA, NA), ar1_rho = normal(NA, NA), tweedie_p = normal(NA, NA), b = normal(NA, NA), sigma_G = halfnormal(NA, NA) ) normal(location = 0, scale = 1) halfnormal(location = 0, scale = 1) mvnormal(location = 0, scale = diag(length(location))) pc_matern(range_gt, sigma_lt, range_prob = 0.05, sigma_prob = 0.05)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Prior distributions — sdmTMBpriors","text":"matern_s PC (Penalized Complexity) prior (pc_matern()) spatial random field Matérn parameters. matern_st matern_s spatiotemporal random field. Note likely want set share_fields = FALSE choose set spatial spatiotemporal Matérn PC prior since include prior spatial range parameter. phi halfnormal() prior dispersion parameter observation distribution. ar1_rho normal() prior AR1 random field parameter. Note parameter support -1 < ar1_rho < 1. tweedie_p normal() prior Tweedie power parameter. Note parameter support 1 < tweedie_p < 2 choose mean appropriately. b normal() priors main population-level 'beta' effects. sigma_G halfnormal() priors random intercept SDs. location Location parameter(s). scale Scale parameter. normal()/halfnormal(): standard deviation(s). mvnormal(): variance-covariance matrix. range_gt value one expects spatial spatiotemporal range greater 1 - range_prob probability. sigma_lt value one expects spatial spatiotemporal marginal standard deviation (sigma_O sigma_E internally) less 1 - sigma_prob probability. range_prob Probability. See description range_gt. sigma_prob Probability. 
See description sigma_lt.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Prior distributions — sdmTMBpriors","text":"named list values specified priors.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Prior distributions — sdmTMBpriors","text":"Meant passed priors argument sdmTMB(). normal() halfnormal() define normal half-normal priors , point, must location (mean) parameter 0. halfnormal() normal() can used make syntax clearer. intended used parameters support > 0. See https://arxiv.org/abs/1503.00256 description PC prior Gaussian random fields. Quoting discussion (substituting argument names pc_matern()): \"simulation study observe good coverage equal-tailed 95% credible intervals prior satisfies P(sigma > sigma_lt) = 0.05 P(range < range_gt) = 0.05, sigma_lt 2.5 40 times true marginal standard deviation range_gt 1/10 1/2.5 true range.\" Keep mind range dependent units scale coordinate system. practice, may choose try fitting model without PC prior constraining model . better option simulate model given range sigma choose reasonable values system base prior knowledge model fit similar system spatial information data.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Prior distributions — sdmTMBpriors","text":"Fuglstad, G.-., Simpson, D., Lindgren, F., Rue, H. (2016) Constructing Priors Penalize Complexity Gaussian Random Fields. arXiv:1503.00256 Simpson, D., Rue, H., Martins, T., Riebler, ., Sørbye, S. (2015) Penalising model component complexity: principled, practical approach constructing priors. arXiv:1403.4630","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/priors.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Prior distributions — sdmTMBpriors","text":"","code":"normal(0, 1) #> [,1] [,2] #> [1,] 0 1 #> attr(,\"dist\") #> [1] \"normal\" halfnormal(0, 1) #> [,1] [,2] #> [1,] 0 1 #> attr(,\"dist\") #> [1] \"normal\" mvnormal(c(0, 0)) #> [,1] [,2] [,3] #> [1,] 0 1 0 #> [2,] 0 0 1 #> attr(,\"dist\") #> [1] \"mvnormal\" pc_matern(range_gt = 5, sigma_lt = 1) #> [1] 5.00 1.00 0.05 0.05 #> attr(,\"dist\") #> [1] \"pc_matern\" plot_pc_matern(range_gt = 5, sigma_lt = 1) # \\donttest{ d <- subset(pcod, year > 2011) pcod_spde <- make_mesh(d, c(\"X\", \"Y\"), cutoff = 30) # - no priors on population-level effects (`b`) # - halfnormal(0, 10) prior on dispersion parameter `phi` # - Matern PC priors on spatial `matern_s` and spatiotemporal # `matern_st` random field parameters m <- sdmTMB(density ~ s(depth, k = 3), data = d, mesh = pcod_spde, family = tweedie(), share_range = FALSE, time = \"year\", priors = sdmTMBpriors( phi = halfnormal(0, 10), matern_s = pc_matern(range_gt = 5, sigma_lt = 1), matern_st = pc_matern(range_gt = 5, sigma_lt = 1) ) ) # - no prior on intercept # - normal(0, 1) prior on depth coefficient # - no prior on the dispersion parameter `phi` # - Matern PC prior m <- sdmTMB(density ~ depth_scaled, data = d, mesh = pcod_spde, family = tweedie(), spatiotemporal = \"off\", priors = sdmTMBpriors( b = normal(c(NA, 0), c(NA, 1)), matern_s = pc_matern(range_gt = 5, sigma_lt = 1) ) ) # You get a prior, you get a prior, you get a prior! 
# (except on the annual means; see the `NA`s) m <- sdmTMB(density ~ 0 + depth_scaled + depth_scaled2 + as.factor(year), data = d, time = \"year\", mesh = pcod_spde, family = tweedie(link = \"log\"), share_range = FALSE, spatiotemporal = \"AR1\", priors = sdmTMBpriors( b = normal(c(0, 0, NA, NA, NA), c(2, 2, NA, NA, NA)), phi = halfnormal(0, 10), # tweedie_p = normal(1.5, 2), ar1_rho = normal(0, 1), matern_s = pc_matern(range_gt = 5, sigma_lt = 1), matern_st = pc_matern(range_gt = 5, sigma_lt = 1)) ) # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"Objects exported from other packages — reexports","title":"Objects exported from other packages — reexports","text":"objects imported packages. Follow links see documentation. generics tidy","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Replicate a prediction data frame over time — replicate_df","title":"Replicate a prediction data frame over time — replicate_df","text":"Useful replicating prediction grids across time slices used model fitting.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replicate a prediction data frame over time — replicate_df","text":"","code":"replicate_df(dat, time_name, time_values)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replicate a prediction data frame over time — replicate_df","text":"dat Data frame. time_name Name time column output. time_values Time values replicate dat .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replicate a prediction data frame over time — replicate_df","text":"data frame replicated time_values new column based time_name.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/replicate_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replicate a prediction data frame over time — replicate_df","text":"","code":"df <- data.frame(variable = c(\"a\", \"b\")) replicate_df(df, time_name = \"year\", time_values = 1:3) #> variable year #> 1 a 1 #> 2 b 1 #> 3 a 2 #> 4 b 2 #> 5 a 3 #> 6 b 3 head(qcs_grid) #> X Y depth depth_scaled depth_scaled2 #> 1 456 5636 347.0834 1.5608122 2.43613479 #> 2 458 5636 223.3348 0.5697699 0.32463771 #> 3 460 5636 203.7408 0.3633693 0.13203724 #> 4 462 5636 183.2987 0.1257046 0.01580166 #> 5 464 5636 182.9998 0.1220368 0.01489297 #> 6 466 5636 186.3892 0.1632882 0.02666303 nd <- replicate_df(qcs_grid, \"year\", unique(pcod$year)) head(nd) #> X Y depth depth_scaled depth_scaled2 year #> 1 456 5636 347.0834 1.5608122 2.43613479 2003 #> 2 458 5636 223.3348 0.5697699 0.32463771 2003 #> 3 460 5636 203.7408 0.3633693 0.13203724 2003 #> 4 462 5636 183.2987 0.1257046 0.01580166 2003 #> 5 464 5636 182.9998 0.1220368 0.01489297 2003 #> 6 466 5636 186.3892 0.1632882 0.02666303 2003 table(nd$year) #> #> 2003 2004 2005 2007 2009 2011 2013 2015 2017 #> 7314 7314 7314 7314 7314 7314 7314 7314 7314"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Residuals method for sdmTMB models — 
residuals.sdmTMB","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"See residual-checking vignette: browseVignettes(\"sdmTMB\") documentation site. See notes types residuals 'Details' section .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"","code":"# S3 method for sdmTMB residuals( object, type = c(\"mle-mvn\", \"mle-eb\", \"mle-mcmc\", \"response\", \"pearson\"), model = c(1, 2), mcmc_samples = NULL, qres_func = NULL, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"object sdmTMB() model. type Residual type. See details. model delta/hurdle model component? mcmc_samples vector MCMC samples linear predictor link space. See predict_mle_mcmc() function sdmTMBextra package. qres_func custom quantile residuals function. Function take arguments object, y, mu, ... return vector length length(y). ... Passed custom qres_func function. Unused.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"vector residuals. Note randomization single random effect posterior sample randomized quantile routines result different residuals call. suggested set randomization seed go \"fishing\" perfect residuals present inspected residuals.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"Randomized quantile residuals: mle-mvn, mle-eb, mle-mcmc implementations randomized quantile residuals (Dunn & Smyth 1996), also known probability integral transform (PIT) residuals (Smith 1985). data consistent model assumptions, residuals distributed normal(0, 1). Randomization added account integer binary response observations. example, Poisson observation likelihood observations y mean predictions mu, create randomized quantile residuals : Types residuals: Acronyms: EB: Empirical Bayes MCMC: Markov chain Monte Carlo MLE: Maximum Likelihood Estimate MVN: Multivariate normal mle-mvn: Fixed effects held MLEs random effects taken single approximate posterior sample. \"approximate\" part refers sample taken random effects' assumed MVN distribution. practice, sample obtained based mode Hessian random effects taking advantage sparsity Hessian computational efficiency. sample taken obj$MC(), obj TMB object created TMB::MakeADFun(). See Waagepetersen (2006) description source code internal TMB function TMB:::oneSamplePosterior(). Residuals converted randomized quantile residuals described . mle-eb: Fixed effects held MLEs random effects taken EB estimates. used default residuals sdmTMB (called mle-laplace). available backwards compatibility research purposes recommended checking goodness fit. Residuals converted randomized quantile residuals described . mle-mcmc: Fixed effects held MLEs random effects taken single posterior sample obtained MCMC. excellent option since make assumption distribution random effects (compared mle-mvn option) can slow obtain. 
See Waagepetersen (2006) Thygesen et al. (2017). Residuals converted randomized quantile residuals described . See sdmTMBextra package function predict_mle_mcmc(), can generate MCMC samples pass mcmc_samples argument. Ideally MCMC run convergence last iteration can used residuals. defaults may sufficient many models. response: simple observed minus predicted residuals. pearson: Pearson residuals: response residuals scaled standard deviation. weights present, residuals multiplied sqrt(weights).","code":"a <- ppois(y - 1, mu) b <- ppois(y, mu) u <- runif(n = length(y), min = a, max = b) qnorm(u)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"Dunn, P.K. & Smyth, G.K. (1996). Randomized Quantile Residuals. Journal Computational Graphical Statistics, 5, 236–244. Smith, J.Q. (1985). Diagnostic checks non-standard time series models. Journal Forecasting, 4, 283–291. Waagepetersen, R. (2006). simulation-based goodness--fit test random effects generalized linear mixed models. Scandinavian Journal Statistics, 33(4), 721-731. Thygesen, U.H., Albertsen, C.M., Berg, C.W., Kristensen, K., Nielsen, . 2017. Validation ecological state space models using Laplace approximation. Environ Ecol Stat 24(2): 317–339. doi:10.1007/s10651-017-0372-4 Rufener, M.-C., Kristensen, K., Nielsen, J.R., Bastardie, F. 2021. Bridging gap commercial fisheries survey data model spatiotemporal dynamics marine species. Ecological Applications. e02453. doi:10.1002/eap.2453","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/residuals.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Residuals method for sdmTMB models — residuals.sdmTMB","text":"","code":"mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 10) fit <- sdmTMB( present ~ as.factor(year) + poly(depth, 2), data = pcod_2011, mesh = mesh, family = binomial() ) # the default \"mle-mvn\" residuals use fixed effects at their MLE and a # single sample from the approximate random effect posterior: set.seed(9283) r <- residuals(fit, type = \"mle-mvn\") qqnorm(r) abline(0, 1) # response residuals will be not be normally distributed unless # the family is Gaussian: r <- residuals(fit, type = \"response\") qqnorm(r) abline(0, 1) # \"mle-eb\" are quick but are not expected to be N(0, 1); not recommended: set.seed(2321) r <- residuals(fit, type = \"mle-eb\") qqnorm(r) abline(0, 1) # see also \"mle-mcmc\" residuals with the help of the sdmTMBextra package # we can fake them here by taking a single sample from the joint precision # matrix and pretending they are MCMC samples: set.seed(82728) p <- predict(fit, nsim = 1) # pretend these are from sdmTMBextra::predict_mle_mcmc() r <- residuals(fit, mcmc_samples = p) #> Note what used to be the default sdmTMB residuals (before version 0.4.3.9005) #> are now `type = 'mle-eb'`. We recommend using the current default `'mle-mvn'`, #> which takes one sample from the approximate posterior of the random effects or #> `dharma_residuals()` using a similar approach. 
qqnorm(r) abline(0, 1)"},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"","code":"run_extra_optimization(object, nlminb_loops = 0, newton_loops = 1)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"object object sdmTMB(). nlminb_loops many extra times run stats::nlminb() optimization. Sometimes restarting optimizer previous best values aids convergence. newton_loops many extra Newton optimization loops try stats::optimHess(). Sometimes aids convergence.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"updated model fit class sdmTMB.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/run_extra_optimization.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run extra optimization on an already fitted object — run_extra_optimization","text":"","code":"# Run extra optimization steps to help convergence: # (Not typically needed) fit <- sdmTMB(density ~ 0 + poly(depth, 2) + as.factor(year), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie()) fit_1 <- run_extra_optimization(fit, newton_loops = 1) max(fit$gradients) #> [1] 5.791343e-09 max(fit_1$gradients) #> [1] 5.791343e-09"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":null,"dir":"Reference","previous_headings":"","what":"Sanity check of an sdmTMB model — sanity","title":"Sanity check of an sdmTMB model — sanity","text":"Sanity check sdmTMB model","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Sanity check of an sdmTMB model — sanity","text":"","code":"sanity(object, big_sd_log10 = 2, gradient_thresh = 0.001, silent = FALSE)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Sanity check of an sdmTMB model — sanity","text":"object Fitted model sdmTMB(). big_sd_log10 Value check size standard errors . value 2 indicate standard errors greater 10^2 (.e., 100) flagged. gradient_thresh Gradient threshold issue warning. silent Logical: suppress messages? Useful set TRUE running large numbers models just interested returning sanity list objects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Sanity check of an sdmTMB model — sanity","text":"invisible named list checks.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Sanity check of an sdmTMB model — sanity","text":"object NA, NULL, class \"try-error\", sanity() return FALSE. facilitate using sanity() models try() tryCatch(). 
See examples section.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sanity.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Sanity check of an sdmTMB model — sanity","text":"","code":"fit <- sdmTMB( present ~ s(depth), data = pcod_2011, mesh = pcod_mesh_2011, family = binomial() ) sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large s <- sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large s #> $hessian_ok #> [1] TRUE #> #> $eigen_values_ok #> [1] TRUE #> #> $nlminb_ok #> [1] TRUE #> #> $range_ok #> [1] TRUE #> #> $gradients_ok #> [1] TRUE #> #> $se_magnitude_ok #> [1] TRUE #> #> $se_na_ok #> [1] TRUE #> #> $sigmas_ok #> [1] TRUE #> #> $all_ok #> [1] TRUE #> # If fitting many models in a loop, you may want to wrap # sdmTMB() in try() to handle errors. sanity() will take an object # of class \"try-error\" and return FALSE. # Here, we will use stop() to simulate a failed sdmTMB() fit: failed_fit <- try(stop()) #> Error in try(stop()) : s2 <- sanity(failed_fit) all(unlist(s)) #> [1] TRUE all(unlist(s2)) #> [1] FALSE"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Fit spatial spatiotemporal generalized linear mixed effects model (GLMM) TMB (Template Model Builder) R package SPDE (stochastic partial differential equation) approximation Gaussian random fields.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"","code":"sdmTMB( formula, data, mesh, time = NULL, family = gaussian(link = \"identity\"), spatial = c(\"on\", \"off\"), spatiotemporal = c(\"iid\", \"ar1\", \"rw\", \"off\"), share_range = TRUE, time_varying = NULL, time_varying_type = c(\"rw\", \"rw0\", \"ar1\"), spatial_varying = NULL, weights = NULL, offset = NULL, extra_time = NULL, reml = FALSE, silent = TRUE, anisotropy = FALSE, control = sdmTMBcontrol(), priors = sdmTMBpriors(), knots = NULL, bayesian = FALSE, previous_fit = NULL, do_fit = TRUE, do_index = FALSE, predict_args = NULL, index_args = NULL, experimental = NULL )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"formula Model formula. IID random intercepts possible using lme4 syntax, e.g., + (1 | g) g column class character factor representing groups. Penalized splines possible via mgcv s(). 
Optionally list delta (hurdle) models. See examples details . data data frame. mesh object make_mesh(). time optional time column name (character). Can left NULL model spatial random fields; however, data actually spatiotemporal wish use get_index() get_cog() downstream, supply time argument. family family link. Supports gaussian(), Gamma(), binomial(), poisson(), Beta(), nbinom2(), truncated_nbinom2(), nbinom1(), truncated_nbinom1(), censored_poisson(), gamma_mix(), lognormal_mix(), student(), tweedie(), gengamma(). Supports delta/hurdle models: delta_beta(), delta_gamma(), delta_gamma_mix(), delta_lognormal_mix(), delta_lognormal(), delta_truncated_nbinom2(), binomial family options, see 'Binomial families' Details section . spatial Estimate spatial random fields? Options '' / '' TRUE / FALSE. Optionally, list delta models, e.g. list('', ''). spatiotemporal Estimate spatiotemporal random fields 'iid' (independent identically distributed; default), stationary 'ar1' (first-order autoregressive), random walk ('rw'), fixed 0 ''. set '' time = NULL. delta model, can list. E.g., list('', 'ar1'). Note spatiotemporal standard deviation represents marginal steady-state standard deviation process case AR1. .e., scaled according correlation. See TMB documentation. AR1 correlation coefficient (rho) estimated close 1, say > 0.99, may wish switch random walk 'rw'. Capitalization ignored. TRUE gets converted 'iid' FALSE gets converted ''. share_range Logical: estimate shared spatial spatiotemporal range parameter (TRUE, default) independent range parameters (FALSE). delta model, can list. E.g., list(TRUE, FALSE). time_varying optional one-sided formula describing covariates modelled time-varying process. Set type process time_varying_type. See help time_varying_type warnings modelling first time step. Structure shared delta models. time_varying_type Type time-varying process apply time_varying formula. 'rw' indicates random walk first time step estimated independently (included legacy reasons), 'rw0' indicates random walk first time step estimated mean-zero normal prior, 'ar1' indicates stationary first-order autoregressive process first time step estimated mean-zero prior. case 'rw', careful include covariates (including intercept) main time-varying formula since first time step estimated independently. .e., case, least one ~ 0 ~ -1. Structure shared delta models. spatial_varying optional one-sided formula coefficients vary space random fields. Note likely want include fixed effect variable improve interpretability since random field assumed mean 0. (scaled) time column used, represent local-time-trend model. See doi:10.1111/ecog.05176 spatial trends vignette. Note predictor usually centered mean zero standard deviation approximately 1. spatial intercept controlled spatial argument; therefore, include exclude spatial intercept setting spatial = '' ''. time matters whether spatial_varying excludes intercept case factor predictors. case, spatial_varying excludes intercept (~ 0 ~ -1), set spatial = '' match. Structure must shared delta models. weights numeric vector representing optional likelihood weights conditional model. Implemented glmmTMB: weights sum one internally modified. Can also used trials binomial family; weights argument needs vector name variable data frame. See Details section . offset numeric vector representing model offset character value representing column name offset. delta/hurdle models, applies positive component. Usually log transformed variable. 
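To make the offset argument concrete, here is a sketch of a count model with sampling effort entering as a log offset. The count response and effort column are fabricated from the built-in pcod_2011 data purely for illustration; in practice the effort column would come from the survey design:

d <- pcod_2011
d$count <- round(d$density)             # fabricated count response
d$log_effort <- log(rep(1.5, nrow(d)))  # fabricated (constant) effort
fit_off <- sdmTMB(
  count ~ 0 + as.factor(year),
  offset = "log_effort",  # a column name; a numeric vector also works
  family = nbinom2(link = "log"),
  time = "year",
  data = d,
  mesh = pcod_mesh_2011
)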
extra_time Optional extra time slices (e.g., years) include interpolation forecasting predict function. See Details section . reml Logical: use REML (restricted maximum likelihood) estimation rather maximum likelihood? Internally, adds fixed effects list random effects integrate . silent Silent include optimization details? Helpful set FALSE models take fit. anisotropy Logical: allow anisotropy (spatial correlation directionally dependent)? See plot_anisotropy(). Must shared across delta models. control Optimization control options via sdmTMBcontrol(). priors Optional penalties/priors via sdmTMBpriors(). Must currently shared across delta models. knots Optional named list containing knot values used basis construction smoothing terms. See mgcv::gam() mgcv::gamm(). E.g., s(x, bs = 'cc', k = 4), knots = list(x = c(1, 2, 3, 4)) bayesian Logical indicating model passed tmbstan. TRUE, Jacobian adjustments applied account parameter transformations priors applied. previous_fit previously fitted sdmTMB model initialize optimization . Can greatly speed fitting. Note model must set exactly way. However, data weights arguments can change, can useful cross-validation. do_fit Fit model (TRUE) return processed data without fitting (FALSE)? do_index index standardization calculations fitting? Saves memory time working large datasets projection grids since TMB object rebuilt predict.sdmTMB() get_index(). TRUE, predict_args must newdata element supplied area can supplied index_args. users can ignore option. fitted object can passed directly get_index(). predict_args list arguments pass predict.sdmTMB() do_index = TRUE. users can ignore option. index_args list arguments pass get_index() do_index = TRUE. Currently, area supported. Bias correction can done calling get_index() resulting fitted object. users can ignore option. experimental named list esoteric -development options. dragons.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"object (list) class sdmTMB. Useful elements include: sd_report: output TMB::sdreport() gradients: marginal log likelihood gradients respect fixed effect model: output stats::nlminb() data: fitted data mesh: object supplied mesh argument family: family object, includes inverse link function family$linkinv() tmb_params: parameters list passed TMB::MakeADFun() tmb_map: 'map' list passed TMB::MakeADFun() tmb_data: data list passed TMB::MakeADFun() tmb_obj: TMB object created TMB::MakeADFun()","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Model description See model description vignette relevant appendix preprint sdmTMB: doi:10.1101/2022.03.24.485545 Binomial families Following structure stats::glm() glmmTMB, binomial family can specified one 4 ways: (1) response may factor (model classifies first level versus others), (2) response may binomial (0/1), (3) response can matrix form cbind(success, failure), (4) response may observed proportions, 'weights' argument used specify Binomial size (N) parameter (prob ~ ..., weights = N). Smooth terms Smooth terms can included following GAMs (generalized additive models) using + s(x), implements smooth mgcv::s(). sdmTMB uses penalized smooths, constructed via mgcv::smooth2random(). 
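As a concrete pairing of the smooth-term syntax with the knots argument, a cyclic seasonal smooth could be written as below. This is only a sketch: the month column is fabricated here, and the knot values simply pin the endpoints of the annual cycle so the basis wraps smoothly:

d <- pcod_2011
set.seed(42)
d$month <- sample(1:12, nrow(d), replace = TRUE)  # fabricated seasonal covariate
fit_cc <- sdmTMB(
  density ~ s(month, bs = "cc", k = 4),
  knots = list(month = c(0.5, 4, 8, 12.5)),  # first/last knots wrap the cycle
  data = d,
  mesh = pcod_mesh_2011,
  family = tweedie(link = "log")
)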
similar approach implemented gamm4 brms, among packages. Within smooths, syntax commonly used mgcv::s() mgcv::t2() can applied, e.g. 2-dimensional smooths may constructed + s(x, y) + t2(x, y); smooths can specific various factor levels, + s(x, = group); basis function dimensions may specified, e.g. + s(x, k = 4); various types splines may constructed cyclic splines model seasonality (perhaps knots argument also supplied). Threshold models linear break-point relationship covariate can included via + breakpt(variable) formula, variable single covariate corresponding column data. case, relationship linear point constant (hockey-stick shaped). Similarly, logistic-function threshold model can included via + logistic(variable). option models relationship logistic function 50% 95% values. similar length- size-based selectivity fisheries, parameterized points f(x) = 0.5 0.95. See threshold vignette. Note single threshold covariate can included covariate included components delta families. Extra time: forecasting interpolating Extra time slices (e.g., years) can included interpolation forecasting predict function via extra_time argument. predict function requires time slices defined fitting model ensure various time indices set correctly. careful including extra time slices model remains identifiable. example, including + .factor(year) formula render model data inform expected value missing year. sdmTMB() makes attempt determine model makes sense forecasting interpolation. options time_varying, spatiotemporal = \"rw\", spatiotemporal = \"ar1\", smoother time column provide mechanisms predict missing time slices process error. extra_time can also used fill missing time steps purposes random walk AR(1) process gaps time steps uneven. extra_time can include extra time steps time steps including found fitted data. latter option may simpler. Regularization priors can achieve regularization via penalties (priors) fixed effect parameters. See sdmTMBpriors(). can fit model without penalties look output print(your_model) tidy(your_model) fit model do_fit = FALSE inspect head(your_model$tmb_data$X_ij[[1]]) want see formula translated fixed effect model matrix. Also see Bayesian vignette. Delta/hurdle models Delta models (also known hurdle models) can fit two separate models time using appropriate delta family. E.g.: delta_gamma(), delta_beta(), delta_lognormal(), delta_truncated_nbinom2(). fit delta family, default formula, spatial, spatiotemporal components shared. elements can specified independently two models using list format. include formula, spatial, spatiotemporal, share_range. first element list binomial component second element positive component (e.g., Gamma). elements must shared now (e.g., spatially varying coefficients, time-varying coefficients). Furthermore, currently limitations specifying two formulas list: two formulas smoothers, threshold effects, random intercepts. now, must specified single formula shared across two models. main advantage specifying models using delta family (compared fitting two separate models) (1) coding simplicity (2) calculation uncertainty derived quantities index abundance get_index() using generalized delta method within TMB. Also, selected parameters can shared across models. See delta-model vignette. Index standardization index standardization, may wish include 0 + .factor(year) (whatever time column called) formula. See basic example index standardization relevant package vignette. need specify time argument. 
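Putting the index-standardization pieces together, a minimal sketch with the built-in Queen Charlotte Sound data could look like the following; area = 4 is just the 2 x 2 km cell area of qcs_grid and would change with a different grid:

mesh <- make_mesh(pcod_2011, c("X", "Y"), cutoff = 20)
fit_index <- sdmTMB(
  density ~ 0 + as.factor(year),
  data = pcod_2011, mesh = mesh, time = "year",
  family = tweedie(link = "log")
)
# replicate the prediction grid for every fitted year:
nd <- do.call(rbind, lapply(unique(pcod_2011$year), function(y) {
  g <- qcs_grid
  g$year <- y
  g
}))
p <- predict(fit_index, newdata = nd, return_tmb_object = TRUE)
index <- get_index(p, area = 4, bias_correct = TRUE)
head(index)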
See get_index().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"Main reference introducing package cite using sdmTMB: Anderson, S.C., E.J. Ward, P.. English, L..K. Barnett. 2022. sdmTMB: R package fast, flexible, user-friendly generalized linear mixed effects models spatial spatiotemporal random fields. bioRxiv 2022.03.24.485545; doi:10.1101/2022.03.24.485545 . Reference local trends: Barnett, L..K., E.J. Ward, S.C. Anderson. 2021. Improving estimates species distribution change incorporating local trends. Ecography. 44(3):427-439. doi:10.1111/ecog.05176 . explanation model application calculating climate velocities: English, P., E.J. Ward, C.N. Rooper, R.E. Forrest, L.. Rogers, K.L. Hunter, .M. Edwards, B.M. Connors, S.C. Anderson. 2021. Contrasting climate velocity impacts warm cool locations show effects marine warming worse already warmer temperate waters. Fish Fisheries. 23(1) 239-255. doi:10.1111/faf.12613 . Discussion illustration decision points fitting models: Commander, C.J.C., L..K. Barnett, E.J. Ward, S.C. Anderson, T.E. Essington. 2022. shadow model: small choices spatially explicit species distribution models affect predictions. PeerJ 10: e12783. doi:10.7717/peerj.12783 . Application description threshold/break-point models: Essington, T.E., S.C. Anderson, L..K. Barnett, H.M. Berger, S.. Siedlecki, E.J. Ward. 2022. Advancing statistical models reveal effect dissolved oxygen spatial distribution marine taxa using thresholds physiologically based index. Ecography. 2022: e06249 doi:10.1111/ecog.06249 . Application fish body condition: Lindmark, M., S.C. Anderson, M. Gogina, M. Casini. Evaluating drivers spatiotemporal individual condition bottom-associated marine fish. bioRxiv 2022.04.19.488709. doi:10.1101/2022.04.19.488709 . Several sections original TMB model code adapted VAST R package: Thorson, J.T. 2019. Guidance decisions using Vector Autoregressive Spatio-Temporal (VAST) package stock, ecosystem, habitat climate assessments. Fish. Res. 210:143–161. doi:10.1016/j.fishres.2018.10.013 . Code family R--TMB implementation, selected parameterizations observation likelihoods, general package structure inspiration, idea behind TMB prediction approach adapted glmmTMB R package: Brooks, M.E., K. Kristensen, K.J. van Benthem, . Magnusson, C.W. Berg, . Nielsen, H.J. Skaug, M. Maechler, B.M. Bolker. 2017. glmmTMB Balances Speed Flexibility Among Packages Zero-inflated Generalized Linear Mixed Modeling. R Journal, 9(2):378-400. doi:10.32614/rj-2017-066 . Implementation geometric anisotropy SPDE use random field GLMMs index standardization: Thorson, J.T., .O. Shelton, E.J. Ward, H.J. Skaug. 2015. Geostatistical delta-generalized linear mixed models improve precision estimated abundance indices West Coast groundfishes. ICES J. Mar. Sci. 72(5): 1297–1310. 
doi:10.1093/icesjms/fsu243 .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Fit a spatial or spatiotemporal GLMM with TMB — sdmTMB","text":"","code":"library(sdmTMB) # Build a mesh to implement the SPDE approach: mesh <- make_mesh(pcod_2011, c(\"X\", \"Y\"), cutoff = 20) # - this example uses a fairly coarse mesh so these examples run quickly # - 'cutoff' is the minimum distance between mesh vertices in units of the # x and y coordinates # - 'cutoff = 10' might make more sense in applied situations for this dataset # - or build any mesh in 'fmesher' and pass it to the 'mesh' argument in make_mesh()` # - the mesh is not needed if you will be turning off all # spatial/spatiotemporal random fields # Quick mesh plot: plot(mesh) # Fit a Tweedie spatial random field GLMM with a smoother for depth: fit <- sdmTMB( density ~ s(depth), data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.16 0.34 #> sdepth 1.94 3.13 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.07 #> #> Dispersion parameter: 13.68 #> Tweedie p: 1.58 #> Matérn range: 16.84 #> Spatial SD: 2.20 #> ML criterion at convergence: 2937.789 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Extract coefficients: tidy(fit, conf.int = TRUE) #> # A tibble: 1 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 2.16 0.340 1.50 2.83 tidy(fit, effects = \"ran_par\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 16.8 13.7 3.40 83.3 #> 2 phi 13.7 0.663 12.4 15.0 #> 3 sigma_O 2.20 1.23 0.735 6.59 #> 4 tweedie_p 1.58 0.0153 1.55 1.61 # Perform several 'sanity' checks: sanity(fit) #> ✔ Non-linear minimizer suggests successful convergence #> ✔ Hessian matrix is positive definite #> ✔ No extreme or very small eigenvalues detected #> ✔ No gradients with respect to fixed effects are >= 0.001 #> ✔ No fixed-effect standard errors are NA #> ✔ No standard errors look unreasonably large #> ✔ No sigma parameters are < 0.01 #> ✔ No sigma parameters are > 100 #> ✔ Range parameter doesn't look unreasonably large # Predict on the fitted data; see ?predict.sdmTMB p <- predict(fit) # Predict on new data: p <- predict(fit, newdata = qcs_grid) head(p) #> X Y depth depth_scaled depth_scaled2 est est_non_rf est_rf #> 1 456 5636 347.0834 1.5608122 2.43613479 -4.726638 -4.567385 -0.15925308 #> 2 458 5636 223.3348 0.5697699 0.32463771 2.342470 2.368314 -0.02584421 #> 3 460 5636 203.7408 0.3633693 0.13203724 3.087513 2.979948 0.10756466 #> 4 462 5636 183.2987 0.1257046 0.01580166 3.878560 3.637586 0.24097353 #> 5 464 5636 182.9998 0.1220368 0.01489297 4.020914 3.646532 0.37438240 #> 6 466 5636 186.3892 0.1632882 0.02666303 4.050895 3.543104 0.50779127 #> omega_s #> 1 -0.15925308 #> 2 -0.02584421 #> 3 0.10756466 #> 4 0.24097353 #> 5 0.37438240 #> 6 0.50779127 # \\donttest{ # Visualize the depth effect with ggeffects: ggeffects::ggpredict(fit, \"depth [all]\") |> plot() # Visualize depth effect with visreg: (see ?visreg_delta) visreg::visreg(fit, xvar = \"depth\") # link space; randomized quantile residuals visreg::visreg(fit, xvar = \"depth\", scale = \"response\") visreg::visreg(fit, xvar = \"depth\", scale = \"response\", gg = TRUE, rug = FALSE) # Add 
spatiotemporal random fields: fit <- sdmTMB( density ~ 0 + as.factor(year), time = \"year\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> as.factor(year)2011 2.76 0.36 #> as.factor(year)2013 3.10 0.35 #> as.factor(year)2015 3.21 0.35 #> as.factor(year)2017 2.47 0.36 #> #> Dispersion parameter: 14.83 #> Tweedie p: 1.57 #> Matérn range: 13.31 #> Spatial SD: 3.16 #> Spatiotemporal IID SD: 1.79 #> ML criterion at convergence: 3007.552 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Make the fields AR1: fit <- sdmTMB( density ~ s(depth), time = \"year\", spatial = \"off\", spatiotemporal = \"ar1\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.84 0.33 #> sdepth 1.96 3.27 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.6 #> #> Dispersion parameter: 12.84 #> Tweedie p: 1.55 #> Spatiotemporal AR1 correlation (rho): 0.67 #> Matérn range: 12.22 #> Spatiotemporal marginal AR1 SD: 3.28 #> ML criterion at convergence: 2914.393 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Make the fields a random walk: fit <- sdmTMB( density ~ s(depth), time = \"year\", spatial = \"off\", spatiotemporal = \"rw\", #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.95 0.34 #> sdepth 1.96 3.18 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 13.22 #> #> Dispersion parameter: 12.84 #> Tweedie p: 1.56 #> Matérn range: 14.66 #> Spatiotemporal RW SD: 2.17 #> ML criterion at convergence: 2919.181 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Depth smoothers by year: fit <- sdmTMB( density ~ s(depth, by = as.factor(year)), #< time = \"year\", spatial = \"off\", spatiotemporal = \"rw\", data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth, by = as.factor(year)) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 1.76 0.34 #> sdepth):as.factor(year)2011 0.07 4.02 #> sdepth):as.factor(year)2013 4.59 3.28 #> sdepth):as.factor(year)2015 5.97 6.01 #> sdepth):as.factor(year)2017 -1.97 3.22 #> #> Smooth terms: #> Std. Dev. #> sds(depth):as.factor(year)2011) 16.62 #> sds(depth):as.factor(year)2013) 13.57 #> sds(depth):as.factor(year)2015) 28.24 #> sds(depth):as.factor(year)2017) 18.65 #> #> Dispersion parameter: 12.70 #> Tweedie p: 1.55 #> Matérn range: 8.62 #> Spatiotemporal RW SD: 3.14 #> ML criterion at convergence: 2924.193 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
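# Not part of the original examples: a sketch of geometric anisotropy,
# i.e., letting spatial correlation decay at different rates by direction
# (see the 'anisotropy' argument and plot_anisotropy()):
fit_aniso <- sdmTMB(
  density ~ s(depth),
  data = pcod_2011, mesh = mesh,
  family = tweedie(link = "log"),
  anisotropy = TRUE #<
)
plot_anisotropy(fit_aniso)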
# 2D depth-year smoother: fit <- sdmTMB( density ~ s(depth, year), #< spatial = \"off\", data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth, year) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.55 0.24 #> sdepthyear_1 -0.15 0.09 #> sdepthyear_2 -2.93 2.07 #> #> Smooth terms: #> Std. Dev. #> sds(depth,year) 6.08 #> #> Dispersion parameter: 14.95 #> Tweedie p: 1.60 #> ML criterion at convergence: 2974.143 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Turn off spatial random fields: fit <- sdmTMB( present ~ poly(log(depth)), spatial = \"off\", #< data = pcod_2011, mesh = mesh, family = binomial() ) fit #> Model fit by ML ['sdmTMB'] #> Formula: present ~ poly(log(depth)) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: binomial(link = 'logit') #> #> coef.est coef.se #> (Intercept) -0.16 0.07 #> poly(log(depth)) -13.19 2.14 #> #> ML criterion at convergence: 648.334 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Which, matches glm(): fit_glm <- glm( present ~ poly(log(depth)), data = pcod_2011, family = binomial() ) summary(fit_glm) #> #> Call: #> glm(formula = present ~ poly(log(depth)), family = binomial(), #> data = pcod_2011) #> #> Coefficients: #> Estimate Std. Error z value Pr(>|z|) #> (Intercept) -0.16433 0.06583 -2.496 0.0126 * #> poly(log(depth)) -13.18981 2.14179 -6.158 7.35e-10 *** #> --- #> Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 #> #> (Dispersion parameter for binomial family taken to be 1) #> #> Null deviance: 1337.2 on 968 degrees of freedom #> Residual deviance: 1296.7 on 967 degrees of freedom #> AIC: 1300.7 #> #> Number of Fisher Scoring iterations: 4 #> AIC(fit, fit_glm) #> df AIC #> fit 2 1300.668 #> fit_glm 2 1300.668 # Delta/hurdle binomial-Gamma model: fit_dg <- sdmTMB( density ~ poly(log(depth), 2), data = pcod_2011, mesh = mesh, spatial = \"off\", family = delta_gamma() #< ) fit_dg #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(log(depth), 2) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(log(depth), 2)1 -23.06 3.15 #> poly(log(depth), 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(log(depth), 2)1 -5.49 3.50 #> poly(log(depth), 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
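# Not part of the original examples: a sketch of light regularization via
# sdmTMBpriors(). Here a PC (penalized complexity) prior is placed on the
# spatial Matern range and SD; argument names assumed from the
# sdmTMBpriors()/pc_matern() interface (check ?sdmTMBpriors if in doubt):
fit_prior <- sdmTMB(
  density ~ poly(log(depth), 2),
  data = pcod_2011, mesh = mesh,
  family = tweedie(link = "log"),
  priors = sdmTMBpriors(
    # roughly: P(range < 10) = 0.05 and P(sigma > 5) = 0.05
    matern_s = pc_matern(range_gt = 10, sigma_lt = 5)
  )
)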
# Delta model with different formulas and spatial structure: fit_dg <- sdmTMB( list(density ~ depth_scaled, density ~ poly(depth_scaled, 2)), #< data = pcod_2011, mesh = mesh, spatial = list(\"off\", \"on\"), #< family = delta_gamma() ) fit_dg #> Spatial model fit by ML ['sdmTMB'] #> Formula: list(density ~ depth_scaled, density ~ poly(depth_scaled, 2)) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.17 0.07 #> depth_scaled -0.43 0.07 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.08 0.14 #> poly(depth_scaled, 2)1 -6.15 4.52 #> poly(depth_scaled, 2)2 -12.58 4.14 #> #> Dispersion parameter: 0.72 #> Matérn range: 0.01 #> Spatial SD: 2149.14 #> #> ML criterion at convergence: 3034.512 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! Check output of sanity().** # Delta/hurdle truncated NB2: pcod_2011$count <- round(pcod_2011$density) fit_nb2 <- sdmTMB( count ~ s(depth), data = pcod_2011, mesh = mesh, spatial = \"off\", family = delta_truncated_nbinom2() #< ) fit_nb2 #> Model fit by ML ['sdmTMB'] #> Formula: count ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: delta_truncated_nbinom2(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.69 0.21 #> sdepth 0.29 2.37 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 9.6 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: truncated_nbinom2(link = 'log') #> coef.est coef.se #> (Intercept) 4.18 0.22 #> sdepth -0.37 1.77 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 6.73 #> #> Dispersion parameter: 0.49 #> #> ML criterion at convergence: 2915.733 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Regular NB2: fit_nb2 <- sdmTMB( count ~ s(depth), data = pcod_2011, mesh = mesh, spatial = \"off\", family = nbinom2() #< ) fit_nb2 #> Model fit by ML ['sdmTMB'] #> Formula: count ~ s(depth) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: nbinom2(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.49 0.27 #> sdepth 3.61 3.96 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 16.4 #> #> Dispersion parameter: 0.14 #> ML criterion at convergence: 3006.939 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # IID random intercepts by year: pcod_2011$fyear <- as.factor(pcod_2011$year) fit <- sdmTMB( density ~ s(depth) + (1 | fyear), #< data = pcod_2011, mesh = mesh, family = tweedie(link = \"log\") ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ s(depth) + (1 | fyear) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.13 0.37 #> sdepth 1.82 2.99 #> #> Smooth terms: #> Std. Dev. #> sds(depth) 12.51 #> #> Random intercepts: #> Std. Dev. #> fyear 0.3 #> #> Dispersion parameter: 13.55 #> Tweedie p: 1.58 #> Matérn range: 16.66 #> Spatial SD: 2.21 #> ML criterion at convergence: 2933.138 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
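# Not part of the original examples: a sketch of forecasting with `extra_time`.
# An AR(1) spatiotemporal field supplies the mechanism to project to a year
# without data (2019 here); see the 'Extra time' details above:
fit_fc <- sdmTMB(
  density ~ 1,
  data = pcod_2011, mesh = mesh,
  time = "year",
  extra_time = 2019L, #<
  spatiotemporal = "ar1",
  family = tweedie(link = "log")
)
nd_2019 <- qcs_grid
nd_2019$year <- 2019L
p_2019 <- predict(fit_fc, newdata = nd_2019)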
# Spatially varying coefficient of year: pcod_2011$year_scaled <- as.numeric(scale(pcod_2011$year)) fit <- sdmTMB( density ~ year_scaled, spatial_varying = ~ 0 + year_scaled, #< data = pcod_2011, mesh = mesh, family = tweedie(), time = \"year\" ) fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ year_scaled #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 2.86 0.33 #> year_scaled -0.06 0.15 #> #> Dispersion parameter: 14.79 #> Tweedie p: 1.56 #> Matérn range: 20.56 #> Spatial SD: 2.38 #> Spatially varying coefficient SD (year_scaled): 0.81 #> Spatiotemporal IID SD: 1.11 #> ML criterion at convergence: 3008.886 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # Time-varying effects of depth and depth squared: fit <- sdmTMB( density ~ 0 + as.factor(year), time_varying = ~ 0 + depth_scaled + depth_scaled2, #< data = pcod_2011, time = \"year\", mesh = mesh, family = tweedie() ) print(fit) #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + as.factor(year) #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> as.factor(year)2011 3.73 0.30 #> as.factor(year)2013 3.64 0.28 #> as.factor(year)2015 4.00 0.29 #> as.factor(year)2017 3.31 0.32 #> #> Time-varying parameters: #> coef.est coef.se #> depth_scaled-2011 -0.87 0.16 #> depth_scaled-2013 -0.81 0.13 #> depth_scaled-2015 -0.75 0.13 #> depth_scaled-2017 -1.11 0.23 #> depth_scaled2-2011 -1.92 0.29 #> depth_scaled2-2013 -0.92 0.14 #> depth_scaled2-2015 -1.59 0.22 #> depth_scaled2-2017 -2.20 0.35 #> #> Dispersion parameter: 12.80 #> Tweedie p: 1.56 #> Matérn range: 0.01 #> Spatial SD: 2571.71 #> Spatiotemporal IID SD: 2086.94 #> ML criterion at convergence: 2911.371 #> #> See ?tidy.sdmTMB to extract these values as a data frame. #> #> **Possible issues detected! Check output of sanity().** # Extract values: est <- as.list(fit$sd_report, \"Estimate\") se <- as.list(fit$sd_report, \"Std. Error\") est$b_rw_t[, , 1] #> [,1] [,2] #> [1,] -0.8738038 -1.9199722 #> [2,] -0.8115162 -0.9195934 #> [3,] -0.7514766 -1.5858307 #> [4,] -1.1064060 -2.1992575 se$b_rw_t[, , 1] #> [,1] [,2] #> [1,] 0.1619744 0.2944087 #> [2,] 0.1287059 0.1386946 #> [3,] 0.1346182 0.2197914 #> [4,] 0.2305927 0.3514650 # Linear break-point effect of depth: fit <- sdmTMB( density ~ breakpt(depth_scaled), #< data = pcod_2011, mesh = mesh, family = tweedie() ) fit #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ breakpt(depth_scaled) #> Mesh: mesh (isotropic covariance) #> Data: pcod_2011 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> (Intercept) 4.11 0.33 #> depth_scaled-slope 1.07 0.22 #> depth_scaled-breakpt -1.30 0.25 #> #> Dispersion parameter: 15.19 #> Tweedie p: 1.58 #> Matérn range: 23.57 #> Spatial SD: 1.92 #> ML criterion at convergence: 2997.241 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation with sdmTMB models — sdmTMB_cv","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"Facilitates cross validation sdmTMB models. Returns log likelihood left-data, similar spirit ELPD (expected log pointwise predictive density). function option leave-future-cross validation. 
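For example, leave-future-out cross validation with a one-step-ahead forecast on the built-in pcod data could be sketched as follows; the settings are chosen only to keep the illustration small:

mesh <- make_mesh(pcod, c("X", "Y"), cutoff = 25)
m_lfo <- sdmTMB_cv(
  density ~ 0 + depth_scaled + depth_scaled2,
  data = pcod, mesh = mesh,
  family = tweedie(link = "log"),
  time = "year",        # required for leave-future-out CV
  lfo = TRUE,           # leave-future-out instead of random folds
  lfo_forecast = 1,     # forecast one time step ahead
  lfo_validations = 2   # repeat the fit/forecast step twice
)
m_lfo$sum_loglik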
default, function creates folds randomly folds can manually assigned via fold_ids argument.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"","code":"sdmTMB_cv( formula, data, mesh_args, mesh = NULL, time = NULL, k_folds = 8, fold_ids = NULL, lfo = FALSE, lfo_forecast = 1, lfo_validations = 5, parallel = TRUE, use_initial_fit = FALSE, future_globals = NULL, spde = deprecated(), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"formula Model formula. data data frame. mesh_args Arguments make_mesh(). supplied, mesh reconstructed fold. mesh Output make_mesh(). supplied, mesh constant across folds. time name time column. Leave NULL spatial data. k_folds Number folds. fold_ids Optional vector containing user fold IDs. Can also single string, e.g. \"fold_id\" representing name variable data. Ignored lfo TRUE lfo Whether implement leave-future-(LFO) cross validation data used predict future folds. time argument sdmTMB() must specified. See Details section . lfo_forecast lfo = TRUE, number time steps forecast. Time steps 1, ..., T used predict T + lfo_forecast last forecasted time step used validation. See Details section . lfo_validations lfo = TRUE, number times step LFOCV process. Defaults 5. See Details section . parallel TRUE future::plan() supplied, run parallel. use_initial_fit Fit first fold use parameter values starting values subsequent folds? Can faster many folds. future_globals character vector global variables used within arguments error returned future.apply find object. vector appended TRUE passed argument future.globals future.apply::future_lapply(). Useful global objects used specify arguments like priors, families, etc. spde Depreciated. Use mesh instead. ... arguments required run sdmTMB() model exception weights, used define folds.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"list: data: Original data plus columns fold ID, CV predicted value, CV log likelihood. models: list models; one per fold. fold_loglik: Sum left-log likelihoods per fold. positive values better. sum_loglik: Sum fold_loglik across left-data. positive values better. pdHess: Logical vector: Hessian invertible fold? converged: Logical: pdHess TRUE? max_gradients: Max gradient per fold. Prior sdmTMB version '0.3.0.9002', elpd incorrectly returned log average likelihood, another metric compare models , ELPD. maximum likelihood, ELPD equivalent spirit sum log likelihoods.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"Parallel processing Parallel processing can used setting future::plan(). example: Leave-future-cross validation (LFOCV) example LFOCV 9 time steps, lfo_forecast = 1, lfo_validations = 2: Fit data time steps 1 7, predict validate step 8. Fit data time steps 1 8, predict validate step 9. example LFOCV 9 time steps, lfo_forecast = 2, lfo_validations = 3: Fit data time steps 1 5, predict validate step 7. 
Fit data time steps 1 6, predict validate step 8. Fit data time steps 1 7, predict validate step 9. See example .","code":"library(future) plan(multisession) # now use sdmTMB_cv() ..."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Cross validation with sdmTMB models — sdmTMB_cv","text":"","code":"mesh <- make_mesh(pcod, c(\"X\", \"Y\"), cutoff = 25) # Set parallel processing first if desired with the future package. # See the Details section above. m_cv <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. m_cv$fold_loglik #> [1] -3351.292 -3298.167 m_cv$sum_loglik #> [1] -6649.459 head(m_cv$data) #> # A tibble: 6 × 15 #> year X Y depth density present lat lon depth_mean depth_sd #> #> 1 2003 446. 5793. 201 113. 1 52.3 -130. 5.16 0.445 #> 2 2003 446. 5800. 212 41.7 1 52.3 -130. 5.16 0.445 #> 3 2003 449. 5802. 220 0 0 52.4 -130. 5.16 0.445 #> 4 2003 437. 5802. 197 15.7 1 52.4 -130. 5.16 0.445 #> 5 2003 421. 5771. 256 0 0 52.1 -130. 5.16 0.445 #> 6 2003 418. 5772. 293 0 0 52.1 -130. 5.16 0.445 #> # ℹ 5 more variables: depth_scaled , depth_scaled2 , cv_fold , #> # cv_predicted , cv_loglik m_cv$models[[1]] #> Spatial model fit by ML ['sdmTMB'] #> Formula: density ~ 0 + depth_scaled + depth_scaled2 #> Family: tweedie(link = 'log') #> #> coef.est coef.se #> depth_scaled -2.07 0.23 #> depth_scaled2 -1.57 0.14 #> #> Dispersion parameter: 14.61 #> Tweedie p: 1.64 #> Matérn range: 100.81 #> Spatial SD: 3.11 #> ML criterion at convergence: 3191.001 #> #> See ?tidy.sdmTMB to extract these values as a data frame. m_cv$max_gradients #> [1] 2.788299e-08 6.945591e-10 # \\donttest{ # Create mesh each fold: m_cv2 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh_args = list(xy_cols = c(\"X\", \"Y\"), cutoff = 20), family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Use fold_ids: m_cv3 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod, mesh = mesh, family = tweedie(link = \"log\"), fold_ids = rep(seq(1, 3), nrow(pcod))[seq(1, nrow(pcod))] ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":null,"dir":"Reference","previous_headings":"","what":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"sdmTMB_simulate() uses TMB simulate new data given specified parameter values. 
simulate.sdmTMB(), hand, takes existing model fit simulates new observations optionally new random effects.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"","code":"sdmTMB_simulate( formula, data, mesh, family = gaussian(link = \"identity\"), time = NULL, B = NULL, range = NULL, rho = NULL, sigma_O = NULL, sigma_E = NULL, sigma_Z = NULL, phi = NULL, tweedie_p = NULL, df = NULL, threshold_coefs = NULL, fixed_re = list(omega_s = NULL, epsilon_st = NULL, zeta_s = NULL), previous_fit = NULL, seed = sample.int(1e+06, 1), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"formula one-sided formula describing fixed-effect structure. Random intercepts (yet) supported. Fixed effects match corresponding B argument vector coefficient values. data data frame containing predictors described formula time column time specified. mesh Output make_mesh(). family Family sdmTMB(). Delta families supported. Instead, simulate two component models separately combine. time time column name. B vector beta values (fixed-effect coefficient values). range Parameter controls decay spatial correlation. vector length 2, share_range set FALSE spatial spatiotemporal ranges unique. rho Spatiotemporal correlation years; -1 1. sigma_O SD spatial process (Omega). sigma_E SD spatiotemporal process (Epsilon). sigma_Z SD spatially varying coefficient field (Zeta). phi Observation error scale parameter (e.g., SD Gaussian). tweedie_p Tweedie p (power) parameter; 1 2. df Student-t degrees freedom. threshold_coefs optional vector threshold coefficient values formula includes breakpt() logistic(). breakpt(), slope cut values. logistic(), threshold function 50% maximum, threshold function 95% maximum, maximum. See model description vignette details. fixed_re list optional random effects fix specified (e.g., previously estimated) values. Values NULL result random effects simulated. previous_fit (Deprecated; please use simulate.sdmTMB()). optional previous sdmTMB() fit pull parameter values. -ruled non-NULL specified parameter arguments. seed Seed number. ... arguments pass sdmTMB().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"data frame : 1st column time variable (present). 2nd 3rd columns spatial coordinates. omega_s represents simulated spatial random effects (present). zeta_s represents simulated spatial varying covariate field (present). epsilon_st represents simulated spatiotemporal random effects (present). eta true value link space mu true value inverse link space. observed represents simulated process observation error. 
remaining columns fixed-effect model matrix.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_simulate.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Simulate from a spatial/spatiotemporal model — sdmTMB_simulate","text":"","code":"set.seed(123) # make fake predictor(s) (a1) and sampling locations: predictor_dat <- data.frame( X = runif(300), Y = runif(300), a1 = rnorm(300), year = rep(1:6, each = 50) ) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) sim_dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, time = \"year\", mesh = mesh, family = gaussian(), range = 0.5, sigma_E = 0.1, phi = 0.1, sigma_O = 0.2, seed = 42, B = c(0.2, -0.4) # B0 = intercept, B1 = a1 slope ) head(sim_dat) #> year X Y omega_s epsilon_st mu eta #> 1 1 0.2875775 0.784575267 -0.02131861 -0.02779393 0.4369843 0.4369843 #> 2 1 0.7883051 0.009429905 0.28852319 0.09092583 0.8805246 0.8805246 #> 3 1 0.4089769 0.779065883 0.13541643 -0.08468148 0.6261504 0.6261504 #> 4 1 0.8830174 0.729390652 0.28597232 -0.01660011 0.8903775 0.8903775 #> 5 1 0.9404673 0.630131853 0.21070545 0.02005202 0.6056213 0.6056213 #> 6 1 0.0455565 0.480910830 -0.08071932 -0.11409909 -0.1272901 -0.1272901 #> observed (Intercept) a1 #> 1 0.4176273 1 -0.7152422 #> 2 0.8802910 1 -0.7526890 #> 3 0.6248675 1 -0.9385387 #> 4 0.9055722 1 -1.0525133 #> 5 0.6654724 1 -0.4371595 #> 6 -0.1399113 1 0.3311792 if (require(\"ggplot2\", quietly = TRUE)) { ggplot(sim_dat, aes(X, Y, colour = observed)) + geom_point() + facet_wrap(~year) + scale_color_gradient2() } # fit to the simulated data: fit <- sdmTMB(observed ~ a1, data = sim_dat, mesh = mesh, time = \"year\") fit #> Spatiotemporal model fit by ML ['sdmTMB'] #> Formula: observed ~ a1 #> Mesh: mesh (isotropic covariance) #> Time column: year #> Data: sim_dat #> Family: gaussian(link = 'identity') #> #> coef.est coef.se #> (Intercept) 0.23 0.09 #> a1 -0.39 0.01 #> #> Dispersion parameter: 0.09 #> Matérn range: 0.40 #> Spatial SD: 0.21 #> Spatiotemporal IID SD: 0.11 #> ML criterion at convergence: -162.527 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"approach described Yao et al. (2018) doi:10.1214/17-BA1091 . general method minimizes (maximizes) quantity across models. simple models normal error, may root mean squared error (RMSE), approaches include log score. adopt latter , log scores used generate stacking predictive distributions","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"","code":"sdmTMB_stacking(model_list, include_folds = NULL)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"model_list list models fit sdmTMB_cv() generate estimates predictive densities. want set seed value fitting model manually construct fold IDs across models. 
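One way to guarantee matching folds is to construct the fold IDs once and pass the same vector to every sdmTMB_cv() call, for example (a sketch with only two folds to keep it quick):

set.seed(1)
ids <- sample(1:2, size = nrow(pcod_2011), replace = TRUE)
m1 <- sdmTMB_cv(density ~ 0 + depth_scaled + depth_scaled2,
  data = pcod_2011, mesh = pcod_mesh_2011,
  family = tweedie(link = "log"), fold_ids = ids)
m2 <- sdmTMB_cv(density ~ 0 + depth_scaled,
  data = pcod_2011, mesh = pcod_mesh_2011,
  family = tweedie(link = "log"), fold_ids = ids)
sdmTMB_stacking(list(m1, m2))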
include_folds optional numeric vector specifying folds include calculations. example, 5 folds used k-fold cross validation, first 4 needed generate weights, include_folds = 1:4.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"vector model weights.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"references","dir":"Reference","previous_headings":"","what":"References","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"Yao, Y., Vehtari, ., Simpson, D., Gelman, . 2018. Using Stacking Average Bayesian Predictive Distributions (Discussion). Bayesian Analysis 13(3): 917–1007. International Society Bayesian Analysis. doi:10.1214/17-BA1091","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMB_stacking.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform stacking with log scores on sdmTMB_cv() output — sdmTMB_stacking","text":"","code":"# \\donttest{ # Set parallel processing if desired. See 'Details' in ?sdmTMB_cv # Depth as quadratic: set.seed(1) m_cv_1 <- sdmTMB_cv( density ~ 0 + depth_scaled + depth_scaled2, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Depth as linear: set.seed(1) m_cv_2 <- sdmTMB_cv( density ~ 0 + depth_scaled, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. # Only an intercept: set.seed(1) m_cv_3 <- sdmTMB_cv( density ~ 1, data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie(link = \"log\"), k_folds = 2 ) #> Running fits with `future.apply()`. #> Set a parallel `future::plan()` to use parallel processing. models <- list(m_cv_1, m_cv_2, m_cv_3) weights <- sdmTMB_stacking(models) weights #> [1] 0.9038042 0.0182349 0.0779609 # }"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":null,"dir":"Reference","previous_headings":"","what":"Optimization control options — sdmTMBcontrol","title":"Optimization control options — sdmTMBcontrol","text":"sdmTMB() stats::nlminb() control options.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Optimization control options — sdmTMBcontrol","text":"","code":"sdmTMBcontrol( eval.max = 2000L, iter.max = 1000L, normalize = FALSE, nlminb_loops = 1L, newton_loops = 1L, mgcv = deprecated(), quadratic_roots = FALSE, start = NULL, map_rf = deprecated(), map = NULL, lower = NULL, upper = NULL, censored_upper = NULL, multiphase = TRUE, profile = FALSE, get_joint_precision = TRUE, parallel = getOption(\"sdmTMB.cores\", 1L), ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Optimization control options — sdmTMBcontrol","text":"eval.max Maximum number evaluations objective function allowed. iter.max Maximum number iterations allowed. normalize Logical: use TMB::normalize() normalize process likelihood using Laplace approximation? 
Can result substantial speed boost cases. used default FALSE prior May 2021. Currently working models fit REML random intercepts. nlminb_loops many times run stats::nlminb() optimization. Sometimes restarting optimizer previous best values aids convergence. maximum gradient still large, try increasing 2. newton_loops many Newton optimization steps try running stats::nlminb(). sometimes aids convergence reducing log-likelihood gradient respect fixed effects. calculates Hessian current MLE stats::optimHess() using finite-difference approach uses update fixed effect estimates. mgcv Deprecated Parse formula mgcv::gam()? quadratic_roots Experimental feature internal use right now; may moved branch. Logical: quadratic roots calculated? Note: sdmTMB side, first two coefficients used generate quadratic parameters. means want generate quadratic profile depth, depth depth^2 part formula, need make sure listed first intercept included. example, formula = cpue ~ 0 + depth + depth2 + .factor(year). start named list specifying starting values parameters. can see necessary structure fitting model inspecting your_model$tmb_obj$env$parList(). Elements start specified replace default starting values. map_rf Deprecated use spatial = '', spatiotemporal = '' sdmTMB(). map named list factor NAs specifying parameter values fixed constant value. See documentation TMB::MakeADFun(). usually used start specify fixed value. lower optional named list lower bounds within optimization. Parameter vectors name (e.g., b_j ln_kappa cases) can specified numeric vector. E.g. lower = list(b_j = c(-5, -5)). upper optional named list upper bounds within optimization. censored_upper optional vector upper bounds sdmTMBcontrol(). Values NA indicate unbounded right-censored distribution, values greater observation indicate upper bound, values equal observation indicate censoring. multiphase Logical: estimate fixed random effects phases? Phases usually faster stable. profile Logical: population-level/fixed effects profiled likelihood? appended random effects vector without Laplace approximation. See TMB::MakeADFun(). can dramatically speed model fit many fixed effects experimental stage. get_joint_precision Logical. Passed getJointPrecision TMB::sdreport(). Must TRUE use simulation-based methods predict.sdmTMB() [get_index_sims()]. needed, setting FALSE reduce object size. parallel Argument currently ignored. parallel processing 3 cores, example, use TMB::openmp(n = 3, DLL = \"sdmTMB\"). careful, always faster cores definitely upper limit. ... Anything else. See 'Control parameters' section stats::nlminb().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Optimization control options — sdmTMBcontrol","text":"list control arguments","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Optimization control options — sdmTMBcontrol","text":"Usually used within sdmTMB(). 
example:","code":"sdmTMB(..., control = sdmTMBcontrol(newton_loops = 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/sdmTMBcontrol.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Optimization control options — sdmTMBcontrol","text":"","code":"sdmTMBcontrol() #> $eval.max #> [1] 2000 #> #> $iter.max #> [1] 1000 #> #> $normalize #> [1] FALSE #> #> $nlminb_loops #> [1] 1 #> #> $newton_loops #> [1] 1 #> #> $profile #> [1] FALSE #> #> $quadratic_roots #> [1] FALSE #> #> $start #> NULL #> #> $map #> NULL #> #> $lower #> NULL #> #> $upper #> NULL #> #> $censored_upper #> NULL #> #> $multiphase #> [1] TRUE #> #> $parallel #> [1] 1 #> #> $get_joint_precision #> [1] TRUE #>"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Set delta model for ggeffects::ggpredict() — set_delta_model","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"Set delta model component predict ggeffects::ggpredict().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"","code":"set_delta_model(x, model = c(NA, 1, 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"x sdmTMB() model fit delta family delta_gamma(). model delta/hurdle model component predict/plot . NA combined prediction, 1 binomial part, 2 positive part.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"fitted model new attribute named delta_model_predict. suggest use set_delta_model() pipe (examples) attribute persist. Otherwise, predict.sdmTMB() choose model component default. can also remove attribute :","code":"attr(fit, \"delta_model_predict\") <- NULL"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"complete version examples : run CRAN version ggeffects > 1.3.2 CRAN. now, can install GitHub version ggeffects. 
https://github.com/strengejacke/ggeffects.","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2), data = pcod_2011, spatial = \"off\", family = delta_gamma()) # binomial part: set_delta_model(fit, model = 1) |> ggeffects::ggpredict(\"depth_scaled [all]\") # gamma part: set_delta_model(fit, model = 2) |> ggeffects::ggpredict(\"depth_scaled [all]\") # combined: set_delta_model(fit, model = NA) |> ggeffects::ggpredict(\"depth_scaled [all]\")"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/set_delta_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set delta model for ggeffects::ggpredict() — set_delta_model","text":"","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2), data = pcod_2011, spatial = \"off\", family = delta_gamma()) # binomial part: set_delta_model(fit, model = 1) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. # gamma part: set_delta_model(fit, model = 2) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame. 
# combined: set_delta_model(fit, model = NA) #> Model fit by ML ['sdmTMB'] #> Formula: density ~ poly(depth_scaled, 2) #> Mesh: NULL (isotropic covariance) #> Data: pcod_2011 #> Family: delta_gamma(link1 = 'logit', link2 = 'log') #> #> Delta/hurdle model 1: ----------------------------------- #> Family: binomial(link = 'logit') #> coef.est coef.se #> (Intercept) -0.48 0.09 #> poly(depth_scaled, 2)1 -23.06 3.15 #> poly(depth_scaled, 2)2 -48.79 4.45 #> #> #> Delta/hurdle model 2: ----------------------------------- #> Family: Gamma(link = 'log') #> coef.est coef.se #> (Intercept) 4.24 0.08 #> poly(depth_scaled, 2)1 -5.49 3.50 #> poly(depth_scaled, 2)2 -13.26 3.23 #> #> Dispersion parameter: 0.64 #> #> ML criterion at convergence: 2936.579 #> #> See ?tidy.sdmTMB to extract these values as a data frame."},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"simulate.sdmTMB S3 method producing matrix simulations fitted model. similar lme4::simulate.merMod() glmmTMB::simulate.glmmTMB(). can used DHARMa package among uses.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"","code":"# S3 method for sdmTMB simulate( object, nsim = 1L, seed = sample.int(1e+06, 1L), type = c(\"mle-eb\", \"mle-mvn\"), model = c(NA, 1, 2), re_form = NULL, mcmc_samples = NULL, silent = TRUE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"object sdmTMB model nsim Number response lists simulate. Defaults 1. seed Random number seed type parameters treated. \"mle-eb\": fixed effects maximum likelihood (MLE) estimates random effects empirical Bayes (EB) estimates. \"mle-mvn\": fixed effects MLEs random effects taken single approximate sample. latter option suggested approach simulations used goodness fit testing (e.g., DHARMa package). model delta/hurdle model, model simulate ? NA = combined, 1 = first model, 2 = second model. re_form NULL specify simulation conditional fitted random effects (simulates observation error). ~0 NA simulate new random effects (smoothers, internally random effects, simulated new). mcmc_samples optional matrix MCMC samples. See extract_mcmc() sdmTMBextra package. silent Logical. Silent? ... 
Extra arguments (used)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"Returns matrix; number columns nsim.","code":""},{"path":[]},{"path":"https://pbs-assess.github.io/sdmTMB/reference/simulate.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Simulate from a fitted sdmTMB model — simulate.sdmTMB","text":"","code":"# start with some data simulated from scratch: set.seed(1) predictor_dat <- data.frame(X = runif(300), Y = runif(300), a1 = rnorm(300)) mesh <- make_mesh(predictor_dat, xy_cols = c(\"X\", \"Y\"), cutoff = 0.1) dat <- sdmTMB_simulate( formula = ~ 1 + a1, data = predictor_dat, mesh = mesh, family = poisson(), range = 0.5, sigma_O = 0.2, seed = 42, B = c(0.2, -0.4) # B0 = intercept, B1 = a1 slope ) fit <- sdmTMB(observed ~ 1 + a1, data = dat, family = poisson(), mesh = mesh) # simulate from the model: s1 <- simulate(fit, nsim = 300) dim(s1) #> [1] 300 300 # test whether fitted models are consistent with the observed number of zeros: sum(s1 == 0)/length(s1) #> [1] 0.3297667 sum(dat$observed == 0) / length(dat$observed) #> [1] 0.3466667 # simulate with random effects sampled from their approximate posterior s2 <- simulate(fit, nsim = 1, type = \"mle-mvn\") # these may be useful in conjunction with DHARMa simulation-based residuals # simulate with new random fields: s3 <- simulate(fit, nsim = 1, re_form = ~ 0)"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":null,"dir":"Reference","previous_headings":"","what":"Example fish survey data — pcod","title":"Example fish survey data — pcod","text":"Various fish survey datasets.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Example fish survey data — pcod","text":"","code":"pcod pcod_2011 pcod_mesh_2011 qcs_grid dogfish yelloweye hbll_s_grid wcvi_grid"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/surveydata.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Example fish survey data — pcod","text":"pcod: Trawl survey data Pacific Cod Queen Charlotte Sound. data frame. pcod_2011: version pcod years 2011 (smaller speed). data frame. pcod_mesh_2011: mesh pre-built pcod_2011 examples. list class sdmTMBmesh. qcs_grid 2x2km prediction grid Queen Charlotte Sound. data frame. dogfish: Trawl survey data Pacific Spiny Dogfish West Coast Vancouver Island. data frame. yelloweye: Survey data Yelloweye Rockfish Hard Bottom Longline Survey (South) West Coast Vancouver Island. hbll_s_grid: survey domain grid go yelloweye. data frame. wcvi_grid: survey domain grid go dogfish. 
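Relating to the simulate.sdmTMB() entry above: a minimal sketch, assuming the DHARMa package is installed, of one way the returned simulation matrix could be turned into simulation-based residuals. The createDHARMa() call and its argument choices are illustrative assumptions rather than sdmTMB documentation; the package also provides dharma_residuals() for this purpose (see the changelog entries below).
s <- simulate(fit, nsim = 300, type = "mle-mvn")  # "mle-mvn" is the option suggested for goodness-of-fit testing
res <- DHARMa::createDHARMa(
  simulatedResponse = s,            # one column per simulated data set
  observedResponse = dat$observed,  # observed response from the example above
  integerResponse = TRUE            # the example uses a Poisson family
)
plot(res)  # standard DHARMa residual diagnostics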
data frame.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":null,"dir":"Reference","previous_headings":"","what":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"Turn sdmTMB model output tidy data frame","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"","code":"# S3 method for sdmTMB tidy( x, effects = c(\"fixed\", \"ran_pars\", \"ran_vals\"), model = 1, conf.int = FALSE, conf.level = 0.95, exponentiate = FALSE, silent = FALSE, ... )"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"x Output sdmTMB(). effects character value. One \"fixed\" ('fixed' main-effect parameters), \"ran_pars\" (standard deviations, spatial range, random effect dispersion-related terms), \"ran_vals\" (individual random intercepts, included; behaves like ranef()). model model tidy delta model (1 2). conf.int Include confidence interval? conf.level Confidence level CI. exponentiate Whether exponentiate fixed-effect coefficient estimates confidence intervals. silent Omit messages? ... Extra arguments (used).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"data frame","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"Follows conventions broom broom.mixed packages. Currently, effects = \"ran_pars\" also includes dispersion-related terms (e.g., phi), actually associated random effects. Standard errors spatial variance terms fit log space (e.g., variance terms, range, parameters associated observation error) omitted avoid confusion. 
Confidence intervals still available.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/tidy.sdmTMB.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Turn sdmTMB model output into a tidy data frame — tidy.sdmTMB","text":"","code":"fit <- sdmTMB(density ~ poly(depth_scaled, 2, raw = TRUE), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie() ) tidy(fit) #> # A tibble: 3 × 3 #> term estimate std.error #> #> 1 (Intercept) 3.65 0.281 #> 2 poly(depth_scaled, 2, raw = TRUE)1 -1.54 0.186 #> 3 poly(depth_scaled, 2, raw = TRUE)2 -1.11 0.101 tidy(fit, conf.int = TRUE) #> # A tibble: 3 × 5 #> term estimate std.error conf.low conf.high #> #> 1 (Intercept) 3.65 0.281 3.10 4.20 #> 2 poly(depth_scaled, 2, raw = TRUE)1 -1.54 0.186 -1.90 -1.17 #> 3 poly(depth_scaled, 2, raw = TRUE)2 -1.11 0.101 -1.31 -0.913 tidy(fit, \"ran_pars\", conf.int = TRUE) #> # A tibble: 4 × 5 #> term estimate std.error conf.low conf.high #> #> 1 range 19.1 14.0 4.58 80.0 #> 2 phi 14.0 0.677 12.8 15.4 #> 3 sigma_O 2.14 0.941 0.906 5.07 #> 4 tweedie_p 1.58 0.0153 1.55 1.61 pcod_2011$fyear <- as.factor(pcod_2011$year) fit <- sdmTMB(density ~ poly(depth_scaled, 2, raw = TRUE) + (1 | fyear), data = pcod_2011, mesh = pcod_mesh_2011, family = tweedie() ) tidy(fit, \"ran_vals\") #> # A tibble: 4 × 3 #> term estimate std.error #> #> 1 fyear_2011 0.0163 0.187 #> 2 fyear_2013 0.177 0.188 #> 3 fyear_2015 0.232 0.189 #> 4 fyear_2017 -0.431 0.205"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot sdmTMB models with the visreg package — visreg_delta","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"sdmTMB models fit regular (non-delta) families can passed visreg::visreg() visreg::visreg2d() directly. Examples shown . Delta models can use helper functions visreg_delta() visreg2d_delta() described .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"","code":"visreg_delta(object, ..., model = c(1, 2)) visreg2d_delta(object, ..., model = c(1, 2))"},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"object Fit sdmTMB() ... arguments passed visreg::visreg() visreg::visreg2d() model 1st 2nd delta model","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"plot visreg package. Optionally, data plotted invisibly plot = FALSE. 
useful want make plot .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"Note residuals currently randomized quantile residuals, deviance residuals usual GLMs visreg.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/reference/visreg_delta.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot sdmTMB models with the visreg package — visreg_delta","text":"","code":"if (require(\"ggplot2\", quietly = TRUE) && require(\"visreg\", quietly = TRUE)) { # \\donttest{ fit <- sdmTMB( density ~ s(depth_scaled), data = pcod_2011, spatial = \"off\", family = tweedie() ) visreg::visreg(fit, xvar = \"depth_scaled\") visreg::visreg(fit, xvar = \"depth_scaled\", scale = \"response\") v <- visreg::visreg(fit, xvar = \"depth_scaled\") head(v$fit) # now use ggplot2 etc. if desired # Delta model example: fit_dg <- sdmTMB( density ~ s(depth_scaled, year, k = 8), data = pcod_2011, mesh = pcod_mesh_2011, spatial = \"off\", family = delta_gamma() ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, gg = TRUE) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, gg = TRUE) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 1, scale = \"response\", gg = TRUE ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, scale = \"response\" ) visreg_delta(fit_dg, xvar = \"depth_scaled\", model = 2, scale = \"response\", gg = TRUE, rug = FALSE ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 2, scale = \"response\" ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 1, scale = \"response\", plot.type = \"persp\" ) visreg2d_delta(fit_dg, xvar = \"depth_scaled\", yvar = \"year\", model = 2, scale = \"response\", plot.type = \"gg\" ) # } } #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 1. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component. #> These are residuals for delta model component 2. Use the `model` argument to #> select the other component."},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-development-version","dir":"Changelog","previous_headings":"","what":"sdmTMB (development version)","title":"sdmTMB (development version)","text":"Pass several arguments DHARMa::plotQQunif() set uniformity p-value argument TRUE default. Add silent option simulate.sdmTMB(). Setting FALSE allows monitoring simulations larger models. Fix bug est_non_rf1 est_non_rf2 columns following conditions true: predicting new data using delta model including IID random intercepts time-varying coefficients See #342. Thanks @tom-peatman issue report. Fix delta-gamma binomial link printing type = 'poisson-link' #340 Add suggestion use optimized BLAS library README. Add warning ’s detected problems reloading (e.g., readRDS()) fitted model. Simultaneously revert approach reloaded models reattached. Check random effects haven’t reverted zero readRDS() fitted model. E.g. get_pars(fit)$omega_s spatial random fields enabled. 
Move log_ratio_mix parameter 2nd phase starting value -1 instead 0 improve convergence. Fix bugs nbinom1() nbinom2_mix() simulation. Allow profile argument control list take character vector parameters. Add theoretical quantile residuals generalized gamma distribution. Thanks J.C. Dunic. #333 Add poisson-link delta-mixture lognormal. Fix bug simulation Poisson-link delta models. Simplify internal treatment extra time slices (extra_time). #329 less bug prone also fixes recently created bug. #335","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-050","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.5.0","title":"sdmTMB 0.5.0","text":"CRAN release: 2024-04-03 Overhaul residuals vignette (‘article’) https://pbs-assess.github.io/sdmTMB/articles/web_only/residual-checking.html including brief intros randomized quantile residuals, simulation-based residuals, ‘one-sample’ residuals, uniform vs. Gaussian residuals. Add check prediction coordinates appear outside fitted coordinates. #285 Fix memory issue Tweedie family large datasets. #302 Add experimental option return standard normal residuals dharma_residuals(). Make simulate.sdmTMB() include extra_time elements. Improved re-initialization saved fitted model objects new sessions. Fix important bug simulate.sdmTMB() method delta families positive linear predictor getting simulated observations present fitted data. Add new \"mle-mvn\" type residuals.sdmTMB() make default. fast option evaluating goodness fit better previous default. See details section ?residuals.sdmTMB details. previous default now called \"mvn-eb\" recommended. Bring dharma_residuals() back sdmTMBextra sdmTMB. Add new option type argument (\"mle-mvn\") make simulation residuals consistent expected distribution. See new documentation ?residuals.sdmTMB. examples ?dharma_residuals illustrate suggested use. Fix bug sanity() gradient checks missing abs() large negative gradients weren’t getting caught. #324 Return offset vector fitted object element. Ensure extra time rows data data element fitted object include extra time slices. Add experimental residuals option “mle-mvn” single approximate posterior sample random effects drawn combined MLE fixed effects produce residuals. may become default option. Add generalized gamma distribution (thanks J.T. Thorson additional work J.C. Dunic.) See gengamma(). distribution still testing phase recommended applied use yet. #286 Detect possible issue factor(time) formula column name used time extra_time specified. #320 Improve sanity() check output NA fixed effect standard errors. Set intern = FALSE within index bias correction, seems considerably faster testing models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-043","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.3","title":"sdmTMB 0.4.3","text":"CRAN release: 2024-02-29 Fix bug likely introduced July 2023 caused issues extra_time specified. important bug models fit extra_time date (using GitHub version) v0.4.2.9004 (2024-02-24) checked current version sdmTMB (v0.4.2.9005 greater). CRAN, affected v0.4.0 (2023-10-20) v0.4.2. Details: essence bug extra_time works padding data fake row data every extra time element (using first row data template). supposed omitted likelihood impact model fitting beyond spacing time-series processes appropriately setting internal structures forecasting. Unfortunately, bug introduced caused fake data (1 per extra time element) included likelihood. 
Issue error time column NAs. #298 #299 Fix bug get_cog(..., format = \"wide\") time column hardcoded \"year\" accident. Poisson-link delta models now use type argument delta_gamma() delta_lognormal(). delta_poisson_link_gamma() delta_poisson_link_lognormal() deprecated. #290 Delta families can now pass links different default \"logit\" \"log\". #290","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-042","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.2","title":"sdmTMB 0.4.2","text":"CRAN release: 2024-01-18 Force rebuild CRAN binaries fix issue breaking Matrix ABI change causing NaN gradient errors. #288 #287 Fix crash sdmTMB(..., do_index = TRUE) extra_time supplied along predict_args = list(newdata = ...) lacked extra_time elements. Allow get_index() work missing time elements. Add ability pass custom randomized quantile function qres_func residuals.sdmTMB(). Add check factor random intercept columns newdata avoid crash. #278 #280 Improve warnings/errors around use do_index = TRUE get_index() newdata = NULL. #276 Fix prediction offset newdata NULL offset specified. #274 Fix prediction failure offset nsim provided model includes extra_time. #273","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-041","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.1","title":"sdmTMB 0.4.1","text":"CRAN release: 2023-11-03 Fix memory issues detected CRAN ‘Additional issues’ clang-UBSAN, valgrind. Fix bug predicting new data specified offset extra_time. #270 Add warning around non-factor handling spatial_varying formula. #269 Add experimental set_delta_model() plotting delta models ggeffects::ggpredict() (GitHub version next CRAN version).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-040","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.4.0","title":"sdmTMB 0.4.0","text":"CRAN release: 2023-10-20 Move add_barrier_mesh() sdmTMBextra avoid final INLA dependency. https://github.com/pbs-assess/sdmTMBextra Switch using new fmesher package mesh/SPDE calculations. INLA longer dependency. Switch diagonal.penalty = FALSE mgcv::smoothCon(). changes scale linear component smoother, result model. https://github.com/glmmTMB/glmmTMB/issues/928#issuecomment-1642862066 Implement cross validation delta models #239 Remove ELPD cross validation output. Use sum_loglik instead. #235 Turn Newton optimization default. #182 print() now checks sanity() issues warning may issues. #176 Poisson-link delta models censored likelihood distributions made considerably robust. #186 Standard errors now available SD parameters etc. tidy() #240 Fix bug print()/tidy() delta-model positive model component sigma_E. recently introduce bug causing sigma_E 2nd model reported 1st model component sigma_E. Add new anisotropy plotting function. Add anisotropic range printing. #149 @jdunic","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-030","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.3.0","title":"sdmTMB 0.3.0","text":"CRAN release: 2023-01-28 Create sdmTMBextra package remove rstan/tmbstan helpers, causing memory sanitizer errors CRAN. https://github.com/pbs-assess/sdmTMBextra following functions affected: predict.sdmTMB() now takes mcmc_samples, output sdmTMBextra::extract_mcmc(). simulate.sdmTMB() now takes mcmc_samples, output sdmTMBextra::extract_mcmc(). residuals.sdmTMB() now takes mcmc_samples, output sdmTMBextra::predict_mle_mcmc(). 
affects residuals(..., type = \"mle-mcmc\"). Move dharma_residuals() sdmTMBextra reduce heavy dependencies. See examples Bayesian residuals vignettes help files functions within sdmTMBextra.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-022","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.2","title":"sdmTMB 0.2.2","text":"Various fixes pass CRAN checks. #158 Fix memory issue highlighted Additional issues CRAN checks. #158 ‘offset’ argument can now character value indicating column name. preferred way using offset parallel cross validation. #165 Fix parallel cross validation using offset vector. #165 Add leave-future-cross validation functionality. #156 Example data qcs_grid longer replicated year save package space. #158 Add message tidy(fit, \"ran_pars\") SEs NA. Add anisotropy print() #157 Fix predict(..., type = \"response\", se_fit = TRUE), involves issuing warning sticking link space. #140","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-021","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.1","title":"sdmTMB 0.2.1","text":"CRAN release: 2023-01-10 Fixes resubmission CRAN.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-020","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.2.0","title":"sdmTMB 0.2.0","text":"Initial submission CRAN.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-014","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.4","title":"sdmTMB 0.1.4","text":"Relax range parameter sanity() check 1x 1.5x greatest distance data. Add Pearson residuals several families. residuals(fit, type = \"pearson\") Useful checking overdispersion N > 1 binomial Poisson families, among uses. See overdisp_fun() function : https://bbolker.github.io/mixedmodels-misc/glmmFAQ.html#testing--overdispersioncomputing-overdispersion-factor Fix bug using residuals() simulate() binomial families specified via cbind() weights = N. binomial sample size wasn’t passed typically resulting Inf/-Inf. Add mixture families: gamma_mix(), lognormal_mix() associated delta/hurdle families: delta_gamma_mix(), delta_lognormal_mix(). families feature mixture two distributions different means shared variance parameters. Add delta_beta() family.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-013","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.3","title":"sdmTMB 0.1.3","text":"Tweak sanity() checking standard error size. Export previously experimental plot_anisotropy() function. old function now plot_anisotropy2(). Allow passing offset data predict.sdmTMB() via offset argument.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-012","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.2","title":"sdmTMB 0.1.2","text":"Switch effects = 'ran_vals' random intercept values tidy.sdmTMB() match broom.mixed package. Make tidy.sdmTMB() return tibble tibble package installed. Note affect old code since drop = FALSE default tibbles drop = TRUE default data frames (.e., tibbles always return data frame subsetted). Fix longstanding issue predicting newdata mgcv’s t2(). Previously disabled issues. now works expected. Add knots argument sdmTMB(), passed mgcv. 
common use specify end points cyclical spline (e.g., s(x, bs = 'cc', k = 4), knots = list(x = c(1, 3, 5, 7))) data don’t extend fully boundaries match .","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-011","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.1","title":"sdmTMB 0.1.1","text":"Preparing release CRAN. Add time-varying AR1 option (originally always random walk). See time_varying_type argument ?sdmTMB. Allow prediction newdata missing time elements. #130 Add check offset() (work sdmTMB, use offset argument instead). #131 Add check random slopes (sdmTMB currently random intercepts, although slopes can vary spatially). #131","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-010","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.1.0","title":"sdmTMB 0.1.0","text":"ADREPORT several parameters natural space. https://github.com/pbs-assess/sdmTMB/discussions/113 Improve robustness model print() esoteric mgcv smoothers. Let sims_var work multiple spatially varying slopes (zeta_s); return output named list coefficients. #107 Add threshold_coefs sdmTMB_simulate(). Don’t make fake mesh non-spatial model (faster).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00269001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.26.9001","title":"sdmTMB 0.0.26.9001","text":"Add vignettes visreg, ggeffects, delta families (thanks J. Indivero!) #83 #87 #89 Forecasting presence-vignettes merged soon. Add support emmeans package. See ?emmeans.sdmTMB examples. Add support effects package. ggeffects::ggeffect() function can used make fast marginal effects plots. ggeffects::ggpredict() works custom fork ggeffects. pull request made shortly. #101 Add vcov(), fixef(), df.residual(), formula(), terms(), model.frame() methods. Add support \"cloglog\" link. Code adapted glmmTMB robust likelihood implementation. delta models, default share anisotropy parameters VAST. Separate anisotropy (old behavior) can estimated control = sdmTMBcontrol(map = list(ln_H_input = factor(c(1, 2, 3, 4)))) Add experimental do_index, predict_args, index_args sdmTMB(). can used perform prediction index calculation time fitting. large datasets meshes can save time compared fitting, predicting, index calculation 3 separate steps since TMB AD object doesn’t rebuilt. somewhat slow initial fitting. Remove max_gradient bad_eig get_index() output. Use unique locations prediction huge speedups large newdata gridded data. Fix bug rare cases get_index() return gibberish small values. Add bayesian argument, TRUE adds Jacobian adjustments non-linear transformed parameters. TRUE model passed tmbstan, FALSE otherwise. #95 Add experimental -yet-exported sdmTMB:::plot_anisotropy2(). Add many anisotropy, delta model, index calculation unit tests.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00249001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.24.9001","title":"sdmTMB 0.0.24.9001","text":"Enable random walk random field TMB simulation sdmTMB_simulate(). Add check irregular time AR1 random walk processes. Fix bugs introduced delta model code (offsets extra_time threshold model prediction). 
Fix bug sanity() message small random field SDs.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00249000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.24.9000","title":"sdmTMB 0.0.24.9000","text":"Add support ‘delta’ (‘hurdle’) models. See examples documentation ?sdmTMB. resulted substantial restructuring internal model code. default model components (e.g., binomial & Gamma) share formula, spatial, spatiotemporal structure, can separated supplying argument values lists first element corresponds first model second element corresponds second model (limitations described ?sdmTMB documentation ‘Details’). Add support multiple spatially varying coefficients (used limited single variable). Add compatibility ‘visreg’ package visualizing conditional effects parameters. See ?visreg_delta examples. Add MCMC residual type residuals.sdmTMB(). ‘better’ residuals slower calculate. See documentation ‘Details’ ?residuals.sdmTMB. Make offset argument sdmTMB(). Using reserved word offset formula now deprecated. Add sanity() function perform basic sanity checks model fits. Make sdmTMB() model object compatible update() method. Remove several deprecated arguments. Overhaul examples ?sdmTMB. Use faster “low-rank sparse hessian bias-correction” TMB bias correction. Add parallel processing support. See parallel argument sdmTMBcontrol. default, grabs value sdmTMB.cores option. E.g. options(sdmTMB.cores = 4). currently enabled Mac/Linux. Using many cores can much slower 1 core. Use ‘cli’ package cli_abort()/cli_warn()/cli_inform() stop()/warning()/message(). Add many unit tests.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00239000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.23.9000","title":"sdmTMB 0.0.23.9000","text":"package version number used internal testing ‘delta’ branch several people.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00229001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.22.9001","title":"sdmTMB 0.0.22.9001","text":"Switch TMBad library ~3-fold speedup(!)","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00229000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.22.9000","title":"sdmTMB 0.0.22.9000","text":"Fix bug predictions poly(..., raw = FALSE) newdata. #77","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00219009","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.21.9009","title":"sdmTMB 0.0.21.9009","text":"Add experimental sdmTMB_stacking() ensemble model stacking weights. Add fake mesh random fields . #59 Make predict(..., newdata = NULL) also use last.par.best instead last.par match newdata = df. Fix bug MVN fixed-effect prior indexing sims n_sims arguments deprecated standardized nsim match simulate() S3 method. Bias correction get_index() get_cog() now selective just applied necessary derived parameters. INLA projection matrix ‘’ now shared across spatial spatiotemporal fields. Add add_utm_columns() ease adding UTM columns.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00209001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.20.9001","title":"sdmTMB 0.0.20.9001","text":"Add dharma_residuals(). 
Fix bug simulate.sdmTMB() residuals.sdmTMB() binomial family.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00209000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.20.9000","title":"sdmTMB 0.0.20.9000","text":"Smoothers now appear print() output. format roughly match brms. main-effect component (e.g., sdepth s(depth)) represents linear component random effect (e.g., sds(depth)) component output corresponds standard deviation penalized weights. Add censored_poisson(link = 'log') family; implemented @joenomiddlename fields sdmTMB() now deprecated replaced spatiotemporal. include_spatial sdmTMB() now deprecated replaced spatial. spatial_only sdmTMB() now deprecated replaced spatiotemporal. E.g. spatial_only = TRUE now spatiotemporal = '' leaving time = NULL. spde sdmTMB() now deprecated replaced mesh. sdmTMB_simulate() new likely replace sdmTMB_sim() eventually. sdmTMB_simulate() set take formula data frame easier use want different spatial observations (covariates) time slice. can also take fitted model modify parts simulate. Finally, function uses TMB simulation much faster flexible can simulate (e.g., anisotropy) previous version. spatial_trend now spatial_varying accepts one-sided formula single predictor coefficient varying space random field. Note may want include fixed effect variable improve interpretability. (scaled) time column used, represent local-time-trend model . Tweedie power (p) parameter now print() tidy() output. thetaf now tweedie_p sdmTMB_sim().","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199003","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9003","title":"sdmTMB 0.0.19.9003","text":"Fix bug affecting prediction se_fit = TRUE breakpoint models.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199002","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9002","title":"sdmTMB 0.0.19.9002","text":"Simulation parameter covariance matrix works random effects turned . #57","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00199000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.19.9000","title":"sdmTMB 0.0.19.9000","text":"Smoothers s() now penalized smoothers: determine degree wiggliness (mgcv) longer necessary choose appropriate k value priori. Models fit previous versions sdmTMB s(x, k = ...) match models specified way version >= 0.0.19 since basis functions now penalized. various mgcv::s() options supported t2() (ti() te()) supported.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00189001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.18.9001","title":"sdmTMB 0.0.18.9001","text":"Add ELPD (expected log predictive density) sdmTMB_cv() https://arxiv.org/abs/1507.04544 Fix bug evaluating ... sdmTMB_cv() called within function. #54","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00189000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.18.9000","title":"sdmTMB 0.0.18.9000","text":"Fix minor error PC Matern prior","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00179000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.17.9000","title":"sdmTMB 0.0.17.9000","text":"Add random walk option: fields = \"RW\". Deprecate ar1_fields argument. See new fields argument sdmTMB(). 
Many packages moved ‘Imports’ ‘Suggests’","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00169000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.16.9000","title":"sdmTMB 0.0.16.9000","text":"Lower default nlminb() eval.max iter.max 1000 2000. Added profile option sdmTMBcontrol(). can dramatically improve model fitting speed many fixed effects. Note result likely slightly different TRUE vs. FALSE. Added simulation MVN precision matrix predict.sdmTMB(). See sims argument. Added gather_sims() spread_sims() extract parameter simulations joint precision matrix format matches tidybayes package. Added get_index_sims() population index calculated MVN simulation draws. Added extract_mcmc() extract MCMC samples model passed tmbstan. Added ability predict model fitted tmbstan. See tmbstan_model argument predict.sdmTMB(). Allowed separate random field Matern range parameters spatial spatiotemporal fields. E.g. sdmTMB(shared_range = FALSE) Bounded AR1 rho parameter -0.999 0.999 improve convergence; -1 1. Please post issue creates problems model. Added map, start, lower, upper options control model fitting. See sdmTMBcontrol(). Added priors parameters. See ?sdmTMB::priors priors argument sdmTMB(). PC priors available random fields. See ?pc_matern details . Moved many less-common arguments sdmTMB() sdmTMBcontrol(). Fix bug sdmTMB_cv() fitting testing data splits reversed. .e., small chunk fit; big chunk tested.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00159000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.15.9000","title":"sdmTMB 0.0.15.9000","text":"Added experimental penalized complexity (PC) prior used INLA. See arguments matern_prior_O matern_prior_E. Added back normalize argument sdmTMB() default FALSE. Setting TRUE can dramatically speed model fits (~4 times test models).","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149003","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9003","title":"sdmTMB 0.0.14.9003","text":"Added vignette making pretty maps output","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9001","title":"sdmTMB 0.0.14.9001","text":"AR1 spatial-model Missing factor levels time Coordinate systems big","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00149000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.14.9000","title":"sdmTMB 0.0.14.9000","text":"Add re_form_iid predict.sdmTMB(). Add map_rf option sdmTMB(). lets map (fix starting values zero) random fields produce classic GLM/GLMM.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00139000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.13.9000","title":"sdmTMB 0.0.13.9000","text":"Add IID random intercepts interface. E.g. ... 
+ (1 | g) #34","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00129000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.12.9000","title":"sdmTMB 0.0.12.9000","text":"Add epsilon_predictor argument sdmTMB() allow model spatiotemporal variance time.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00119000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.11.9000","title":"sdmTMB 0.0.11.9000","text":"Add penalties argument allow regularization.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00109001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.10.9001","title":"sdmTMB 0.0.10.9001","text":"Fix Student-t degrees freedom randomized quantile residuals","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-00109000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.10.9000","title":"sdmTMB 0.0.10.9000","text":"Fixed parameter initialization inverse links #35 Switched Gamma ‘phi’ parameter representing shape instead CV match glm(), glmmTMB(), etc.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0099000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.9.9000","title":"sdmTMB 0.0.9.9000","text":"Switched density/abundance index calculation use link function opposed hardcoded log() get_generic() function can used grab things like standardized average values response across grid. used log_total raw TMB output now link_total users shouldn’t notice difference.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0089000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.8.9000","title":"sdmTMB 0.0.8.9000","text":"Overhauled simulation function. function now called sdmTMB_sim() uses INLA functions instead RandomFields functions simulating random fields. simulation function can now accommodate families links takes INLA mesh input.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0079001","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.7.9001","title":"sdmTMB 0.0.7.9001","text":"Allow specifying degrees freedom Student-t family #29","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0079000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.7.9000","title":"sdmTMB 0.0.7.9000","text":"Added tidy() method (broom broom.mixed) return data frame parameter estimates. function can extract fixed effects random effect parameters (variances, AR1 correlation, spatial range). Added argument extra_time sdmTMB(). introduces additional time slices can predict want interpolate forecast. Internally, uses Eric Ward’s ‘weights hack’. also useful data unevenly spaced time want gaps evenly spaced random walk AR1 process (add missing years extra_time). make_spde() now replaced make_mesh() make_spde() soft deprecated. make_mesh() carries x y column names predict function line tidyverse style taking data frame first. make_mesh() can accept cutoff argument (INLA), likely better default way specify mesh since scales across regions better line literature INLA. make_mesh() can use binary search algorithm find cutoff best matches desired number knots (thanks Kelli Johnson idea). Barrier meshes now possible. See add_barrier_mesh() example. pkgdown website now gets auto generated GitHub actions. start model description vignette. 
much work progress.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069009","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9009","title":"sdmTMB 0.0.6.9009","text":"Fixed bug dlnorm","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069005","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9005","title":"sdmTMB 0.0.6.9005","text":"Fixed bug predictions standard errors one(?) parameter (breakpoint parameter) passed initial instead MLE value.","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069004","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9004","title":"sdmTMB 0.0.6.9004","text":"Fixed bug predictions new data models break points Overhauled cross validation function. function now: uses Eric’s weights hack can also used forecasting initializes subsequent folds MLE first fold considerable speed increases works parallel future plan initialized; see examples Added threshold parameters print method Added forecasting example weights hack Fixed bug linear break point models","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0069002","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.6.9002","title":"sdmTMB 0.0.6.9002","text":"Fixed GAM predictions 0s new data. Add linear logistic threshold models. #17","code":""},{"path":"https://pbs-assess.github.io/sdmTMB/news/index.html","id":"sdmtmb-0059000","dir":"Changelog","previous_headings":"","what":"sdmTMB 0.0.5.9000","title":"sdmTMB 0.0.5.9000","text":"Added parsing mgcv formulas splines. #16 Added ability predict standard errors population level. helps making marginal-effect plots. #15 Added optimization options aid convergence. Also added run_extra_optimization() run already fit models. Default extra optimization. Added binomial likelihood cross validation. Git hash ee3f3ba. Started keeping track news NEWS.md.","code":""}]