diff --git a/README.md b/README.md
index f0e3385..4b696ee 100644
--- a/README.md
+++ b/README.md
@@ -43,7 +43,7 @@
 together with other packages required to run the analyses:
 
 ```r
 # install.packages("remotes")
-remotes::install_url("https://github.com/HIDDA/forecasting/releases/download/v1.1.1/HIDDA.forecasting_1.1.1.tar.gz", dependencies = TRUE)
+remotes::install_url("https://github.com/HIDDA/forecasting/releases/download/v1.1.2/HIDDA.forecasting_1.1.2.tar.gz", dependencies = TRUE)
 ```
 
 Alternatively, to install **HIDDA.forecasting** from the current sources,
diff --git a/docs/404.html b/docs/404.html
index 2ef87a7..a2a4bf2 100644
--- a/docs/404.html
+++ b/docs/404.html
@@ -1,79 +1,27 @@
 [Page not found (404) • HIDDA.forecasting: head, script/style includes and navbar boilerplate replaced by the assets of the new pkgdown version; the stripped markup is not recoverable from this extract.]
@@ -158,5 +104,3 @@
 [Contents sidebar and closing footer markup of 404.html updated.]
diff --git a/docs/articles/BNV.html b/docs/articles/BNV.html
index b75ca01..cf56c3f 100644
--- a/docs/articles/BNV.html
+++ b/docs/articles/BNV.html
@@ -13,9 +13,6 @@
 [Vignette page head/navbar boilerplate re-generated. The "Articles • HIDDA.forecasting" index page was rebuilt likewise; its article listing now reads:]
Forecasting Swiss ILI counts using `kcde::kcde`
+
+
Forecasting Swiss ILI counts using `tscount::tsglm`
+
+
Long-term forecasts of age-stratified norovirus incidence in Berlin
+
+
diff --git a/docs/authors.html b/docs/authors.html
index cbd26c7..6c52d09 100644
--- a/docs/authors.html
+++ b/docs/authors.html
@@ -1,78 +1,12 @@
-Citation and Authors • HIDDA.forecasting
+Authors and Citation • HIDDA.forecasting
 [remaining head/navbar boilerplate re-generated]
diff --git a/docs/index.html b/docs/index.html
index 45b06ad..d0b027b 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -19,11 +19,8 @@
 (see the corresponding vignettes): Univariate forecasting of Swiss ILI counts
 using forecast, glarma, surveillance and prophet,
- and an age-stratified analysis of norovirus counts in Berlin
- using the multivariate time-series model "hhh4" from surveillance.'>
+ and an age-stratified analysis of norovirus gastroenteritis in Berlin
+ and the multivariate time-series model implemented in surveillance::hhh4().'>
 [docs/news/index.html (Changelog • HIDDA.forecasting): head/navbar boilerplate re-generated; the content changes follow.]

 [Changelog content as re-rendered by the new pkgdown; duplicated -/+ renderings of unchanged entries are collapsed below. New in this commit is the HIDDA.forecasting 1.1.2 entry:]

+HIDDA.forecasting 1.1.2
+  • Vignettes have been rebuilt using up-to-date versions of all involved packages in R 4.3.2 (on a new machine). This only gave minor numerical differences in the long-term forecasts of vignette("CHILI_prophet").
+  • Accommodate restricted checks without suggested packages.

 HIDDA.forecasting 1.1.1 (2021-03-31)
   • Vignettes have been rebuilt using up-to-date versions of all involved packages in R 4.0.4, resulting in the following changes:
     • vignette("BNV"): age-specific amplitude-shift parameter transformations were wrong in summary(mg_Cpower); bug fixed in surveillance 1.18.0.
     • surveillance::pit() values for degenerate forecasts with zero probability for observed counts may differ due to a change in surveillance 1.17.1 (they still produce warnings).
     • vignette("CHILI_hhh4"): no rounding of n from 213 to 210 in printed calibrationTest() (a bug fixed in R 3.6.0).
     • vignette("CHILI_prophet"): minor numerical differences in model fit and predictions due to changes in prophet.

 HIDDA.forecasting 1.1.0 (2019-03-29)
   • Use standard PIT for continuous forecasts (arima, prophet, naive). Differences to the previously used non-randomized PIT histograms for count data are negligible.
   • Add scores for discretized log-normal forecasts, via new function scores_lnorm_discrete(). These scores are almost identical to the continuous scores, essentially due to the large counts.
   • Vignettes have been rebuilt using up-to-date versions of all involved packages (forecast 8.5, glarma 1.6.0, hhh4contacts 0.13.0, prophet 0.4, scoringRules 0.9.5, surveillance 1.17.0) in R 3.5.3.

 HIDDA.forecasting 1.0.0 (2018-09-04)
   • This is the version used for the book chapter.
   • The contained vignettes have been built using R 3.5.1 with all dependent packages’ versions as of 25 July 2018 from CRAN. The versions of the main packages were: […]
- - - + diff --git a/docs/pkgdown.css b/docs/pkgdown.css index 1273238..80ea5b8 100644 --- a/docs/pkgdown.css +++ b/docs/pkgdown.css @@ -56,8 +56,10 @@ img.icon { float: right; } -img { +/* Ensure in-page images don't run outside their container */ +.contents img { max-width: 100%; + height: auto; } /* Fix bug in bootstrap (only seen in firefox) */ @@ -78,11 +80,10 @@ dd { /* Section anchors ---------------------------------*/ a.anchor { - margin-left: -30px; - display:inline-block; - width: 30px; - height: 30px; - visibility: hidden; + display: none; + margin-left: 5px; + width: 20px; + height: 20px; background-image: url(./link.svg); background-repeat: no-repeat; @@ -90,17 +91,15 @@ a.anchor { background-position: center center; } -.hasAnchor:hover a.anchor { - visibility: visible; -} - -@media (max-width: 767px) { - .hasAnchor:hover a.anchor { - visibility: hidden; - } +h1:hover .anchor, +h2:hover .anchor, +h3:hover .anchor, +h4:hover .anchor, +h5:hover .anchor, +h6:hover .anchor { + display: inline-block; } - /* Fixes for fixed navbar --------------------------*/ .contents h1, .contents h2, .contents h3, .contents h4 { @@ -264,31 +263,26 @@ table { /* Syntax highlighting ---------------------------------------------------- */ -pre { - word-wrap: normal; - word-break: normal; - border: 1px solid #eee; -} - -pre, code { +pre, code, pre code { background-color: #f8f8f8; color: #333; } +pre, pre code { + white-space: pre-wrap; + word-break: break-all; + overflow-wrap: break-word; +} -pre code { - overflow: auto; - word-wrap: normal; - white-space: pre; +pre { + border: 1px solid #eee; } -pre .img { +pre .img, pre .r-plt { margin: 5px 0; } -pre .img img { +pre .img img, pre .r-plt img { background-color: #fff; - display: block; - height: auto; } code a, pre a { @@ -305,9 +299,8 @@ a.sourceLine:hover { .kw {color: #264D66;} /* keyword */ .co {color: #888888;} /* comment */ -.message { color: black; font-weight: bolder;} -.error { color: orange; font-weight: bolder;} -.warning { color: #6A0366; font-weight: bolder;} +.error {font-weight: bolder;} +.warning {font-weight: bolder;} /* Clipboard --------------------------*/ @@ -365,3 +358,27 @@ mark { content: ""; } } + +/* Section anchors --------------------------------- + Added in pandoc 2.11: https://github.com/jgm/pandoc-templates/commit/9904bf71 +*/ + +div.csl-bib-body { } +div.csl-entry { + clear: both; +} +.hanging-indent div.csl-entry { + margin-left:2em; + text-indent:-2em; +} +div.csl-left-margin { + min-width:2em; + float:left; +} +div.csl-right-inline { + margin-left:2em; + padding-left:1em; +} +div.csl-indent { + margin-left: 2em; +} diff --git a/docs/pkgdown.js b/docs/pkgdown.js index 7e7048f..6f0eee4 100644 --- a/docs/pkgdown.js +++ b/docs/pkgdown.js @@ -80,7 +80,7 @@ $(document).ready(function() { var copyButton = ""; - $(".examples, div.sourceCode").addClass("hasCopyButton"); + $("div.sourceCode").addClass("hasCopyButton"); // Insert copy buttons: $(copyButton).prependTo(".hasCopyButton"); @@ -91,7 +91,7 @@ // Initialize clipboard: var clipboardBtnCopies = new ClipboardJS('[data-clipboard-copy]', { text: function(trigger) { - return trigger.parentNode.textContent; + return trigger.parentNode.textContent.replace(/\n#>[^\n]*/g, ""); } }); diff --git a/docs/pkgdown.yml b/docs/pkgdown.yml index 6414190..f3eb7f4 100644 --- a/docs/pkgdown.yml +++ b/docs/pkgdown.yml @@ -1,6 +1,6 @@ -pandoc: 2.11.4 -pkgdown: 1.6.1.9001 -pkgdown_sha: 38f0d8a2db6e26027bc4da8ded7d035e4ddaf987 +pandoc: 3.1.8 +pkgdown: 2.0.7.9000 +pkgdown_sha: ~ articles: 
BNV: BNV.html CHILI: CHILI.html @@ -9,10 +9,10 @@ articles: CHILI_hhh4: CHILI_hhh4.html CHILI_naive: CHILI_naive.html CHILI_prophet: CHILI_prophet.html - extra/BNV_addon: BNV_addon.html - extra/CHILI_kcde: CHILI_kcde.html - extra/CHILI_tscount: CHILI_tscount.html -last_built: 2021-03-31T15:54Z + BNV_addon: extra/BNV_addon.html + CHILI_kcde: extra/CHILI_kcde.html + CHILI_tscount: extra/CHILI_tscount.html +last_built: 2023-11-29T13:56Z urls: reference: https://HIDDA.github.io/forecasting/reference article: https://HIDDA.github.io/forecasting/articles diff --git a/docs/reference/CHILI-1.png b/docs/reference/CHILI-1.png index efa94f4..37c45ce 100644 Binary files a/docs/reference/CHILI-1.png and b/docs/reference/CHILI-1.png differ diff --git a/docs/reference/CHILI.html b/docs/reference/CHILI.html index 1907483..6ce3714 100644 --- a/docs/reference/CHILI.html +++ b/docs/reference/CHILI.html @@ -1,83 +1,14 @@ - - - - - - - -Swiss Surveillance Data on Influenza Like Illness, 2000-2016 — CHILI • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Swiss Surveillance Data on Influenza Like Illness, 2000-2016 — CHILI • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - +
 [docs/reference/CHILI.html re-rendered with the new pkgdown markup; the duplicated old/new renderings collapse to the following page content:]

 data("CHILI")

 Format

 a univariate time series of class zoo, where the time index is of class Date
 and always refers to the Tuesday of the notification week

 Source

 The Swiss ILI data has been received on 19 January 2017 by courtesy of:

 Swiss Federal Office of Public Health
 Public Health Directorate
 Communicable Diseases Division
 3003 Bern
 SWITZERLAND

 Examples
summary(CHILI)
+#>      Index                CHILI      
+#>  Min.   :2000-01-04   Min.   :   31  
+#>  1st Qu.:2004-04-02   1st Qu.:  328  
+#>  Median :2008-07-01   Median :  973  
+#>  Mean   :2008-07-01   Mean   : 4140  
+#>  3rd Qu.:2012-09-28   3rd Qu.: 3351  
+#>  Max.   :2016-12-27   Max.   :48473  
+plot(CHILI)
diff --git a/docs/reference/Rplot001.png b/docs/reference/Rplot001.png
index 36f9251..87d67c9 100644
Binary files a/docs/reference/Rplot001.png and b/docs/reference/Rplot001.png differ
diff --git a/docs/reference/Rplot002.png b/docs/reference/Rplot002.png
index 72bde76..dc61cec 100644
Binary files a/docs/reference/Rplot002.png and b/docs/reference/Rplot002.png differ
diff --git a/docs/reference/Rplot003.png b/docs/reference/Rplot003.png
index 109918c..e30d680 100644
Binary files a/docs/reference/Rplot003.png and b/docs/reference/Rplot003.png differ
diff --git a/docs/reference/dhhh4sims-1.png b/docs/reference/dhhh4sims-1.png
index 6c8698e..6602f0b 100644
Binary files a/docs/reference/dhhh4sims-1.png and b/docs/reference/dhhh4sims-1.png differ
diff --git a/docs/reference/dhhh4sims-2.png b/docs/reference/dhhh4sims-2.png
index b6ecd5f..40ee107 100644
Binary files a/docs/reference/dhhh4sims-2.png and b/docs/reference/dhhh4sims-2.png differ
diff --git a/docs/reference/dhhh4sims-3.png b/docs/reference/dhhh4sims-3.png
index 10d904f..844956a 100644
Binary files a/docs/reference/dhhh4sims-3.png and b/docs/reference/dhhh4sims-3.png differ
diff --git a/docs/reference/dhhh4sims.html b/docs/reference/dhhh4sims.html
index e976c1a..0e8327a 100644
--- a/docs/reference/dhhh4sims.html
+++ b/docs/reference/dhhh4sims.html
 [hhh4-Based Forecast Distributions — dhhh4sims • HIDDA.forecasting: head/navbar boilerplate re-generated; the page content follows.]

The function dhhh4sims constructs a (non-vectorized) probability mass function from the result of -surveillance::simulate.hhh4() (and the corresponding +surveillance::simulate.hhh4() (and the corresponding model), as a function of the time point within the simulation period. The distribution at each time point is obtained as a mixture of negative binomial (or Poisson) distributions based on the samples from the previous time point.

-
dhhh4sims(sims, model)
+
+
dhhh4sims(sims, model)
+
+ +
+

Arguments

+
sims
+

a "hhh4sims" object from surveillance::simulate.hhh4().

+ -

Arguments

- - - - - - - - - - -
sims

a "hhh4sims" object from surveillance::simulate.hhh4().

model

the "hhh4" object underlying sims.

+
model
+

the "hhh4" object underlying sims.

-

Value

+
+
+

Value

+ -

a function(x, tp = 1, log = FALSE), which takes a +

a function(x, tp = 1, log = FALSE), which takes a vector of model$nUnit counts and calculates the (log-)probability of observing these counts (given the model) at the tp'th time point of the simulation -period (index or character string matching rownames(sims)).

-

See also

- -

logs_hhh4sims() where this function is used.

-

Author

- +period (index or character string matching rownames(sims)).

+
+
+

See also

+

logs_hhh4sims() where this function is used.

+
+
+

Author

Sebastian Meyer

+
-

Examples

-
 [old, flattened rendering of the Examples output (German locale messages, surveillance 1.19.1) removed; superseded by the re-rendered example below]
+
+

Examples

+
library("surveillance")
+#> Loading required package: sp
+#> Loading required package: xtable
+#> This is surveillance 1.22.1; see ‘package?surveillance’ or
+#> https://surveillance.R-Forge.R-project.org/ for an overview.
+CHILI.sts <- sts(observed = CHILI,
+                 epoch = as.integer(index(CHILI)), epochAsDate = TRUE)
+
+## fit a simple hhh4 model
+(f1 <- addSeason2formula(~ 1, period = 365.2425))
+#> ~1 + sin(2 * pi * t/365.2425) + cos(2 * pi * t/365.2425)
+fit <- hhh4(
+    stsObj = CHILI.sts,
+    control = list(ar = list(f = f1), end = list(f = f1), family = "NegBin1")
+)
+
+## simulate the last four weeks (only 200 runs, for speed)
+sims <- simulate(fit, nsim = 200, seed = 1, subset = 884:nrow(CHILI.sts),
+                 y.start = observed(CHILI.sts)[883,])
+if (requireNamespace("fanplot")) {
+    plot(sims, "fan", fan.args = list(ln = c(5,95)/100),
+         observed.args = list(pch = 19), means.args = list(type = "b"))
+}
+#> Loading required namespace: fanplot
+
+
+## derive the weekly forecast distributions
+dfun <- dhhh4sims(sims, fit)
+dfun(4000, tp = 1)
+#> [1] 0.0002194578
+dfun(4000, tp = 4)
+#> [1] 0.0001069622
+curve(sapply(x, dfun, tp = 4), 0, 30000, type = "h",
+      main = "4-weeks-ahead forecast",
+      xlab = "No. infected", ylab = "Probability")
+
+
+## compare the forecast distributions with the simulated counts
+par(mfrow = n2mfrow(nrow(sims)))
+for (tp in 1:nrow(sims)) {
+    MASS::truehist(sims[tp,,], xlab = "counts", ylab = "Probability")
+    curve(sapply(x, dfun, tp = tp), add = TRUE, lwd = 2)
+}
+
+
+# \dontshow{
+## verify distribution at the first time point (i.e., one-step-ahead NegBin)
+stopifnot(identical(
+    sapply(0:100, dfun, tp = 1),
+    dnbinom(0:100,
+            mu = meanHHH(fit$coefficients, terms(fit), subset = 884, total.only = TRUE),
+            size = sizeHHH(fit$coefficients, terms(fit), subset = 884))
+))
+## check that we have a probability distribution at the last time point
+.xgrid <- seq(0, 100000, by = 500)
+stopifnot(abs(1 -
+    integrate(approxfun(.xgrid, sapply(.xgrid, dfun, tp = 4)), 0, 100000)$value
+) < 0.001)
+# }
+
+
+
-
- - - + diff --git a/docs/reference/dnbmix-1.png b/docs/reference/dnbmix-1.png index fcbb9fa..6aa6d09 100644 Binary files a/docs/reference/dnbmix-1.png and b/docs/reference/dnbmix-1.png differ diff --git a/docs/reference/dnbmix.html b/docs/reference/dnbmix.html index 095c223..43f80be 100644 --- a/docs/reference/dnbmix.html +++ b/docs/reference/dnbmix.html @@ -1,85 +1,16 @@ - - - - - - - -Simulation-Based Forecast Distributions — dnbmix • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Simulation-Based Forecast Distributions — dnbmix • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+
@@ -139,103 +63,106 @@

Simulation-Based Forecast Distributions

as a mixture of negative binomial (or Poisson) distributions.

-
dnbmix(means, size = NULL)
- -

Arguments

- - - - - - - - - - -
means

a n.ahead x n.sim matrix of means.

size

the dispersion parameter of the dnbinom() distribution +

+
dnbmix(means, size = NULL)
+
+ +
+

Arguments

+
means
+

a n.ahead x n.sim matrix of means.

+ + +
size
+

the dispersion parameter of the dnbinom() distribution or NULL (Poisson forecasts). Can also be time-varying (of length -n.ahead).

+n.ahead).

-

Value

+
+
+

Value

+ -

a function(x, tp = 1, log = FALSE), which takes a vector of +

a function(x, tp = 1, log = FALSE), which takes a vector of counts x and calculates the (log-)probabilities of observing each of these numbers at the tp'th time point of the simulation period (indexing the rows of means).

-

See also

- -

logs_nbmix() where this function is used.

-

Author

- +
+
+

See also

+

logs_nbmix() where this function is used.

+
+
+

Author

Sebastian Meyer

+
-

Examples

-
 [old, flattened rendering of the GLARMA example output removed; superseded by the re-rendered example below]
+
+

Examples

+
## a GLARMA example
+library("glarma")
+y <- as.vector(CHILI)
+
+## fit a simple NegBin-GLARMA model
+X <- t(sapply(2*pi*seq_along(y)/52.1775,
+              function (x) c(sin = sin(x), cos = cos(x))))
+X <- cbind(intercept = 1, X)
+fit <- glarma(y = y[1:883], X = X[1:883,], type = "NegBin", phiLags = 1)
+
+## simulate the last four weeks (only 500 runs, for speed)
+set.seed(1)
+means <- replicate(500, {
+    forecast(fit, n.ahead = 4, newdata = X[884:887,], newoffset = rep(0,4))$mu
+})
+
+## derive the weekly forecast distributions
+dfun <- dnbmix(means, coef(fit, type = "NB"))
+dfun(4000, tp = 1)
+#> [1] 0.0001458309
+dfun(4000, tp = 4)
+#> [1] 7.550036e-05
+curve(dfun(x, tp = 4), 0, 30000, type = "h",
+      main = "4-weeks-ahead forecast",
+      xlab = "No. infected", ylab = "Probability")
+
+
+# \dontshow{
+## verify distribution at the first time point (i.e., one-step-ahead NegBin)
+stopifnot(all.equal(
+    dfun(0:100, tp = 1),
+    dnbinom(0:100,
+            mu = forecast(fit, n.ahead=1, newdata=X[884,,drop=FALSE])$mu,
+            size = coef(fit, type = "NB"))
+))
+## check that we have a probability distribution at the second time point
+.xgrid <- seq(0, 200000, by = 500)
+stopifnot(abs(1 -
+    integrate(approxfun(.xgrid, dfun(.xgrid, tp = 2)), 0, 200000)$value
+) < 0.01)
+# }
+
+
+
- - - - + diff --git a/docs/reference/index.html b/docs/reference/index.html index 84e8de1..ec6b5e3 100644 --- a/docs/reference/index.html +++ b/docs/reference/index.html @@ -1,78 +1,12 @@ - - - - - - - -Function reference • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Function reference • HIDDA.forecasting - - - - - - - - - - - +
-
- -
- -
+
- - - - - - - - - - -
-

Datasets

-

The R package HIDDA.forecasting provides the Swiss ILI data (CHILI). The age-stratified norovirus time series data analyzed in vignette("BNV") are available from the hhh4contacts package as hhh4contacts::counts.

+ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
+

Datasets

+

The R package HIDDA.forecasting provides the Swiss ILI data (CHILI). The age-stratified norovirus time series data analyzed in vignette("BNV") are available from the hhh4contacts package as hhh4contacts::counts.

+

CHILI

Swiss Surveillance Data on Influenza Like Illness, 2000-2016

-

Fanplot with scores

+
+

Fanplot with scores

+

osaplot()

Plot (One-Step-Ahead) Forecasts with Scores

-

Simulation-based forecast distributions

+
+

Simulation-based forecast distributions

+

dhhh4sims()

hhh4-Based Forecast Distributions

+

logs_hhh4sims()

Simulation-Based Logarithmic Score Using dhhh4sims

+

dnbmix()

Simulation-Based Forecast Distributions

+

logs_nbmix()

Simulation-Based Logarithmic Score Via dnbmix

-

Auxiliary functions for proper scoring rules

+
+

Auxiliary functions for proper scoring rules

+

scores_lnorm()

Proper Scoring Rules for Log-Normal Forecasts

+

scores_lnorm_discrete()

Proper Scoring Rules for Discretized Log-Normal Forecasts

+

scores_sample()

Proper Scoring Rules based on Simulations

-

Auxiliary functions for ARIMA models

+
+

Auxiliary functions for ARIMA models

+

update(<Arima>)

Refit an ARIMA Model on a Subset of the Time Series

- +
+
-
- - - + diff --git a/docs/reference/logs_hhh4sims.html b/docs/reference/logs_hhh4sims.html index c417ca4..0230b02 100644 --- a/docs/reference/logs_hhh4sims.html +++ b/docs/reference/logs_hhh4sims.html @@ -1,85 +1,18 @@ - - - - - - - -Simulation-Based Logarithmic Score Using dhhh4sims — logs_hhh4sims • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Simulation-Based Logarithmic Score Using dhhh4sims — logs_hhh4sims • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+

The function logs_hhh4sims computes the logarithmic score of the -forecast distributions based on a surveillance::hhh4() model and +forecast distributions based on a surveillance::hhh4() model and simulations (sims) thereof. The -forecast distributions are obtained via dhhh4sims() as sequential +forecast distributions are obtained via dhhh4sims() as sequential mixtures of negative binomial (or Poisson) distributions, which is different from the kernel density estimation approach employed in -scores_sample().

+scores_sample().

-
logs_hhh4sims(observed = NULL, sims, model)
+
+
logs_hhh4sims(observed = NULL, sims, model)
+
-

Arguments

- - - - - - - - - - - - - - -
observed

a vector or matrix of observed counts during the +

+

Arguments

+
observed
+

a vector or matrix of observed counts during the simulation period. By default (NULL), this is taken from -attr(sims, "stsObserved").

sims

a "hhh4sims" object from -surveillance::simulate.hhh4().

model

the surveillance::hhh4() fit underlying sims.

- -

Value

- -

a vector or matrix of log-scores for the observed counts.

-

See also

- -

scores_sample() for an alternative approach of calculating -the logarithmic score from simulation-based forecasts

-

Author

+attr(sims, "stsObserved").

+ +
sims
+

a "hhh4sims" object from +surveillance::simulate.hhh4().

+ + +
model
+

the surveillance::hhh4() fit underlying sims.

+ +
+
+

Value

+ + +

a vector or matrix of log-scores for the observed counts.

+
+
+

See also

+

scores_sample() for an alternative approach of calculating +the logarithmic score from simulation-based forecasts

+
+
+

Author

Sebastian Meyer

+
+
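The logs_hhh4sims() page documents the signature but shows no example. A minimal hedged sketch (not part of the original page), reusing the hypothetical `fit` and `sims` objects created in the dhhh4sims() example further up:

```r
## minimal sketch, assuming 'fit' and 'sims' from the dhhh4sims() example exist
library("HIDDA.forecasting")
library("surveillance")
logs <- logs_hhh4sims(sims = sims, model = fit)  # observed defaults to attr(sims, "stsObserved")
logs        # one log-score per week of the simulation period
mean(logs)  # average log-score of the four-week forecast
```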
- - - - + diff --git a/docs/reference/logs_nbmix.html b/docs/reference/logs_nbmix.html index cc8e9c4..b725d32 100644 --- a/docs/reference/logs_nbmix.html +++ b/docs/reference/logs_nbmix.html @@ -1,82 +1,15 @@ - - - - - - - -Simulation-Based Logarithmic Score Via dnbmix — logs_nbmix • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Simulation-Based Logarithmic Score Via dnbmix — logs_nbmix • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+

The function logs_nbmix computes the logarithmic score of forecasts based on mixtures of negative binomial (or Poisson) distributions via -dnbmix(). This is different from the kernel density estimation -approach available via scores_sample().

+dnbmix(). This is different from the kernel density estimation +approach available via scores_sample().

-
logs_nbmix(observed, means, size)
- -

Arguments

- - - - - - - - - - - - - - -
observed

a vector of observed counts during the simulation -period.

means

a n.ahead x n.sim matrix of means.

size

the dispersion parameter of the dnbinom() distribution -or NULL (Poisson forecasts). Can also be time-varying (of length -n.ahead).

+
+
logs_nbmix(observed, means, size)
+
-

Value

+
+

Arguments

+
observed
+

a vector of observed counts during the simulation +period.

-

a vector of log-scores for the observed counts.

-

See also

-

scores_sample() for an alternative approach of calculating -the logarithmic score from simulation-based forecasts

-

Author

+
means
+

a n.ahead x n.sim matrix of means.

+ +
size
+

the dispersion parameter of the dnbinom() distribution +or NULL (Poisson forecasts). Can also be time-varying (of length +n.ahead).

+ +
+
+

Value

+ + +

a vector of log-scores for the observed counts.

+
+
+

See also

+

scores_sample() for an alternative approach of calculating +the logarithmic score from simulation-based forecasts

+
+
+

Author

Sebastian Meyer

+
+
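No example is shown on the logs_nbmix() page. A hedged sketch (not from the original page), reusing the hypothetical `y`, `means` and `fit` objects from the dnbmix() GLARMA example above:

```r
## minimal sketch, assuming 'y', 'means' and 'fit' from the dnbmix() example exist
library("HIDDA.forecasting")
logs <- logs_nbmix(observed = y[884:887],        # the four held-out weeks
                   means    = means,             # n.ahead x n.sim matrix of simulated means
                   size     = coef(fit, type = "NB"))
logs        # log-score of each weekly forecast
mean(logs)
```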
-
- - - + diff --git a/docs/reference/osaplot.html b/docs/reference/osaplot.html index 83334d3..0142fc7 100644 --- a/docs/reference/osaplot.html +++ b/docs/reference/osaplot.html @@ -1,82 +1,15 @@ - - - - - - - -Plot (One-Step-Ahead) Forecasts with Scores — osaplot • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Plot (One-Step-Ahead) Forecasts with Scores — osaplot • HIDDA.forecasting - - - - - - - - - - - +
-
- -
- -
+

This function produces a fan chart of sequential (one-step-ahead) forecasts -with dots for the observed values, using surveillance::fanplot(), which -itself wraps fanplot::fan(). A matplot() of score +with dots for the observed values, using surveillance::fanplot(), which +itself wraps fanplot::fan(). A matplot() of score values at each time point is added below ("slicing").

-
osaplot(quantiles, probs, means, observed, scores, start = 1,
-  xlab = "Time", fan.args = list(), means.args = list(),
-  observed.args = list(), key.args = list(), ..., scores.args = list(),
-  legend.args = list(), heights = c(0.6, 0.4))
- -

Arguments

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
quantiles

a time x probs matrix of forecast quantiles at each time -point.

probs

numeric vector of probabilities with values between 0 and 1.

means

(optional) numeric vector of point forecasts at each time -point.

observed

(optional) numeric vector of observed values.

scores

(optional) numeric vector (or matrix) of associated scores.

start

time index (x-coordinate) of the first prediction.

xlab

x-axis label.

fan.args

a list of graphical parameters for the fanplot::fan(), -e.g., to employ a different colorRampPalette() as -fan.col, or to enable contour lines via ln.

means.args

a list of graphical parameters for lines() -to modify the plotting style of the point predictions.

observed.args

a list of graphical parameters for lines() -to modify the plotting style of the observed values.

key.args

if a list, a color key (in fanplot::fan()'s +

+
osaplot(quantiles, probs, means, observed, scores, start = 1,
+  xlab = "Time", fan.args = list(), means.args = list(),
+  observed.args = list(), key.args = list(), ..., scores.args = list(),
+  legend.args = list(), heights = c(0.6, 0.4))
+
+ +
+

Arguments

+
quantiles
+

a time x probs matrix of forecast quantiles at each time +point.

+ + +
probs
+

numeric vector of probabilities with values between 0 and 1.

+ + +
means
+

(optional) numeric vector of point forecasts at each time +point.

+ + +
observed
+

(optional) numeric vector of observed values.

+ + +
scores
+

(optional) numeric vector (or matrix) of associated scores.

+ + +
start
+

time index (x-coordinate) of the first prediction.

+ + +
xlab
+

x-axis label.

+ + +
fan.args
+

a list of graphical parameters for the fanplot::fan(), +e.g., to employ a different colorRampPalette() as +fan.col, or to enable contour lines via ln.

+ + +
means.args
+

a list of graphical parameters for lines() +to modify the plotting style of the point predictions.

+ + +
observed.args
+

a list of graphical parameters for lines() +to modify the plotting style of the observed values.

+ + +
key.args
+

if a list, a color key (in fanplot::fan()'s "boxfan"-style) is added to the fan chart. The list may include positioning parameters start (the x-position) and ylim (the y-range of the color key), space to modify the width of the color key, and rlab to modify the labels. An alternative way of labeling the -quantiles is via the argument ln in fan.args.

...

further arguments are passed to plot.default().

scores.args

a list of graphical parameters for matplot() to modify -the style of the scores subplot at the bottom.

legend.args

if a list (of parameters for legend()) and -ncol(scores) > 1, a legend is added to the scores subplot.

heights

numeric vector of length 2 specifying the relative height of -the two subplots.

- -

Author

+quantiles is via the argument ln in fan.args.

+ + +
...
+

further arguments are passed to plot.default().

+ +
scores.args
+

a list of graphical parameters for matplot() to modify +the style of the scores subplot at the bottom.

+ + +
legend.args
+

if a list (of parameters for legend()) and +ncol(scores) > 1, a legend is added to the scores subplot.

+ + +
heights
+

numeric vector of length 2 specifying the relative height of +the two subplots.

+ +
+
+

Author

Sebastian Meyer

+
+
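Since the argument list of osaplot() is long, a small self-contained sketch may help. This toy example is not from the original page; the log-normal forecast parameters below are made up purely for illustration:

```r
## hypothetical toy example: fan chart of made-up log-normal one-step-ahead
## forecasts for the last 10 weeks of CHILI, with scores shown below the fan
library("HIDDA.forecasting")
set.seed(1)
obs     <- as.vector(tail(CHILI, 10))
meanlog <- log(obs) + rnorm(10, sd = 0.1)     # pretend predictive meanlog
sdlog   <- rep(0.3, 10)                       # pretend predictive sdlog
probs   <- 1:99/100
quantiles <- sapply(probs, qlnorm, meanlog = meanlog, sdlog = sdlog)  # time x probs
scores  <- scores_lnorm(x = obs, meanlog = meanlog, sdlog = sdlog)
osaplot(quantiles = quantiles, probs = probs,
        means = exp(meanlog + sdlog^2/2),     # point forecasts (log-normal means)
        observed = obs, scores = scores, xlab = "Week")
```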
- - - - + diff --git a/docs/reference/reexports.html b/docs/reference/reexports.html index e7c8d90..dcf158d 100644 --- a/docs/reference/reexports.html +++ b/docs/reference/reexports.html @@ -1,84 +1,19 @@ - - - - - - - -Objects exported from other packages — reexports • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Objects exported from other packages — reexports • HIDDA.forecasting - - - - - + zoo +autoplot.zoo, fortify.zoo, index - - - - - - - - - - - - - - - - - - +
-
- -
- -
+

These objects are imported from other packages. Follow the links below to see their documentation.

-
-
zoo

autoplot.zoo, fortify.zoo, index

+
zoo
+

autoplot.zoo, fortify.zoo, index

-
-
+
+
- - - - + diff --git a/docs/reference/scores_lnorm.html b/docs/reference/scores_lnorm.html index a41b415..50de89a 100644 --- a/docs/reference/scores_lnorm.html +++ b/docs/reference/scores_lnorm.html @@ -1,82 +1,15 @@ - - - - - - - -Proper Scoring Rules for Log-Normal Forecasts — scores_lnorm • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Proper Scoring Rules for Log-Normal Forecasts — scores_lnorm • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+
-

This is a simple wrapper around functions from the scoringRules -package for predictions with a LN(meanlog, sdlog) +

This is a simple wrapper around functions from the scoringRules +package for predictions with a LN(meanlog, sdlog) distribution. The function is vectorized and preserves the dimension of the input.

-
scores_lnorm(x, meanlog, sdlog, which = c("dss", "logs"))
- -

Arguments

- - - - - - - - - - - - - - -
x

the observed counts.

meanlog, sdlog

parameters of the log-normal distribution, i.e., mean -and standard deviation of the distribution on the log scale.

which

a character vector specifying which scoring rules to apply. The +

+
scores_lnorm(x, meanlog, sdlog, which = c("dss", "logs"))
+
+ +
+

Arguments

+
x
+

the observed counts.

+ + +
meanlog, sdlog
+

parameters of the log-normal distribution, i.e., mean +and standard deviation of the distribution on the log scale.

+ + +
which
+

a character vector specifying which scoring rules to apply. The Dawid-Sebastiani score ("dss") and the logarithmic score ("logs") -are available and both computed by default.

+are available and both computed by default.

-

Value

+
+
+

Value

+ -

scores for the predictions of the observations in x (maintaining +

scores for the predictions of the observations in x (maintaining their dimensions).

+
+
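As a quick illustration (not part of the original page), scoring a single hypothetical log-normal forecast of a weekly count could look like this:

```r
## hypothetical sketch: DSS and log score of one log-normal forecast
library("HIDDA.forecasting")
scores_lnorm(x = 4000, meanlog = log(3500), sdlog = 0.5)
## returns the "dss" and "logs" values for this single prediction
```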
- - - - + diff --git a/docs/reference/scores_lnorm_discrete.html b/docs/reference/scores_lnorm_discrete.html index 16d9cdb..fc4f758 100644 --- a/docs/reference/scores_lnorm_discrete.html +++ b/docs/reference/scores_lnorm_discrete.html @@ -1,80 +1,13 @@ - - - - - - - -Proper Scoring Rules for Discretized Log-Normal Forecasts — scores_lnorm_discrete • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Proper Scoring Rules for Discretized Log-Normal Forecasts — scores_lnorm_discrete • HIDDA.forecasting - - - - - - - - - - - +
-
- -
- -
+
@@ -131,65 +57,56 @@

Proper Scoring Rules for Discretized Log-Normal Forecasts

The function is vectorized and preserves the dimension of the input.

-
scores_lnorm_discrete(x, meanlog, sdlog, which = c("dss", "logs"))
- -

Arguments

- - - - - - - - - - - - - - - - - - -
x

the observed counts.

meanlog

parameters of the log-normal distribution, i.e., mean -and standard deviation of the distribution on the log scale.

sdlog

parameters of the log-normal distribution, i.e., mean -and standard deviation of the distribution on the log scale.

which

a character vector specifying which scoring rules to apply. The +

+
scores_lnorm_discrete(x, meanlog, sdlog, which = c("dss", "logs"))
+
+ +
+

Arguments

+
x
+

the observed counts.

+ + +
meanlog, sdlog
+

parameters of the log-normal distribution, i.e., mean +and standard deviation of the distribution on the log scale.

+ + +
which
+

a character vector specifying which scoring rules to apply. The Dawid-Sebastiani score ("dss") and the logarithmic score ("logs") -are available and both computed by default.

+are available and both computed by default.

-

Value

+
+
+

Value

+ -

scores for the predictions of the observations in x (maintaining +

scores for the predictions of the observations in x (maintaining their dimensions).

+
+
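A hedged sketch (not from the original page) comparing the continuous and discretized scores; as noted in the changelog above, the two versions are almost identical for large counts:

```r
## hypothetical comparison of continuous vs. discretized log-normal scores
library("HIDDA.forecasting")
x <- c(1200, 4000, 9500)                       # made-up observed counts
scores_lnorm(x = x, meanlog = log(x) + 0.1, sdlog = 0.4)
scores_lnorm_discrete(x = x, meanlog = log(x) + 0.1, sdlog = 0.4)
```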
- - - - + diff --git a/docs/reference/scores_sample.html b/docs/reference/scores_sample.html index 19f24a3..ea471f7 100644 --- a/docs/reference/scores_sample.html +++ b/docs/reference/scores_sample.html @@ -1,83 +1,16 @@ - - - - - - - -Proper Scoring Rules based on Simulations — scores_sample • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Proper Scoring Rules based on Simulations — scores_sample • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+
-

This is a simple wrapper around functions from the scoringRules +

This is a simple wrapper around functions from the scoringRules package to calculate scoring rules from simulation-based forecasts. Calculation of the logarithmic score involves kernel density estimation, -see scoringRules::logs_sample(). +see scoringRules::logs_sample(). The function is vectorized and preserves the dimension of the input.

-
scores_sample(x, sims, which = c("dss", "logs"))
- -

Arguments

- - - - - - - - - - - - - - -
x

a vector of observed counts.

sims

a matrix of simulated counts with as many rows as length(x).

which

a character vector specifying which scoring rules to apply. The +

+
scores_sample(x, sims, which = c("dss", "logs"))
+
+ +
+

Arguments

+
x
+

a vector of observed counts.

+ + +
sims
+

a matrix of simulated counts with as many rows as length(x).

+ + +
which
+

a character vector specifying which scoring rules to apply. The Dawid-Sebastiani score ("dss") and the logarithmic score ("logs") -are available and both computed by default.

+are available and both computed by default.

-

Value

+
+
+

Value

+ -

scores for the predictions of the observations in x (maintaining +

scores for the predictions of the observations in x (maintaining their dimensions).

+
+
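A minimal sketch (assumed, not from the original page): scoring two weekly forecasts, each given as 1000 simulated counts from a hypothetical predictive distribution:

```r
## hypothetical sketch: simulation-based DSS and log score for two weeks
library("HIDDA.forecasting")
set.seed(1)
x    <- c(3000, 8000)                        # observed counts
sims <- rbind(rpois(1000, lambda = 3200),    # predictive samples, week 1
              rpois(1000, lambda = 7500))    # predictive samples, week 2
scores_sample(x = x, sims = sims)            # scores for each of the two weeks
```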
- - - - + diff --git a/docs/reference/update.Arima.html b/docs/reference/update.Arima.html index b38b50c..91f1cc7 100644 --- a/docs/reference/update.Arima.html +++ b/docs/reference/update.Arima.html @@ -1,81 +1,14 @@ - - - - - - - -Refit an ARIMA Model on a Subset of the Time Series — update.Arima • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Refit an ARIMA Model on a Subset of the Time Series — update.Arima • HIDDA.forecasting - - - - - - - - - - - - - - - - - - - - - - - - - +
-
- -
- -
+
-

There seems to be no function in package forecast (as of version +

There seems to be no function in package forecast (as of version 8.2) to re-estimate an ARIMA model on a subset of the original time series. This update method does exactly that.

-
# S3 method for Arima
-update(object, subset, ...)
- -

Arguments

- - - - - - - - - - - - - - -
object

an object of class "Arima", e.g., from -forecast::auto.arima().

subset

an integer vector selecting part of the original time series -(and external regressors).

...

further arguments to be passed to arima().

- -

Value

- -

the updated model.

-

Author

+
+
# S3 method for Arima
+update(object, subset, ...)
+
+ +
+

Arguments

+
object
+

an object of class "Arima", e.g., from +forecast::auto.arima().

+ + +
subset
+

an integer vector selecting part of the original time series +(and external regressors).

+ +
...
+

further arguments to be passed to arima().

+ +
+
+

Value

+ + +

the updated model.

+
+
+

Author

Sebastian Meyer

+
+
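A hedged usage sketch (not part of the original page), assuming a model selected with forecast::auto.arima() for the log-transformed CHILI series:

```r
## hypothetical sketch: refit the selected ARIMA model on the first 800 weeks only
library("HIDDA.forecasting")
library("forecast")
fit    <- auto.arima(log(as.vector(CHILI)))
fit800 <- update(fit, subset = 1:800)  # re-estimates the same model on a subset
```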
-
- - - + diff --git a/docs/sitemap.xml b/docs/sitemap.xml index bb80c20..2209116 100644 --- a/docs/sitemap.xml +++ b/docs/sitemap.xml @@ -1,69 +1,84 @@ - https://HIDDA.github.io/forecasting/index.html + https://HIDDA.github.io/forecasting/404.html - https://HIDDA.github.io/forecasting/reference/CHILI.html + https://HIDDA.github.io/forecasting/articles/BNV.html - https://HIDDA.github.io/forecasting/reference/dhhh4sims.html + https://HIDDA.github.io/forecasting/articles/CHILI.html - https://HIDDA.github.io/forecasting/reference/dnbmix.html + https://HIDDA.github.io/forecasting/articles/CHILI_arima.html - https://HIDDA.github.io/forecasting/reference/logs_hhh4sims.html + https://HIDDA.github.io/forecasting/articles/CHILI_glarma.html - https://HIDDA.github.io/forecasting/reference/logs_nbmix.html + https://HIDDA.github.io/forecasting/articles/CHILI_hhh4.html - https://HIDDA.github.io/forecasting/reference/osaplot.html + https://HIDDA.github.io/forecasting/articles/CHILI_naive.html - https://HIDDA.github.io/forecasting/reference/reexports.html + https://HIDDA.github.io/forecasting/articles/CHILI_prophet.html - https://HIDDA.github.io/forecasting/reference/scores_lnorm.html + https://HIDDA.github.io/forecasting/articles/extra/BNV_addon.html - https://HIDDA.github.io/forecasting/reference/scores_lnorm_discrete.html + https://HIDDA.github.io/forecasting/articles/extra/CHILI_kcde.html - https://HIDDA.github.io/forecasting/reference/scores_sample.html + https://HIDDA.github.io/forecasting/articles/extra/CHILI_tscount.html - https://HIDDA.github.io/forecasting/reference/update.Arima.html + https://HIDDA.github.io/forecasting/articles/index.html - https://HIDDA.github.io/forecasting/articles/BNV.html + https://HIDDA.github.io/forecasting/authors.html - https://HIDDA.github.io/forecasting/articles/CHILI.html + https://HIDDA.github.io/forecasting/index.html - https://HIDDA.github.io/forecasting/articles/CHILI_arima.html + https://HIDDA.github.io/forecasting/news/index.html - https://HIDDA.github.io/forecasting/articles/CHILI_glarma.html + https://HIDDA.github.io/forecasting/reference/CHILI.html - https://HIDDA.github.io/forecasting/articles/CHILI_hhh4.html + https://HIDDA.github.io/forecasting/reference/dhhh4sims.html - https://HIDDA.github.io/forecasting/articles/CHILI_naive.html + https://HIDDA.github.io/forecasting/reference/dnbmix.html - https://HIDDA.github.io/forecasting/articles/CHILI_prophet.html + https://HIDDA.github.io/forecasting/reference/index.html - https://HIDDA.github.io/forecasting/articles/extra/BNV_addon.html + https://HIDDA.github.io/forecasting/reference/logs_hhh4sims.html - https://HIDDA.github.io/forecasting/articles/extra/CHILI_kcde.html + https://HIDDA.github.io/forecasting/reference/logs_nbmix.html - https://HIDDA.github.io/forecasting/articles/extra/CHILI_tscount.html + https://HIDDA.github.io/forecasting/reference/osaplot.html + + + https://HIDDA.github.io/forecasting/reference/reexports.html + + + https://HIDDA.github.io/forecasting/reference/scores_lnorm.html + + + https://HIDDA.github.io/forecasting/reference/scores_lnorm_discrete.html + + + https://HIDDA.github.io/forecasting/reference/scores_sample.html + + + https://HIDDA.github.io/forecasting/reference/update.Arima.html