diff --git a/README.Rmd b/README.Rmd index 1944bca..5bf5059 100644 --- a/README.Rmd +++ b/README.Rmd @@ -50,16 +50,13 @@ install.packages("xrnet") 1. OS-specific prerequisites + *Windows*: Install [RTools](https://cran.r-project.org/bin/windows/Rtools/) (not an R package) - + *Mac*: If using R version >= 3.6.0, verify your GNU Fortran version is >= 6.1. If you have an older version, go [here](https://cran.r-project.org/bin/macosx/tools/) to install the required version + + *Mac*: Verify your GNU Fortran version is >= 6.1. If you have an older version, go [here](https://cran.r-project.org/bin/macosx/tools/) to install the required version. 2. Install the R package [devtools](https://github.com/hadley/devtools) 3. Install the **xrnet** package with the *install_github()* function (optionally install potentially unstable development branch) ```{r, eval = FALSE} # Master branch devtools::install_github("USCbiostats/xrnet") - -# Development branch -devtools::install_github("USCbiostats/xrnet", ref = "development") ``` # A First Example diff --git a/README.md b/README.md index 70c6e1e..0ae8417 100644 --- a/README.md +++ b/README.md @@ -66,10 +66,10 @@ install.packages("xrnet") - *Windows*: Install [RTools](https://cran.r-project.org/bin/windows/Rtools/) (not an R package) - - *Mac*: If using R version \>= 3.6.0, verify your GNU Fortran - version is \>= 6.1. If you have an older version, go + - *Mac*: Verify your GNU Fortran version is \>= 6.1. If you have an + older version, go [here](https://cran.r-project.org/bin/macosx/tools/) to install - the required version + the required version. 2. Install the R package [devtools](https://github.com/hadley/devtools) 3. Install the **xrnet** package with the *install_github()* function (optionally install potentially unstable development branch) @@ -77,9 +77,6 @@ install.packages("xrnet") ``` r # Master branch devtools::install_github("USCbiostats/xrnet") - -# Development branch -devtools::install_github("USCbiostats/xrnet", ref = "development") ``` # A First Example diff --git a/docs/404.html b/docs/404.html index 2b86361..b79c33a 100644 --- a/docs/404.html +++ b/docs/404.html @@ -1,156 +1,95 @@ - - - - + + + + - - + Page not found (404) • xrnet - - - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + - - - - - - -
-
-
+
+ + +
-
- - - - +
- -
- +
+
+ - - diff --git a/docs/CODE_OF_CONDUCT.html b/docs/CODE_OF_CONDUCT.html index c8511eb..03c395c 100644 --- a/docs/CODE_OF_CONDUCT.html +++ b/docs/CODE_OF_CONDUCT.html @@ -1,124 +1,47 @@ - - - - - - - -Contributor Code of Conduct • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Contributor Code of Conduct • xrnet + Skip to contents + +
-
- +
- - -
- - -
- +
+ - - - + diff --git a/docs/authors.html b/docs/authors.html index f9ee293..2b3d36b 100644 --- a/docs/authors.html +++ b/docs/authors.html @@ -1,163 +1,98 @@ - - - - - - - -Authors • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Authors and Citation • xrnet + Skip to contents + +
-
- - -
- +
+

Authors

+ +
  • +

    Garrett Weaver. Author, maintainer. +

    +
  • +
  • +

    Dixin Shen. Author. +

    +
  • +
  • +

    Juan Pablo Lewinger. Contributor, thesis advisor. +

    +
  • +
+ +
+

Citation

+

Source: DESCRIPTION

+ +

Weaver G, Shen D (2024). +xrnet: Hierarchical Regularized Regression. +R package version 0.1.7, https://github.com/USCbiostats/xrnet. +

+
@Manual{,
+  title = {xrnet: Hierarchical Regularized Regression},
+  author = {Garrett Weaver and Dixin Shen},
+  year = {2024},
+  note = {R package version 0.1.7},
+  url = {https://github.com/USCbiostats/xrnet},
+}
+
+
-
- +
+ - - - + diff --git a/docs/index.html b/docs/index.html index c9da1d3..a3dbdfc 100644 --- a/docs/index.html +++ b/docs/index.html @@ -4,7 +4,8 @@ - + + Hierarchical Regularized Regression • xrnet @@ -12,149 +13,132 @@ - - - + + + - - - + + - -
-
-
- -
- - +

R-CMD-check Codecov test coverage CRAN_Status_Badge DOI

-
-

-Introduction

-

The xrnet R package is an extension of regularized regression (i.e. ridge regression) that enables the incorporation of external data that may be informative for the effects of predictors on an outcome of interest. Let y be an n-dimensional observed outcome vector, X be a set of p potential predictors observed on the n observations, and Z be a set of q external features available for the p predictors. Our model builds off the standard two-level hierarchical regression model,

+
+

Introduction +

+

The xrnet R package is an extension of regularized regression (i.e. ridge regression) that enables the incorporation of external data that may be informative for the effects of predictors on an outcome of interest. Let y be an n-dimensional observed outcome vector, X be a set of p potential predictors observed on the n observations, and Z be a set of q external features available for the p predictors. Our model builds off the standard two-level hierarchical regression model,

but allows regularization of both the predictors and the external features, where beta is the vector of coefficients describing the association of each predictor with the outcome and alpha is the vector of coefficients describing the association of each external feature with the predictor coefficients, beta. As an example, assume that the outcome is continuous and that we want to apply a ridge penalty to the predictors and lasso penalty to the external features. We minimize the following objective function (ignoring intercept terms):

-

Note that our model allows for the predictor coefficients, beta, to shrink towards potentially informative values based on the matrix Z. In the event the external data is not informative, we can shrink alpha towards zero, returning back to a standard regularized regression. To efficiently fit the model, we rewrite this convex optimization with the variable substitution gamma = *beta − Z * alph**a*. The problem is then solved as a standard regularized regression in which we allow the penalty value and type (ridge / lasso) to be variable-specific:

+

Note that our model allows for the predictor coefficients, beta, to shrink towards potentially informative values based on the matrix Z. In the event the external data is not informative, we can shrink alpha towards zero, returning back to a standard regularized regression. To efficiently fit the model, we rewrite this convex optimization with the variable substitution gamma = beta − Z * alpha. The problem is then solved as a standard regularized regression in which we allow the penalty value and type (ridge / lasso) to be variable-specific:

This package extends the coordinate descent algorithm of Friedman et al. 2010 (used in the R package glmnet) to allow for this variable-specific generalization and to fit the model described above. Currently, we allow for continuous and binary outcomes, but plan to extend to other outcomes (e.g. survival) in the next release.
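The rendered formula images referenced above are not reproduced in this diff. As a hedged sketch only (the scaling constants are an assumption, not taken from the package documentation), the objective described for the ridge-on-predictors / lasso-on-external example, and its form after the substitution gamma = beta − Z * alpha, can be written as:

``` latex
% Hedged reconstruction of the objective described in the text; scaling
% constants are assumed, not quoted from the package documentation.
\min_{\beta,\,\alpha}\;
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  \;+\; \frac{\lambda_1}{2}\,\lVert \beta - Z\alpha \rVert_2^2
  \;+\; \lambda_2\,\lVert \alpha \rVert_1

% After substituting gamma = beta - Z alpha, the fit becomes a standard
% regularized regression with variable-specific penalty types:
\min_{\gamma,\,\alpha}\;
  \frac{1}{2n}\,\lVert y - X\gamma - XZ\alpha \rVert_2^2
  \;+\; \frac{\lambda_1}{2}\,\lVert \gamma \rVert_2^2
  \;+\; \lambda_2\,\lVert \alpha \rVert_1
```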

-
-

-Installation

-
-

-From CRAN

+
+

Installation +

+
+

From CRAN +

+install.packages("xrnet")
-
-

-From Github (most up-to-date)

-
    +
    +

    From Github (most up-to-date) +

    +
    1. OS-specific prerequisites
      • -Windows: Install RTools (not an R package)
      • +Windows: Install RTools (not an R package)
      • -Mac: If using R version >= 3.6.0, verify your GNU Fortran version is >= 6.1. If you have an older version, go here to install the required version
      • +Mac: Verify your GNU Fortran version is >= 6.1. If you have an older version, go here to install the required version.
    2. -
    3. Install the R package devtools +
    4. Install the R package devtools
    5. Install the xrnet package with the install_github() function (optionally install potentially unstable development branch)
    -# Master branch
    -devtools::install_github("USCbiostats/xrnet")
    -
    -# Development branch
    -devtools::install_github("USCbiostats/xrnet", ref = "development")
    +# Master branch +devtools::install_github("USCbiostats/xrnet")
-
-

-A First Example

+
+

A First Example +

As an example of how you might use xrnet, we have provided a small set of simulated external data variables (ext), predictors (x), and a continuous outcome variable (y). First, load the package and the example data:

-library(xrnet)
-data(GaussianExample)
-
-

-Fitting a Model

+library(xrnet) +data(GaussianExample)
+
+

Fitting a Model +

To fit a linear hierarchical regularized regression model, use the main xrnet function. At a minimum, you should specify the predictor matrix x, outcome variable y, and family (outcome distribution). The external option allows you to incorporate external data in the regularized regression model. If you do not include external data, a standard regularized regression model will be fit. By default, a lasso penalty is applied to both the predictors and the external data.

-xrnet_model <- xrnet(
-  x = x_linear, 
-  y = y_linear, 
-  external = ext_linear, 
-  family = "gaussian"
-)
+xrnet_model <- xrnet( + x = x_linear, + y = y_linear, + external = ext_linear, + family = "gaussian" +)
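As noted in the introduction, binary outcomes are also supported via family = "binomial". A hedged sketch follows; the outcome y_binary is simulated here only to keep the example self-contained and is not part of the package's example data:

``` r
# y_binary is simulated purely for illustration; GaussianExample ships only
# the continuous outcome y_linear
set.seed(1)
y_binary <- rbinom(nrow(x_linear), size = 1, prob = 0.5)

xrnet_logit <- xrnet(
  x = x_linear,
  y = y_binary,
  external = ext_linear,
  family = "binomial"
)
```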
-
-

-Modifying Regularization Terms

+
+

Modifying Regularization Terms +

To modify the regularization terms and penalty path associated with the predictors or external data, you can use the define_penalty function. This function allows you to configure the following regularization attributes:

  • Regularization type @@ -173,150 +157,150 @@

  • User-defined set of penalties

As an example, we may want to apply a ridge penalty to the x variables and a lasso penalty to the external data variables. In addition, we may want to have 30 penalty values computed for the regularization path associated with both x and external. We modify our model call to xrnet as follows.

-
    +
    1. penalty_main is used to specify the regularization for the x variables
    2. penalty_external is used to specify the regularization for the external variables
    -xrnet_model <- xrnet(
    -  x = x_linear, 
    -  y = y_linear, 
    -  external = ext_linear, 
    -  family = "gaussian", 
    -  penalty_main = define_penalty(0, num_penalty = 30),
    -  penalty_external = define_penalty(1, num_penalty = 30)
    -)
    +xrnet_model <- xrnet( + x = x_linear, + y = y_linear, + external = ext_linear, + family = "gaussian", + penalty_main = define_penalty(0, num_penalty = 30), + penalty_external = define_penalty(1, num_penalty = 30) +)

Helper functions are also available to define the available penalty types (define_lasso, define_ridge, and define_enet). The example below demonstrates fitting a standard ridge regression model with 100 penalty values using the define_ridge helper function. As mentioned previously, a standard regularized regression is fit if no external data is provided.

-xrnet_model <- xrnet(
-  x = x_linear, 
-  y = y_linear, 
-  family = "gaussian", 
-  penalty_main = define_ridge(100)
-)
+xrnet_model <- xrnet( + x = x_linear, + y = y_linear, + family = "gaussian", + penalty_main = define_ridge(100) +)
-
-

-Tuning Penalty Parameters by Cross-Validation

+
+

Tuning Penalty Parameters by Cross-Validation +

In general, we need a method to determine the penalty values that produce the optimal out-of-sample prediction. We provide a simple two-dimensional grid search that uses k-fold cross-validation to determine the optimal values for the penalties. The cross-validation function tune_xrnet is used as follows.

-cv_xrnet <- tune_xrnet(
-  x = x_linear, 
-  y = y_linear, 
-  external = ext_linear, 
-  family = "gaussian",
-  penalty_main = define_ridge(),
-  penalty_external = define_lasso()
-)
+cv_xrnet <- tune_xrnet( + x = x_linear, + y = y_linear, + external = ext_linear, + family = "gaussian", + penalty_main = define_ridge(), + penalty_external = define_lasso() +)
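The tune_xrnet documentation also notes that folds can be fit in parallel (parallel = TRUE) after registering a cluster with doParallel. A hedged sketch, assuming the doParallel package is installed:

``` r
# Register a small cluster, then let tune_xrnet() fit folds in parallel;
# per the documentation, a doParallel backend must be registered first
library(doParallel)
cl <- makeCluster(2)
registerDoParallel(cl)

cv_xrnet_par <- tune_xrnet(
  x = x_linear,
  y = y_linear,
  external = ext_linear,
  family = "gaussian",
  nfolds = 5,
  parallel = TRUE
)

stopCluster(cl)
```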

To visualize the results of the cross-validation, we provide a contour plot of the mean cross-validation error across the grid of penalties with the plot function.

-plot(cv_xrnet)
+plot(cv_xrnet)

Cross-validation error curves can also be generated with plot by fixing the value of either the penalty on x or the external penalty on external. By default, the fixed penalty is set to its optimal value on x or external.

-plot(cv_xrnet, p = "opt")
+plot(cv_xrnet, p = "opt")
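If a numeric penalty value from the fitted path is also accepted by plot() (an assumption based on the description above rather than a documented signature), an error curve at a fixed external penalty might be requested as:

``` r
# Hedged sketch: fix the external penalty at its optimal value and view the
# error curve across the penalty path for x (numeric p/pext assumed accepted)
plot(cv_xrnet, pext = cv_xrnet$opt_penalty_ext)
```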

The predict function can be used to predict responses and to obtain the coefficient estimates at the optimal penalty combination (the default) or any other penalty combination that is within the penalty path(s). coef is another helper function that can be used to return the coefficients for a combination of penalty values as well.

-predy <- predict(cv_xrnet, newdata = x_linear)
-estimates <- coef(cv_xrnet)
+predy <- predict(cv_xrnet, newdata = x_linear) +estimates <- coef(cv_xrnet)
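To request a specific (non-default) penalty combination that lies on the fitted path, the p and pext arguments documented for predict.tune_xrnet can be supplied directly; here the stored optima are reused purely for illustration:

``` r
# Coefficient estimates at an explicit penalty combination on the fitted path;
# cv_xrnet$opt_penalty and cv_xrnet$opt_penalty_ext are reused for illustration
est_at <- predict(
  cv_xrnet,
  p = cv_xrnet$opt_penalty,
  pext = cv_xrnet$opt_penalty_ext,
  type = "coefficients"
)
```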
-
-

-Using the bigmemory R package with xrnet

-

As an example of using bigmemory with xrnet, we have a provided a ASCII file, x_linear.txt, that contains the data for x. The bigmemory function read.big.matrix() can be used to create a big.matrix version of this file. The ASCII file is located under inst/extdata in this repository and is also included when you install the R package. To access the file in the R package, use system.file("extdata", "x_linear.txt", package = "xrnet") as shown in the example below.

+
+

Using the bigmemory R package with xrnet +

+

As an example of using bigmemory with xrnet, we have provided an ASCII file, x_linear.txt, that contains the data for x. The bigmemory function read.big.matrix() can be used to create a big.matrix version of this file. The ASCII file is located under inst/extdata in this repository and is also included when you install the R package. To access the file in the R package, use system.file("extdata", "x_linear.txt", package = "xrnet") as shown in the example below.

-x_big <- bigmemory::read.big.matrix(system.file("extdata", "x_linear.txt", package = "xrnet"), type = "double")
+x_big <- bigmemory::read.big.matrix(system.file("extdata", "x_linear.txt", package = "xrnet"), type = "double")

We can now fit a ridge regression model with the big.matrix version of the data and verify that we get the same estimates:

-xrnet_model_big <- xrnet(
-  x = x_big, 
-  y = y_linear, 
-  family = "gaussian", 
-  penalty_main = define_ridge(100)
-)
-
-all.equal(xrnet_model$beta0, xrnet_model_big$beta0)
-#> [1] TRUE
-all.equal(xrnet_model$betas, xrnet_model_big$betas)
-#> [1] TRUE
-all.equal(xrnet_model$alphas, xrnet_model_big$alphas)
-#> [1] TRUE
+xrnet_model_big <- xrnet( + x = x_big, + y = y_linear, + family = "gaussian", + penalty_main = define_ridge(100) +) + +all.equal(xrnet_model$beta0, xrnet_model_big$beta0) +#> [1] TRUE
+
+all.equal(xrnet_model$betas, xrnet_model_big$betas)
+#> [1] TRUE
+
+all.equal(xrnet_model$alphas, xrnet_model_big$alphas)
+#> [1] TRUE
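The tune_xrnet documentation also lists filebacked.big.matrix as a supported type for x. A hedged sketch of creating one (the backing file names are illustrative) follows; it can be passed to xrnet in the same way as x_big above:

``` r
# Hedged sketch: create a file-backed big.matrix instead of an in-memory one;
# backingfile/descriptorfile names are illustrative
x_fb <- bigmemory::read.big.matrix(
  system.file("extdata", "x_linear.txt", package = "xrnet"),
  type = "double",
  backingfile = "x_linear.bin",
  descriptorfile = "x_linear.desc"
)
```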
-
-

-Contributing

-

To report a bug, ask a question, or propose a feature, create a new issue here. This project is released with the following Contributor Code of Conduct. If you would like to contribute, please abide by its terms.

+
+

Contributing +

+

To report a bug, ask a question, or propose a feature, create a new issue here. This project is released with a Contributor Code of Conduct. If you would like to contribute, please abide by its terms.

-
-

-Funding

+
+

Funding +

Supported by National Cancer Institute Grant #1P01CA196596.

-
- - + diff --git a/docs/news/index.html b/docs/news/index.html index c236114..910f45d 100644 --- a/docs/news/index.html +++ b/docs/news/index.html @@ -1,171 +1,75 @@ - - - - - - - -Changelog • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Changelog • xrnet + + Skip to contents + +
-
- +
- -
- - -
- +
+ - - - + diff --git a/docs/pkgdown.js b/docs/pkgdown.js index 7e7048f..9bd6621 100644 --- a/docs/pkgdown.js +++ b/docs/pkgdown.js @@ -2,85 +2,45 @@ (function($) { $(function() { - $('.navbar-fixed-top').headroom(); + $('nav.navbar').headroom(); - $('body').css('padding-top', $('.navbar').height() + 10); - $(window).resize(function(){ - $('body').css('padding-top', $('.navbar').height() + 10); + Toc.init({ + $nav: $("#toc"), + $scope: $("main h2, main h3, main h4, main h5, main h6") }); - $('[data-toggle="tooltip"]').tooltip(); - - var cur_path = paths(location.pathname); - var links = $("#navbar ul li a"); - var max_length = -1; - var pos = -1; - for (var i = 0; i < links.length; i++) { - if (links[i].getAttribute("href") === "#") - continue; - // Ignore external links - if (links[i].host !== location.host) - continue; - - var nav_path = paths(links[i].pathname); - - var length = prefix_length(nav_path, cur_path); - if (length > max_length) { - max_length = length; - pos = i; - } - } - - // Add class to parent
  • , and enclosing
  • if in dropdown - if (pos >= 0) { - var menu_anchor = $(links[pos]); - menu_anchor.parent().addClass("active"); - menu_anchor.closest("li.dropdown").addClass("active"); - } - }); - - function paths(pathname) { - var pieces = pathname.split("/"); - pieces.shift(); // always starts with / - - var end = pieces[pieces.length - 1]; - if (end === "index.html" || end === "") - pieces.pop(); - return(pieces); - } - - // Returns -1 if not found - function prefix_length(needle, haystack) { - if (needle.length > haystack.length) - return(-1); - - // Special case for length-0 haystack, since for loop won't run - if (haystack.length === 0) { - return(needle.length === 0 ? 0 : -1); + if ($('#toc').length) { + $('body').scrollspy({ + target: '#toc', + offset: $("nav.navbar").outerHeight() + 1 + }); } - for (var i = 0; i < haystack.length; i++) { - if (needle[i] != haystack[i]) - return(i); - } + // Activate popovers + $('[data-bs-toggle="popover"]').popover({ + container: 'body', + html: true, + trigger: 'focus', + placement: "top", + sanitize: false, + }); - return(haystack.length); - } + $('[data-bs-toggle="tooltip"]').tooltip(); /* Clipboard --------------------------*/ function changeTooltipMessage(element, msg) { - var tooltipOriginalTitle=element.getAttribute('data-original-title'); - element.setAttribute('data-original-title', msg); + var tooltipOriginalTitle=element.getAttribute('data-bs-original-title'); + element.setAttribute('data-bs-original-title', msg); $(element).tooltip('show'); - element.setAttribute('data-original-title', tooltipOriginalTitle); + element.setAttribute('data-bs-original-title', tooltipOriginalTitle); } if(ClipboardJS.isSupported()) { $(document).ready(function() { - var copyButton = ""; + var copyButton = ""; - $(".examples, div.sourceCode").addClass("hasCopyButton"); + $("div.sourceCode").addClass("hasCopyButton"); // Insert copy buttons: $(copyButton).prependTo(".hasCopyButton"); @@ -89,20 +49,108 @@ $('.btn-copy-ex').tooltip({container: 'body'}); // Initialize clipboard: - var clipboardBtnCopies = new ClipboardJS('[data-clipboard-copy]', { + var clipboard = new ClipboardJS('[data-clipboard-copy]', { text: function(trigger) { - return trigger.parentNode.textContent; + return trigger.parentNode.textContent.replace(/\n#>[^\n]*/g, ""); } }); - clipboardBtnCopies.on('success', function(e) { + clipboard.on('success', function(e) { changeTooltipMessage(e.trigger, 'Copied!'); e.clearSelection(); }); - clipboardBtnCopies.on('error', function() { + clipboard.on('error', function(e) { changeTooltipMessage(e.trigger,'Press Ctrl+C or Command+C to copy'); }); + }); } + + /* Search marking --------------------------*/ + var url = new URL(window.location.href); + var toMark = url.searchParams.get("q"); + var mark = new Mark("main#main"); + if (toMark) { + mark.mark(toMark, { + accuracy: { + value: "complementary", + limiters: [",", ".", ":", "/"], + } + }); + } + + /* Search --------------------------*/ + /* Adapted from https://github.com/rstudio/bookdown/blob/2d692ba4b61f1e466c92e78fd712b0ab08c11d31/inst/resources/bs4_book/bs4_book.js#L25 */ + // Initialise search index on focus + var fuse; + $("#search-input").focus(async function(e) { + if (fuse) { + return; + } + + $(e.target).addClass("loading"); + var response = await fetch($("#search-input").data("search-index")); + var data = await response.json(); + + var options = { + keys: ["what", "text", "code"], + ignoreLocation: true, + threshold: 0.1, + includeMatches: true, + includeScore: true, + }; + fuse = new Fuse(data, 
options); + + $(e.target).removeClass("loading"); + }); + + // Use algolia autocomplete + var options = { + autoselect: true, + debug: true, + hint: false, + minLength: 2, + }; + var q; +async function searchFuse(query, callback) { + await fuse; + + var items; + if (!fuse) { + items = []; + } else { + q = query; + var results = fuse.search(query, { limit: 20 }); + items = results + .filter((x) => x.score <= 0.75) + .map((x) => x.item); + if (items.length === 0) { + items = [{dir:"Sorry 😿",previous_headings:"",title:"No results found.",what:"No results found.",path:window.location.href}]; + } + } + callback(items); +} + $("#search-input").autocomplete(options, [ + { + name: "content", + source: searchFuse, + templates: { + suggestion: (s) => { + if (s.title == s.what) { + return `${s.dir} >
    ${s.title}
    `; + } else if (s.previous_headings == "") { + return `${s.dir} >
    ${s.title}
    > ${s.what}`; + } else { + return `${s.dir} >
    ${s.title}
    > ${s.previous_headings} > ${s.what}`; + } + }, + }, + }, + ]).on('autocomplete:selected', function(event, s) { + window.location.href = s.path + "?q=" + q + "#" + s.id; + }); + }); })(window.jQuery || window.$) + + diff --git a/docs/pkgdown.yml b/docs/pkgdown.yml index 6015365..1fcca2e 100644 --- a/docs/pkgdown.yml +++ b/docs/pkgdown.yml @@ -1,6 +1,6 @@ -pandoc: 2.11.4 -pkgdown: 1.6.1 +pandoc: 3.1.11 +pkgdown: 2.0.9 pkgdown_sha: ~ articles: {} -last_built: 2021-06-01T05:40Z +last_built: 2024-06-26T05:16Z diff --git a/docs/reference/coef.tune_xrnet.html b/docs/reference/coef.tune_xrnet.html index 8c9b3d9..23d3383 100644 --- a/docs/reference/coef.tune_xrnet.html +++ b/docs/reference/coef.tune_xrnet.html @@ -1,220 +1,149 @@ - - - - - - - -Get coefficient estimates from "tune_xrnet" model object. — coef.tune_xrnet • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +path(s).">Get coefficient estimates from "tune_xrnet" model object. — coef.tune_xrnet • xrnet + Skip to contents + +
    -
    - + +
    +

    Examples

    +
    ## Cross validation of hierarchical linear regression model
    +data(GaussianExample)
    +
    +## 5-fold cross validation
    +cv_xrnet <- tune_xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian",
    +  control = xrnet_control(tolerance = 1e-6)
    +)
    +
    +## Get coefficient estimates at optimal penalty combination
    +coef_opt <- coef(cv_xrnet)
    +
    +
    +
    -
    - +
    + - - - + diff --git a/docs/reference/coef.xrnet.html b/docs/reference/coef.xrnet.html index 47a53f6..0f09637 100644 --- a/docs/reference/coef.xrnet.html +++ b/docs/reference/coef.xrnet.html @@ -1,221 +1,150 @@ - - - - - - - -Get coefficient estimates from "xrnet" model object. — coef.xrnet • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +path(s).">Get coefficient estimates from "xrnet" model object. — coef.xrnet • xrnet + Skip to contents + +
    -
    - + +
    +

    Examples

    +
    data(GaussianExample)
    +
    +fit_xrnet <- xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian"
    +)
    +
    +lambda1 <- fit_xrnet$penalty[10]
    +lambda2 <- fit_xrnet$penalty_ext[10]
    +
    +coef_xrnet <- coef(
    +  fit_xrnet,
    +  p = lambda1,
    +  pext = lambda2,
    +)
    +
    +
    +
    -
    - +
    + - - - + diff --git a/docs/reference/define_enet.html b/docs/reference/define_enet.html index 215cd64..c483e77 100644 --- a/docs/reference/define_enet.html +++ b/docs/reference/define_enet.html @@ -1,204 +1,121 @@ - - - - - - - -Define elastic net regularization object for predictor and external data — define_enet • xrnet - - - - - - - - - - - - - - - - - - - - +Define elastic net regularization object for predictor and external data — define_enet • xrnet + Skip to contents + - - - +
    +
    +
    +
    +

Helper function to define an elastic net penalty regularization +object. See define_penalty for more details.

    +
    - - - +
    +

    Usage

    +
    define_enet(
    +  en_param = 0.5,
    +  num_penalty = 20,
    +  penalty_ratio = NULL,
    +  user_penalty = NULL,
    +  custom_multiplier = NULL
    +)
    +
    +
    +

    Arguments

    +
    en_param
    +

    elastic net parameter, between 0 and 1

    +
    num_penalty
    +

    number of penalty values to fit in grid. Default is 20.

    - - - - - - - - - - -
    -
    - - +
    user_penalty
    +

    user-defined vector of penalty values to use in penalty +path.

    -
    -
    -
    - +
    custom_multiplier
    +

    variable-specific penalty multipliers to apply to +overall penalty. Default is 1 for all variables. 0 is no penalization.

    -
    -

    Helper function to define a elastic net penalty regularization -object. See define_penalty for more details.

    -
    +
    +
    +

    Value

    + -
    define_enet(
    -  en_param = 0.5,
    -  num_penalty = 20,
    -  penalty_ratio = NULL,
    -  user_penalty = NULL,
    -  custom_multiplier = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - -
    en_param

    elastic net parameter, between 0 and 1

    num_penalty

    number of penalty values to fit in grid. Default is 20.

    penalty_ratio

    ratio between minimum and maximum penalty for x. -Default is 1e-04 if \(n > p\) and 0.01 if \(n <= p\).

    user_penalty

    user-defined vector of penalty values to use in penalty -path.

    custom_multiplier

    variable-specific penalty multipliers to apply to -overall penalty. Default is 1 for all variables. 0 is no penalization.

    - -

    Value

    - -

    A list object with regularization settings that are used to define -the regularization for predictors or external data in xrnet and -tune_xrnet. The list elements will match those returned by -define_penalty, but with the penalty_type set to match the +

    A list object with regularization settings that are used to define +the regularization for predictors or external data in xrnet and +tune_xrnet. The list elements will match those returned by +define_penalty, but with the penalty_type set to match the value of en_param.

    +
    -
    - -
    +
    -
    - +
    + - - - + diff --git a/docs/reference/define_lasso.html b/docs/reference/define_lasso.html index 81d535a..394989c 100644 --- a/docs/reference/define_lasso.html +++ b/docs/reference/define_lasso.html @@ -1,201 +1,118 @@ - - - - - - - -Define lasso regularization object for predictor and external data — define_lasso • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Define lasso regularization object for predictor and external data — define_lasso • xrnet + Skip to contents + +
    -
    - - -
    +
    -
    - +
    + - - - + diff --git a/docs/reference/define_penalty.html b/docs/reference/define_penalty.html index 64b6a0e..627d6e3 100644 --- a/docs/reference/define_penalty.html +++ b/docs/reference/define_penalty.html @@ -1,237 +1,167 @@ - - - - - - - -Define regularization object for predictor and external data. — define_penalty • xrnet +Define regularization object for predictor and external data. — define_penalty • xrnet + Skip to contents + - - - - - - - +
    +
    +
    - - +
    +

    Defines regularization for predictors and external data +variables in xrnet fitting. Use helper functions define_lasso, +define_ridge, or define_enet to specify a common penalty on x or external.

    +
    - - - +
    +

    Usage

    +
    define_penalty(
    +  penalty_type = 1,
    +  quantile = 0.5,
    +  num_penalty = 20,
    +  penalty_ratio = NULL,
    +  user_penalty = NULL,
    +  custom_multiplier = NULL
    +)
    +
    - - - +
    +

    Arguments

    +
    penalty_type
    +

    type of regularization. Default is 1 (Lasso). +Can supply either a scalar value or vector with length equal to the number of +variables the matrix.

    • 0 = Ridge

    • +
    • (0,1) = Elastic-Net

    • +
    • 1 = Lasso / Quantile

    • +
    +
    quantile
    +

    specifies quantile for quantile penalty. Default of 0.5 +reduces to lasso (currently not implemented).

    - - - +
    num_penalty
    +

    number of penalty values to fit in grid. Default is 20.

    +
    penalty_ratio
    +

    ratio between minimum and maximum penalty for x. +Default is 1e-04 if \(n > p\) and 0.01 if \(n <= p\).

    - - - +
    user_penalty
    +

    user-defined vector of penalty values to use in penalty +path.

    - - - - - - - -
    -
    - +
    custom_multiplier
    +

    variable-specific penalty multipliers to apply to +overall penalty. Default is 1 for all variables. 0 is no penalization.

    - +
    +
    +

    Value

    + - +

    A list object with regularization settings that are used to define +the regularization for predictors or external data in xrnet and +tune_xrnet:

    +
    penalty_type
    +

    The penalty type, scalar with value in range [0, 1].

    -
    -
    - +
    quantile
    +

    Quantile for quantile penalty, 0.5 defaults to lasso +(not currently implemented).

    -
    -

    Defines regularization for predictors and external data -variables in xrnet fitting. Use helper functions define_lasso, -define_ridge, or define_enet to specify a common penalty on x or external.

    -
    +
    num_penalty
    +

    The number of penalty values in the penalty path.

    -
    define_penalty(
    -  penalty_type = 1,
    -  quantile = 0.5,
    -  num_penalty = 20,
    -  penalty_ratio = NULL,
    -  user_penalty = NULL,
    -  custom_multiplier = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    penalty_type

    type of regularization. Default is 1 (Lasso). -Can supply either a scalar value or vector with length equal to the number of -variables the matrix.

      -
    • 0 = Ridge

    • -
    • (0,1) = Elastic-Net

    • -
    • 1 = Lasso / Quantile

    • -
    quantile

    specifies quantile for quantile penalty. Default of 0.5 -reduces to lasso (currently not implemented).

    num_penalty

    number of penalty values to fit in grid. Default is 20.

    penalty_ratio

    ratio between minimum and maximum penalty for x. -Default is 1e-04 if \(n > p\) and 0.01 if \(n <= p\).

    user_penalty

    user-defined vector of penalty values to use in penalty -path.

    custom_multiplier

    variable-specific penalty multipliers to apply to -overall penalty. Default is 1 for all variables. 0 is no penalization.

    - -

    Value

    - -

    A list object with regularization settings that are used to define -the regularization for predictors or external data in xrnet and -tune_xrnet:

    -
    penalty_type

    The penalty type, scalar with value in range [0, 1].

    -
    quantile

    Quantile for quantile penalty, 0.5 defaults to lasso -(not currently implemented).

    -
    num_penalty

    The number of penalty values in the penalty path.

    -
    penalty_ratio

    The ratio of the minimum penalty value compared to the +

    penalty_ratio
    +

    The ratio of the minimum penalty value compared to the maximum penalty value.

    -
    user_penalty

    User-defined numeric vector of penalty values, NULL if -not provided by user.

    -
    custom_multiplier

    User-defined feature-specific penalty multipliers, -NULL if not provided by user.

    +
    user_penalty
    +

    User-defined numeric vector of penalty values, NULL if +not provided by user.

    -

    Examples

    -
    -# define ridge penalty with penalty grid split into 30 values -my_penalty <- define_penalty(penalty_type = 0, num_penalty = 30) +
    custom_multiplier
    +

    User-defined feature-specific penalty multipliers, +NULL if not provided by user.

    -# define elastic net (0.5) penalty with user-defined penalty -my_custom_penalty <- define_penalty( - penalty_type = 0.5, user_penalty = c(100, 50, 10, 1, 0.1) -) -
    -
    - -
    +
    + +
    +

    Examples

    +
    
    +# define ridge penalty with penalty grid split into 30 values
    +my_penalty <- define_penalty(penalty_type = 0, num_penalty = 30)
    +
    +# define elastic net (0.5) penalty with user-defined penalty
    +my_custom_penalty <- define_penalty(
    +  penalty_type = 0.5, user_penalty = c(100, 50, 10, 1, 0.1)
    +)
    +
    +
    +
    -
    - +
    + - - - + diff --git a/docs/reference/define_ridge.html b/docs/reference/define_ridge.html index 510745a..7c6f77f 100644 --- a/docs/reference/define_ridge.html +++ b/docs/reference/define_ridge.html @@ -1,199 +1,116 @@ - - - - - - - -Define ridge regularization object for predictor and external data — define_ridge • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Define ridge regularization object for predictor and external data — define_ridge • xrnet + Skip to contents + +
    -
    - +
    -
    - +
    + - - - + diff --git a/docs/reference/ext_linear.html b/docs/reference/ext_linear.html index 71daa17..8768b2f 100644 --- a/docs/reference/ext_linear.html +++ b/docs/reference/ext_linear.html @@ -1,165 +1,81 @@ - - - - - - - -Simulated external data — ext_linear • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Simulated external data — ext_linear • xrnet + Skip to contents + +
    -
    - +
    -
    - +
    + - - - + diff --git a/docs/reference/index.html b/docs/reference/index.html index 4edf392..4d41cc4 100644 --- a/docs/reference/index.html +++ b/docs/reference/index.html @@ -1,265 +1,151 @@ - - - - - - - -Function reference • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -Function reference • xrnet + + Skip to contents + +
    -
    - - -
    - - -
    - +
    + + +
    -
    -

    Site built with pkgdown 1.6.1.

    + -
    -
    +
    + - - - + diff --git a/docs/reference/plot.tune_xrnet-1.png b/docs/reference/plot.tune_xrnet-1.png index f4a070d..f451cd8 100644 Binary files a/docs/reference/plot.tune_xrnet-1.png and b/docs/reference/plot.tune_xrnet-1.png differ diff --git a/docs/reference/plot.tune_xrnet.html b/docs/reference/plot.tune_xrnet.html index f1c9e99..01ad4a7 100644 --- a/docs/reference/plot.tune_xrnet.html +++ b/docs/reference/plot.tune_xrnet.html @@ -1,136 +1,63 @@ - - - - - - - -Plot k-fold cross-validation error grid — plot.tune_xrnet • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +when external data is used.">Plot k-fold cross-validation error grid — plot.tune_xrnet • xrnet + Skip to contents + +
    -
    - - -
    +
    +

    Examples

    +
    
    +## load example data
    +data(GaussianExample)
    +
    +## 5-fold cross validation
    +cv_xrnet <- tune_xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian",
    +  control = xrnet_control(tolerance = 1e-6)
    +)
    +
    +## contour plot of cross-validated error
    +plot(cv_xrnet)
    +
    +
    +## error curve of external penalties at optimal penalty value
    +plot(cv_xrnet, p = "opt")
    +
    +
    +
    +
    -
    - +
    + - - - + diff --git a/docs/reference/predict.tune_xrnet.html b/docs/reference/predict.tune_xrnet.html index 2b2f31a..697e315 100644 --- a/docs/reference/predict.tune_xrnet.html +++ b/docs/reference/predict.tune_xrnet.html @@ -1,241 +1,157 @@ - - - - - - - -Predict function for "tune_xrnet" object — predict.tune_xrnet • xrnet +Predict function for "tune_xrnet" object — predict.tune_xrnet • xrnet + Skip to contents + - - - - - - - +
    +
    +
    - - +
    +

    Extract coefficients or predict response in new data using +fitted model from a tune_xrnet object. Note that we currently +only support returning results that are in the original path(s).

    +
    - - - +
    +

    Usage

    +
    # S3 method for tune_xrnet
    +predict(
    +  object,
    +  newdata = NULL,
    +  newdata_fixed = NULL,
    +  p = "opt",
    +  pext = "opt",
    +  type = c("response", "link", "coefficients"),
    +  ...
    +)
    +
    - - - +
    +

    Arguments

    +
    object
    +

    A tune_xrnet object

    +
    newdata
    +

    matrix with new values for penalized variables

    - - - +
    newdata_fixed
    +

    matrix with new values for unpenalized variables

    +
    p
    +

    vector of penalty values to apply to predictor variables. +Default is optimal value in tune_xrnet object.

    - - - +
    pext
    +

    vector of penalty values to apply to external data variables. +Default is optimal value in tune_xrnet object.

    - - - - - - - -
    -
    - - - +
    type
    +

    type of prediction to make using the xrnet model, options +include:

    • response

    • +
    • link (linear predictor)

    • +
    • coefficients

    • +
    -
    -
    -
    - +
    ...
    +

    pass other arguments to xrnet function (if needed)

    -
    -

    Extract coefficients or predict response in new data using -fitted model from a tune_xrnet object. Note that we currently -only support returning results that are in the original path(s).

    -
    +
    +
    +

    Value

    + -
    # S3 method for tune_xrnet
    -predict(
    -  object,
    -  newdata = NULL,
    -  newdata_fixed = NULL,
    -  p = "opt",
    -  pext = "opt",
    -  type = c("response", "link", "coefficients"),
    -  ...
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    object

    A tune_xrnet object

    newdata

    matrix with new values for penalized variables

    newdata_fixed

    matrix with new values for unpenalized variables

    p

    vector of penalty values to apply to predictor variables. -Default is optimal value in tune_xrnet object.

    pext

    vector of penalty values to apply to external data variables. -Default is optimal value in tune_xrnet object.

    type

    type of prediction to make using the xrnet model, options -include:

      -
    • response

    • -
    • link (linear predictor)

    • -
    • coefficients

    • -
    ...

    pass other arguments to xrnet function (if needed)

    - -

    Value

    - -

    The object returned is based on the value of type as follows:

      -
    • response: An array with the response predictions based on the data +

      The object returned is based on the value of type as follows:

      • response: An array with the response predictions based on the data for each penalty combination

      • link: An array with linear predictions based on the data for each penalty combination

      • coefficients: A list with the coefficient estimates for each - penalty combination. See coef.xrnet.

      • -
      - - -

      Examples

      -
      data(GaussianExample) - -## 5-fold cross validation -cv_xrnet <- tune_xrnet( - x = x_linear, - y = y_linear, - external = ext_linear, - family = "gaussian", - control = xrnet_control(tolerance = 1e-6) -) - -## Get coefficients and predictions at optimal penalty combination -coef_xrnet <- predict(cv_xrnet, type = "coefficients") -pred_xrnet <- predict(cv_xrnet, newdata = x_linear, type = "response") -
      -
    - -
    + penalty combination. See coef.xrnet.

  • +
    + +
    +

    Examples

    +
    data(GaussianExample)
    +
    +## 5-fold cross validation
    +cv_xrnet <- tune_xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian",
    +  control = xrnet_control(tolerance = 1e-6)
    +)
    +
    +## Get coefficients and predictions at optimal penalty combination
    +coef_xrnet <- predict(cv_xrnet, type = "coefficients")
    +pred_xrnet <- predict(cv_xrnet, newdata = x_linear, type = "response")
    +
    +
    +
    - + - - - + diff --git a/docs/reference/predict.xrnet.html b/docs/reference/predict.xrnet.html index e97415a..74cdf07 100644 --- a/docs/reference/predict.xrnet.html +++ b/docs/reference/predict.xrnet.html @@ -1,251 +1,167 @@ - - - - - - - -Predict function for "xrnet" object — predict.xrnet • xrnet +Predict function for "xrnet" object — predict.xrnet • xrnet + Skip to contents + - - - - - - - +
    +
    +
    - - +
    +

    Extract coefficients or predict response in new data using +fitted model from an xrnet object. Note that we currently only +support returning coefficient estimates that are in the original path(s).

    +
    - - - +
    +

    Usage

    +
    # S3 method for xrnet
    +predict(
    +  object,
    +  newdata = NULL,
    +  newdata_fixed = NULL,
    +  p = NULL,
    +  pext = NULL,
    +  type = c("response", "link", "coefficients"),
    +  ...
    +)
    +
    - - - +
    +

    Arguments

    +
    object
    +

    A xrnet object

    +
    newdata
    +

    matrix with new values for penalized variables

    - - - +
    newdata_fixed
    +

    matrix with new values for unpenalized variables

    +
    p
    +

    vector of penalty values to apply to predictor variables

    - - - +
    pext
    +

    vector of penalty values to apply to external data variables

    - - - - - - - -
    -
    - - - +
    type
    +

    type of prediction to make using the xrnet model, options +include:

    • response

    • +
    • link (linear predictor)

    • +
    • coefficients

    • +
    -
    -
    -
    - +
    ...
    +

    pass other arguments to xrnet function (if needed)

    -
    -

    Extract coefficients or predict response in new data using -fitted model from an xrnet object. Note that we currently only -support returning coefficient estimates that are in the original path(s).

    -
    +
    +
    +

    Value

    + -
    # S3 method for xrnet
    -predict(
    -  object,
    -  newdata = NULL,
    -  newdata_fixed = NULL,
    -  p = NULL,
    -  pext = NULL,
    -  type = c("response", "link", "coefficients"),
    -  ...
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    object

    A xrnet object

    newdata

    matrix with new values for penalized variables

    newdata_fixed

    matrix with new values for unpenalized variables

    p

    vector of penalty values to apply to predictor variables

    pext

    vector of penalty values to apply to external data variables

    type

    type of prediction to make using the xrnet model, options -include:

      -
    • response

    • -
    • link (linear predictor)

    • -
    • coefficients

    • -
    ...

    pass other arguments to xrnet function (if needed)

    - -

    Value

    - -

    The object returned is based on the value of type as follows:

      -
    • response: An array with the response predictions based on the data +

      The object returned is based on the value of type as follows:

      • response: An array with the response predictions based on the data for each penalty combination

      • link: An array with linear predictions based on the data for each penalty combination

      • coefficients: A list with the coefficient estimates for each - penalty combination. See coef.xrnet.

      • -
      - - -

      Examples

      -
      data(GaussianExample) - -fit_xrnet <- xrnet( - x = x_linear, - y = y_linear, - external = ext_linear, - family = "gaussian" -) - -lambda1 <- fit_xrnet$penalty[10] -lambda2 <- fit_xrnet$penalty_ext[10] - -coef_xrnet <- predict( - fit_xrnet, - p = lambda1, - pext = lambda2, - type = "coefficients" -) - -pred_xrnet <- predict( - fit_xrnet, - p = lambda1, - pext = lambda2, - newdata = x_linear, - type = "response" -) -
      -
    - -
    + penalty combination. See coef.xrnet.

    +
    + +
    +

    Examples

    +
    data(GaussianExample)
    +
    +fit_xrnet <- xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian"
    +)
    +
    +lambda1 <- fit_xrnet$penalty[10]
    +lambda2 <- fit_xrnet$penalty_ext[10]
    +
    +coef_xrnet <- predict(
    +  fit_xrnet,
    +  p = lambda1,
    +  pext = lambda2,
    +  type = "coefficients"
    +)
    +
    +pred_xrnet <- predict(
    +  fit_xrnet,
    +  p = lambda1,
    +  pext = lambda2,
    +  newdata = x_linear,
    +  type = "response"
    +)
    +
    +
    + - + - - - + diff --git a/docs/reference/tune_xrnet-1.png b/docs/reference/tune_xrnet-1.png index 04f8b47..d423379 100644 Binary files a/docs/reference/tune_xrnet-1.png and b/docs/reference/tune_xrnet-1.png differ diff --git a/docs/reference/tune_xrnet.html b/docs/reference/tune_xrnet.html index 20a9e64..8717c8e 100644 --- a/docs/reference/tune_xrnet.html +++ b/docs/reference/tune_xrnet.html @@ -1,268 +1,201 @@ - - - - - - +k-fold cross-validation for hierarchical regularized regression — tune_xrnet • xrnet + Skip to contents + -k-fold cross-validation for hierarchical regularized regression — tune_xrnet • xrnet +
    +
    +
    - - - +
    +

    k-fold cross-validation for hierarchical regularized +regression xrnet

    +
    - - +
    +

    Usage

    +
    tune_xrnet(
    +  x,
    +  y,
    +  external = NULL,
    +  unpen = NULL,
    +  family = c("gaussian", "binomial"),
    +  penalty_main = define_penalty(),
    +  penalty_external = define_penalty(),
    +  weights = NULL,
    +  standardize = c(TRUE, TRUE),
    +  intercept = c(TRUE, FALSE),
    +  loss = c("deviance", "mse", "mae", "auc"),
    +  nfolds = 5,
    +  foldid = NULL,
    +  parallel = FALSE,
    +  control = list()
    +)
    +
    - - - +
    +

    Arguments

    +
    x
    +

    predictor design matrix of dimension \(n x p\), matrix options +include:

    • matrix

    • +
    • big.matrix

    • +
    • filebacked.big.matrix

    • +
    • sparse matrix (dgCMatrix)

    • +
    - - - +
    y
    +

    outcome vector of length \(n\)

    +
    external
    +

    (optional) external data design matrix of dimension +\(p x q\), matrix options include:

    • matrix

    • +
    • sparse matrix (dgCMatrix)

    • +
    - - - +
    unpen
    +

    (optional) unpenalized predictor design matrix, matrix options +include:

    • matrix

    • +
    +
    family
    +

    error distribution for outcome variable, options include:

    • "gaussian"

    • +
    • "binomial"

    • +
    - - - - - - - - - - -
    -
    - - +
    penalty_external
    +

specifies regularization object for external. See +define_penalty for more details.

    -
    -
    -
    - +
    weights
    +

    optional vector of observation-specific weights. +Default is 1 for all observations.

    -
    -

    k-fold cross-validation for hierarchical regularized -regression xrnet

    -
    -
    tune_xrnet(
    -  x,
    -  y,
    -  external = NULL,
    -  unpen = NULL,
    -  family = c("gaussian", "binomial"),
    -  penalty_main = define_penalty(),
    -  penalty_external = define_penalty(),
    -  weights = NULL,
    -  standardize = c(TRUE, TRUE),
    -  intercept = c(TRUE, FALSE),
    -  loss = c("deviance", "mse", "mae", "auc"),
    -  nfolds = 5,
    -  foldid = NULL,
    -  parallel = FALSE,
    -  control = list()
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    x

    predictor design matrix of dimension \(n x p\), matrix options -include:

      -
    • matrix

    • -
    • big.matrix

    • -
    • filebacked.big.matrix

    • -
    • sparse matrix (dgCMatrix)

    • -
    y

    outcome vector of length \(n\)

    external

    (optional) external data design matrix of dimension -\(p x q\), matrix options include:

      -
    • matrix

    • -
    • sparse matrix (dgCMatrix)

    • -
    unpen

    (optional) unpenalized predictor design matrix, matrix options -include:

      -
    • matrix

    • -
    family

    error distribution for outcome variable, options include:

      -
    • "gaussian"

    • -
    • "binomial"

    • -
    penalty_main

    specifies regularization object for x. See -define_penalty for more details.

    penalty_external

    specifies regularization object for external. See -define_penalty for more details. -See define_penalty for more details.

    weights

    optional vector of observation-specific weights. -Default is 1 for all observations.

    standardize

    indicates whether x and/or external should be -standardized. Default is c(TRUE, TRUE).

    intercept

    indicates whether an intercept term is included for x and/or -external. Default is c(TRUE, FALSE).

    loss

    loss function for cross-validation. Options include:

      -
    • "deviance"

    • +
      standardize
      +

      indicates whether x and/or external should be +standardized. Default is c(TRUE, TRUE).

      + + +
      intercept
      +

      indicates whether an intercept term is included for x and/or +external. Default is c(TRUE, FALSE).

      + + +
      loss
      +

      loss function for cross-validation. Options include:

      • "deviance"

      • "mse" (Mean Squared Error)

      • "mae" (Mean Absolute Error)

      • "auc" (Area under the curve)

      • -
    nfolds

    number of folds for cross-validation. Default is 5.

    foldid

    (optional) vector that identifies user-specified fold for each -observation. If NULL, folds are automatically generated.

    parallel

    use foreach function to fit folds in parallel if TRUE, -must register cluster (doParallel) before using.

    control

    specifies xrnet control object. See -xrnet_control for more details.

    - -

    Value

    - -

    A list of class tune_xrnet with components

    -
    cv_mean

    mean cross-validated error for each penalty combination. +

    + + +
    nfolds
    +

    number of folds for cross-validation. Default is 5.

    + + +
    foldid
    +

    (optional) vector that identifies user-specified fold for each +observation. If NULL, folds are automatically generated.

    + + +
    parallel
    +

    use foreach function to fit folds in parallel if TRUE, +must register cluster (doParallel) before using.

    + + +
    control
    +

    specifies xrnet control object. See +xrnet_control for more details.

    + +
    +
    +

    Value

    + + +

    A list of class tune_xrnet with components

    +
    cv_mean
    +

    mean cross-validated error for each penalty combination. Object returned is a vector if there is no external data (external = NULL) and matrix if there is external data.

    -
    cv_sd

    estimated standard deviation for cross-validated errors. + +

    cv_sd
    +

    estimated standard deviation for cross-validated errors. Object returned is a vector if there is no external data (external = NULL) and matrix if there is external data.

    -
    loss

    loss function used to compute cross-validation error

    -
    opt_loss

    the value of the loss function for the optimal + +

    loss
    +

    loss function used to compute cross-validation error

    + +
    opt_loss
    +

    the value of the loss function for the optimal cross-validated error

    -
    opt_penalty

    first-level penalty value that achieves the optimal loss

    -
    opt_penalty_ext

    second-level penalty value that achieves the optimal + +

    opt_penalty
    +

    first-level penalty value that achieves the optimal loss

    + +
    opt_penalty_ext
    +

    second-level penalty value that achieves the optimal loss (if external data is present)

    -
    fitted_model

    fitted xrnet object using all data, see -xrnet for details of object

    -

    Details

    +
    fitted_model
    +

    fitted xrnet object using all data, see +xrnet for details of object

    +
    +
    +

    Details

    k-fold cross-validation is used to determine the 'optimal' combination of hyperparameter values, where optimal is based on the optimal value obtained for the user-selected loss function across the k folds. To @@ -275,48 +208,44 @@

    Details makeCluster and then register the cluster registerDoParallel. See the parallel, foreach, and/or doParallel R packages for more details on how to setup parallelization.

    +

    -

    Examples

    -
    ## cross validation of hierarchical linear regression model -data(GaussianExample) - -## 5-fold cross validation -cv_xrnet <- tune_xrnet( - x = x_linear, - y = y_linear, - external = ext_linear, - family = "gaussian", - control = xrnet_control(tolerance = 1e-6) -) - -## contour plot of cross-validated error -plot(cv_xrnet) -
    -
    - -
    +
    +

    Examples

    +
    ## cross validation of hierarchical linear regression model
    +data(GaussianExample)
    +
    +## 5-fold cross validation
    +cv_xrnet <- tune_xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian",
    +  control = xrnet_control(tolerance = 1e-6)
    +)
    +
    +## contour plot of cross-validated error
    +plot(cv_xrnet)
    +
    +
    +
    + - + - - - + diff --git a/docs/reference/x_linear.html b/docs/reference/x_linear.html index 9830f05..2ad9fc6 100644 --- a/docs/reference/x_linear.html +++ b/docs/reference/x_linear.html @@ -1,165 +1,81 @@ - - - - - - - -Simulated example data for hierarchical regularized linear regression — x_linear • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Simulated example data for hierarchical regularized linear regression — x_linear • xrnet + Skip to contents + +
    -
    - +
    -
    - +
    + - - - + diff --git a/docs/reference/xrnet.html b/docs/reference/xrnet.html index 3cb34a9..5f4cc94 100644 --- a/docs/reference/xrnet.html +++ b/docs/reference/xrnet.html @@ -1,137 +1,65 @@ - - - - - - - -Fit hierarchical regularized regression model — xrnet • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +implemented in the next major update.">Fit hierarchical regularized regression model — xrnet • xrnet + Skip to contents + +
    -
    - - -
    +
    +

    Examples

    +
    ### hierarchical regularized linear regression ###
    +data(GaussianExample)
    +
    +## define penalty for predictors and external variables
    +## default is ridge for predictors and lasso for external
    +## see define_penalty() function for more details
    +
    +penMain <- define_penalty(0, num_penalty = 20)
    +penExt <- define_penalty(1, num_penalty = 20)
    +
    +## fit model with defined regularization
    +fit_xrnet <- xrnet(
    +  x = x_linear,
    +  y = y_linear,
    +  external = ext_linear,
    +  family = "gaussian",
    +  penalty_main = penMain,
    +  penalty_external = penExt
    +)
    +
    +
    +
    -
    - +
    + - - - + diff --git a/docs/reference/xrnet_control.html b/docs/reference/xrnet_control.html index 353714d..8ee74ea 100644 --- a/docs/reference/xrnet_control.html +++ b/docs/reference/xrnet_control.html @@ -1,217 +1,142 @@ - - - - - - +Control function for xrnet fitting — xrnet_control • xrnet + Skip to contents + -Control function for xrnet fitting — xrnet_control • xrnet +
    +
    +
    - - - +
    +

    Control function for xrnet fitting.

    +
    - - +
    +

    Usage

    +
    xrnet_control(
    +  tolerance = 1e-08,
    +  max_iterations = 1e+05,
    +  dfmax = NULL,
    +  pmax = NULL,
    +  lower_limits = NULL,
    +  upper_limits = NULL
    +)
    +
    - - - +
    +

    Arguments

    +
    tolerance
    +

    positive convergence criterion. Default is 1e-08.

    - - - +
    max_iterations
    +

    maximum number of iterations to run coordinate +gradient descent across all penalties before returning an error. +Default is 1e+05.

    +
    dfmax
    +

    maximum number of variables allowed in model. Default is +\(ncol(x) + ncol(unpen) + ncol(external) + intercept[1] + intercept[2]\).

    - - - +
    pmax
    +

    maximum number of variables with nonzero coefficient estimate. +Default is \(min(2 * dfmax + 20, ncol(x) + ncol(unpen) + ncol(external) ++ intercept[2])\).

    +
    lower_limits
    +

    vector of lower limits for each coefficient. Default is +-Inf for all variables.

    - - - - - - - - - - -
    -
    - - - +
    +
    +

    Value

    + - - -
    -
    - - -
    -

    Control function for xrnet fitting.

    -
    +

    A list object with the following components:

    +
    tolerance
    +

    The coordinate descent stopping criterion.

    -
    xrnet_control(
    -  tolerance = 1e-08,
    -  max_iterations = 1e+05,
    -  dfmax = NULL,
    -  pmax = NULL,
    -  lower_limits = NULL,
    -  upper_limits = NULL
    -)
    - -

    Arguments

    - - - - - - - - - - - - - - - - - - - - - - - - - - -
    tolerance

    positive convergence criterion. Default is 1e-08.

    max_iterations

    maximum number of iterations to run coordinate -gradient descent across all penalties before returning an error. -Default is 1e+05.

    dfmax

    maximum number of variables allowed in model. Default is -\(ncol(x) + ncol(unpen) + ncol(external) + intercept[1] + intercept[2]\).

    pmax

    maximum number of variables with nonzero coefficient estimate. -Default is \(min(2 * dfmax + 20, ncol(x) + ncol(unpen) + ncol(external) -+ intercept[2])\).

    lower_limits

    vector of lower limits for each coefficient. Default is --Inf for all variables.

    upper_limits

    vector of upper limits for each coefficient. Default is -Inf for all variables.

    - -

    Value

    - -

    A list object with the following components:

    -
    tolerance

    The coordinate descent stopping criterion.

    -
    dfmax

    The maximum number of variables that will be allowed in the +

    dfmax
    +

    The maximum number of variables that will be allowed in the model.

    -
    pmax

    The maximum number of variables with nonzero coefficient + +

    pmax
    +

    The maximum number of variables with nonzero coefficient estimate.

    -
    lower_limits

    Feature-specific numeric vector of lower bounds for + +

    lower_limits
    +

    Feature-specific numeric vector of lower bounds for coefficient estimates

    -
    upper_limits

    Feature-specific numeric vector of upper bounds for + +

    upper_limits
    +

    Feature-specific numeric vector of upper bounds for coefficient estimates

    +
    -
    - -
    +
    -
    - +
    + - - - + diff --git a/docs/reference/y_linear.html b/docs/reference/y_linear.html index 635f661..4f5e10e 100644 --- a/docs/reference/y_linear.html +++ b/docs/reference/y_linear.html @@ -1,165 +1,81 @@ - - - - - - - -Simulated outcome data — y_linear • xrnet - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Simulated outcome data — y_linear • xrnet + Skip to contents + +
    -
    - +
    -
    + - - - +