diff --git a/docs/404.html b/docs/404.html new file mode 100644 index 00000000..59505782 --- /dev/null +++ b/docs/404.html @@ -0,0 +1,133 @@ + + + + + + + + +Page not found (404) • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +Content not found. Please use links in the navbar. + +
+ +
+ + + + +
+ + + + + + + + diff --git a/docs/authors.html b/docs/authors.html index e19e5dc1..a1060351 100644 --- a/docs/authors.html +++ b/docs/authors.html @@ -8,21 +8,25 @@ Authors • soilDB + + - + + - - + + + @@ -30,10 +34,12 @@ + + @@ -44,6 +50,7 @@ + @@ -60,7 +67,7 @@ soilDB - 2.3.9 + 2.5 @@ -68,7 +75,7 @@ - @@ -89,6 +95,7 @@ +
@@ -117,19 +124,23 @@

Authors

+ + + diff --git a/docs/index.html b/docs/index.html index c1236055..7aca83e5 100644 --- a/docs/index.html +++ b/docs/index.html @@ -7,8 +7,9 @@ Soil Database Interface • soilDB - - + + + @@ -30,7 +31,7 @@ soilDB - 2.3.9 + 2.5 @@ -38,7 +39,7 @@

Author

D.E. Beaudette

+ + + diff --git a/docs/reference/SSURGO_spatial_query.html b/docs/reference/SSURGO_spatial_query.html index bb9ae87a..582d0d8f 100644 --- a/docs/reference/SSURGO_spatial_query.html +++ b/docs/reference/SSURGO_spatial_query.html @@ -8,21 +8,25 @@ Get SSURGO Data via Spatial Query — SoilWeb_spatial_query • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Get SSURGO Data via Spatial Query

-

Get SSURGO Data via Spatial Query to SoilWeb

-
SoilWeb_spatial_query(bbox = NULL, coords = NULL, what = "mapunit", source = "soilweb")
- +

Arguments

@@ -130,41 +134,179 @@

Arg

the data source, currently ignored

- +

Note

This function should be considered experimental; arguments, results, and side-effects could change at any time. SDA now supports spatial queries; consider using SDA_query_features instead.

-

Details

Data are currently available from SoilWeb. These data are a snapshot of the "official" data. The snapshot date is encoded in the "soilweb_last_update" column in the function return value. Planned updates to this function will include a switch to determine the data source: "official" data via USDA-NRCS servers, or a "snapshot" via SoilWeb.

-

Value

The data returned from this function will depend on the query style. See examples below.

-

Examples

-
# query by bbox -
# NOT RUN { -SoilWeb_spatial_query(bbox=c(-122.05, 37, -122, 37.05)) -# }
-# query by coordinate pair -
# NOT RUN { -SoilWeb_spatial_query(coords=c(-121, 38)) -# }
+
# \donttest{ +# query by bbox +SoilWeb_spatial_query(bbox=c(-122.05, 37, -122, 37.05))
#> area_ac areasymbol mukey musym +#> 1 1123.9 ca087 455958 182 +#> 2 749.0 ca087 455891 115 +#> 3 586.7 ca087 455889 113 +#> 4 523.9 ca087 455920 144 +#> 5 459.8 ca087 455934 158 +#> 6 267.7 ca087 455935 159 +#> 7 231.4 ca087 455921 145 +#> 8 175.0 ca087 455947 171 +#> 9 165.9 ca087 455894 118 +#> 10 150.3 ca087 455909 133 +#> 11 149.7 ca087 455918 142 +#> 12 131.9 ca087 455955 179 +#> 13 117.3 ca087 455919 143 +#> 14 107.1 ca087 455953 177 +#> 15 84.9 ca087 455936 160 +#> 16 83.8 ca087 455893 117 +#> 17 83.8 ca087 455927 151 +#> 18 73.9 ca087 455940 164 +#> 19 73.0 ca087 455886 110 +#> 20 72.3 ca087 455960 184 +#> 21 66.2 ca087 455950 174 +#> 22 57.3 ca087 455949 173 +#> 23 54.5 ca087 455959 183 +#> 24 47.2 ca087 455951 175 +#> 25 43.7 ca087 2833423 130 +#> 26 43.2 ca087 455892 116 +#> 27 37.4 ca087 2833403 131 +#> 28 33.6 ca087 455924 148 +#> 29 31.2 ca087 455882 106 +#> 30 28.2 ca087 455877 101 +#> 31 26.6 ca087 455881 105 +#> 32 25.6 ca087 455876 100 +#> 33 25.3 ca087 455901 125 +#> 34 22.2 ca087 455954 178 +#> 35 19.8 ca087 455874 185 +#> 36 19.1 ca087 455933 157 +#> 37 17.2 ca087 455912 136 +#> 38 16.8 ca087 455890 114 +#> 39 15.4 ca087 455911 135 +#> 40 14.2 ca087 455880 104 +#> 41 11.7 ca087 455887 111 +#> 42 7.0 ca087 455915 139 +#> 43 5.9 ca087 455946 170 +#> 44 5.0 ca087 455922 146 +#> 45 4.7 ca087 455956 180 +#> 46 3.5 ca087 455910 134 +#> 47 3.0 ca087 455941 165 +#> 48 1.4 ca087 455948 172 +#> muname +#> 1 Zayante coarse sand, 5 to 30 percent slopes +#> 2 Ben Lomond-Felton complex, 50 to 75 percent slopes +#> 3 Ben Lomond-Catelli-Sur complex, 30 to 75 percent slopes +#> 4 Lompico-Felton complex, 50 to 75 percent slopes, MLRA 4B +#> 5 Nisene-Aptos complex, 50 to 75 percent slopes +#> 6 Pfeiffer gravelly sandy loam, 15 to 30 percent slopes +#> 7 Lompico variant loam, 5 to 30 percent slopes +#> 8 Soquel loam, 2 to 9 percent slopes +#> 9 Bonnydoon-Rock outcrop complex, 50 to 85 percent slopes +#> 10 Elkhorn sandy loam, 2 to 9 percent slopes +#> 11 
Lompico-Felton complex, 5 to 30 percent slopes +#> 12 Watsonville loam, thick surface, 2 to 15 percent slopes +#> 13 Lompico-Felton complex, 30 to 50 percent slopes, MLRA 4B +#> 14 Watsonville loam, 2 to 15 percent slopes +#> 15 Pfeiffer gravelly sandy loam, 30 to 50 percent slopes +#> 16 Bonnydoon loam, 30 to 50 percent slopes +#> 17 Maymen stony loam, 30 to 75 percent slopes +#> 18 Pits-Dumps complex +#> 19 Ben Lomond sandy loam, 5 to 15 percent slopes +#> 20 Zayante-Rock outcrop complex, 15 to 75 percent slopes +#> 21 Tierra-Watsonville complex, 15 to 30 percent slopes +#> 22 Sur-Catelli complex, 50 to 75 percent slopes +#> 23 Zayante coarse sand, 30 to 50 percent slopes +#> 24 Tierra-Watsonville complex, 30 to 50 percent slopes +#> 25 Elder sandy loam, 2 to 9 percent slopes, MLRA 14 +#> 26 Bonnydoon loam, 5 to 50 percent slopes, MLRA 4B +#> 27 Elder sandy loam, 9 to 15 percent slopes, MLRA 14 +#> 28 Los Osos loam, 30 to 50 percent slopes, moist +#> 29 Baywood loamy sand, 15 to 30 percent slopes +#> 30 Aptos loam, warm, 30 to 50 percent slopes +#> 31 Baywood loamy sand, 2 to 15 percent slopes +#> 32 Aptos loam, warm, 15 to 30 percent slopes +#> 33 Danville loam, 2 to 9 percent slopes +#> 34 Watsonville loam, thick surface, 0 to 2 percent slopes +#> 35 Water +#> 36 Nisene-Aptos complex, 30 to 50 percent slopes +#> 37 Elkhorn-Pfeiffer complex, 30 to 50 percent slopes +#> 38 Ben Lomond-Felton complex, 30 to 50 percent slopes +#> 39 Elkhorn sandy loam, 15 to 30 percent slopes +#> 40 Baywood loamy sand, 0 to 2 percent slopes +#> 41 Ben Lomond sandy loam, 15 to 50 percent slopes +#> 42 Fluvaquentic Haploxerolls-Aquic Xerofluvents complex, 0 to 15 percent slopes +#> 43 Soquel loam, 0 to 2 percent slopes +#> 44 Los Osos loam, 5 to 15 percent slopes +#> 45 Watsonville loam, thick surface, 15 to 30 percent slope s +#> 46 Elkhorn sandy loam, 9 to 15 percent slopes +#> 47 Riverwash +#> 48 Soquel loam, 9 to 15 percent slopes +#> soilweb_last_update +#> 1 2019-09-16 +#> 2 
2019-09-16 +#> 3 2019-09-16 +#> 4 2019-09-16 +#> 5 2019-09-16 +#> 6 2019-09-16 +#> 7 2019-09-16 +#> 8 2019-09-16 +#> 9 2019-09-16 +#> 10 2019-09-16 +#> 11 2019-09-16 +#> 12 2019-09-16 +#> 13 2019-09-16 +#> 14 2019-09-16 +#> 15 2019-09-16 +#> 16 2019-09-16 +#> 17 2019-09-16 +#> 18 2019-09-16 +#> 19 2019-09-16 +#> 20 2019-09-16 +#> 21 2019-09-16 +#> 22 2019-09-16 +#> 23 2019-09-16 +#> 24 2019-09-16 +#> 25 2019-09-16 +#> 26 2019-09-16 +#> 27 2019-09-16 +#> 28 2019-09-16 +#> 29 2019-09-16 +#> 30 2019-09-16 +#> 31 2019-09-16 +#> 32 2019-09-16 +#> 33 2019-09-16 +#> 34 2019-09-16 +#> 35 2019-09-16 +#> 36 2019-09-16 +#> 37 2019-09-16 +#> 38 2019-09-16 +#> 39 2019-09-16 +#> 40 2019-09-16 +#> 41 2019-09-16 +#> 42 2019-09-16 +#> 43 2019-09-16 +#> 44 2019-09-16 +#> 45 2019-09-16 +#> 46 2019-09-16 +#> 47 2019-09-16 +#> 48 2019-09-16
+# query by coordinate pair +SoilWeb_spatial_query(coords=c(-121, 38))
#> ogc_fid areasymbol musym mukey soilweb_last_update dist_meters +#> 1 1479147 ca077 220 462112 2019-09-16 92.17456302
# } +
+ + + diff --git a/docs/reference/STRplot.html b/docs/reference/STRplot.html index 9ca1d9e1..5b99f7b8 100644 --- a/docs/reference/STRplot.html +++ b/docs/reference/STRplot.html @@ -8,21 +8,25 @@ Graphical Description of US Soil Taxonomy Soil Temperature Regimes — STRplot • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Graphical Description of US Soil Taxonomy Soil Temperature Regimes

-

Graphical Description of US Soil Taxonomy Soil Temperature Regimes

-
STRplot(mast, msst, mwst, permafrost = FALSE, pt.cex = 2.75, leg.cex = 0.85)
- +

Arguments

@@ -138,35 +142,28 @@

Arg

legend size

- +

Details

Related tutorial.

-

References

Soil Survey Staff. 2015. Illustrated guide to soil taxonomy. U.S. Department of Agriculture, Natural Resources Conservation Service, National Soil Survey Center, Lincoln, Nebraska.

-

See also

estimateSTR

-

Examples

-
par(mar=c(4,1,0,1)) +
par(mar=c(4,1,0,1)) STRplot(mast = 0:25, msst = 10, mwst = 1)
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/estimateSTR.html b/docs/reference/estimateSTR.html index 3aea579b..a6f53962 100644 --- a/docs/reference/estimateSTR.html +++ b/docs/reference/estimateSTR.html @@ -8,21 +8,25 @@ Estimate Soil Temperature Regime — estimateSTR • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Estimate Soil Temperature Regime

-

Estimate soil temperature regime (STR) based on mean annual soil temperature (MAST), mean summer temperature (MSST), mean winter soil temperature (MWST), presence of O horizons, saturated conditions, and presence of permafrost. Several assumptions are made when O horizon or saturation status is undefined.

-
estimateSTR(mast, mean.summer, mean.winter, O.hz = NA, saturated = NA, permafrost = FALSE)
- +

Arguments

@@ -138,24 +142,20 @@

Arg

logical vector of permafrost presence / absence

- +

Details

Pending.

Related tutorial.

-

Value

Vector of soil temperature regimes.

-

References

Soil Survey Staff. 2015. Illustrated guide to soil taxonomy. U.S. Department of Agriculture, Natural Resources Conservation Service, National Soil Survey Center, Lincoln, Nebraska.

-

See also

-

Examples

# simple example @@ -167,15 +167,10 @@

Examp

Contents

@@ -184,19 +179,23 @@

Author

+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchHenry-1.png b/docs/reference/fetchHenry-1.png new file mode 100644 index 00000000..a6d2149c Binary files /dev/null and b/docs/reference/fetchHenry-1.png differ diff --git a/docs/reference/fetchHenry.html b/docs/reference/fetchHenry.html index 0643778d..c2340381 100644 --- a/docs/reference/fetchHenry.html +++ b/docs/reference/fetchHenry.html @@ -1,230 +1,227 @@ - - - - - - - - -Download Data from the Henry Mount Soil Temperature and Water Database — fetchHenry • soilDB - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - -
- -
-
- - -
- -

This function is a front-end to the REST query functionality of the Henry Mount Soil Temperature and Water Database.

- -
- -
fetchHenry(what='all', usersiteid = NULL, project = NULL, sso = NULL,
-gran = "day", start.date = NULL, stop.date = NULL,
-pad.missing.days = TRUE, soiltemp.summaries = TRUE)
- -

Arguments

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
what

type of data to return: 'sensors': sensor metadata only | 'soiltemp': sensor metadata + soil temperature data | 'soilVWC': sensor metadata + soil moisture data | 'airtemp': sensor metadata + air temperature data | 'waterlevel': sensor metadata + water level data |'all': sensor metadata + all sensor data

usersiteid

(optional) filter results using a NASIS user site ID

project

(optional) filter results using a project ID

sso

(optional) filter results using a soil survey office code

gran

data granularity: "day", "week", "month", "year"; returned data are averages

start.date

(optional) starting date filter

stop.date

(optional) ending date filter

pad.missing.days

should missing data ("day" granularity) be filled with NA? see details

soiltemp.summaries

should soil temperature ("day" granularity only) be summarized? see details

- -

Details

- -

Filling missing days with NA is useful for computing and index of how complete the data are, and for estimating (mostly) unbiased MAST and seasonal mean soil temperatures. Summaries are computed by first averaging over Julian day, then averaging over all days of the year (MAST) or just those days that occur within "summer" or "winter". This approach makes it possible to estimate summaries in the presence of missing data. The quality of summaries should be weighted by the number of "functional years" (number of years with non-missing data after combining data by Julian day) and "complete years" (number of years of data with >= 365 days of non-missing data).

- -

Value

- -

a list containing:

-
sensors

a SpatialPointsDataFrame object containing site-level information

-
soiltemp

a data.frame object containing soil temperature timeseries data

-
soilVWC

a data.frame object containing soil moisture timeseries data

-
airtemp

a data.frame object containing air temperature timeseries data

-
waterlevel

a data.frame object containing water level timeseries data

- - -

Note

- -

This function and the back-end database are very much a work in progress.

- -

See also

- - - - -

Examples

-
# NOT RUN {
-library(lattice)
-
-# get CA630 data as daily averages
-x <- fetchHenry(project='CA630', gran = 'day')
-
-# inspect data gaps
-levelplot(factor(!is.na(sensor_value)) ~ doy * factor(year) | name,
-data=x$soiltemp, col.regions=c('grey', 'RoyalBlue'), cuts=1,
-colorkey=FALSE, as.table=TRUE, scales=list(alternating=3),
-par.strip.text=list(cex=0.75), strip=strip.custom(bg='yellow'),
-xlab='Julian Day', ylab='Year')
-# }
-
- -
- -
- - -
-

Site built with pkgdown 1.3.0.

-
-
-
- - - - - - + + + + + + + + +Download Data from the Henry Mount Soil Temperature and Water Database — fetchHenry • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

This function is a front-end to the REST query functionality of the Henry Mount Soil Temperature and Water Database.

+
+ +
fetchHenry(what='all', usersiteid = NULL, project = NULL, sso = NULL,
+gran = "day", start.date = NULL, stop.date = NULL,
+pad.missing.days = TRUE, soiltemp.summaries = TRUE)
+ +

Arguments

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
what

type of data to return: 'sensors': sensor metadata only | 'soiltemp': sensor metadata + soil temperature data | 'soilVWC': sensor metadata + soil moisture data | 'airtemp': sensor metadata + air temperature data | 'waterlevel': sensor metadata + water level data | 'all': sensor metadata + all sensor data

usersiteid

(optional) filter results using a NASIS user site ID

project

(optional) filter results using a project ID

sso

(optional) filter results using a soil survey office code

gran

data granularity: "day", "week", "month", "year"; returned data are averages

start.date

(optional) starting date filter

stop.date

(optional) ending date filter

pad.missing.days

should missing data ("day" granularity) be filled with NA? see details

soiltemp.summaries

should soil temperature ("day" granularity only) be summarized? see details

+ +

Details

+ +

Filling missing days with NA is useful for computing an index of how complete the data are, and for estimating (mostly) unbiased MAST and seasonal mean soil temperatures. Summaries are computed by first averaging over Julian day, then averaging over all days of the year (MAST) or just those days that occur within "summer" or "winter". This approach makes it possible to estimate summaries in the presence of missing data. The quality of summaries should be weighted by the number of "functional years" (number of years with non-missing data after combining data by Julian day) and "complete years" (number of years of data with >= 365 days of non-missing data).

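The averaging scheme described above can be sketched directly from the returned daily data. This is a minimal, hypothetical sketch (not part of the package), assuming `x` is the result of `fetchHenry(..., gran = 'day')` and using the `x$soiltemp` columns (`name`, `doy`, `sensor_value`) that appear in the example below:

```r
# sketch: per-sensor MAST estimate from daily data
# average over Julian day first, then over the daily means, so that
# unevenly distributed missing days do not bias the annual summary
mast.by.sensor <- sapply(split(x$soiltemp, x$soiltemp$name), function(d) {
  daily.means <- tapply(d$sensor_value, d$doy, mean, na.rm = TRUE)
  mean(daily.means, na.rm = TRUE)
})
```

Note that fetchHenry() already reports equivalent un-biased summaries when soiltemp.summaries = TRUE; this sketch only illustrates the approach.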
+

Value

+ +

a list containing:

+
sensors

a SpatialPointsDataFrame object containing site-level information

+
soiltemp

a data.frame object containing soil temperature timeseries data

+
soilVWC

a data.frame object containing soil moisture timeseries data

+
airtemp

a data.frame object containing air temperature timeseries data

+
waterlevel

a data.frame object containing water level timeseries data

+ +

Note

+ +

This function and the back-end database are very much a work in progress.

+

See also

+ + + +

Examples

+
# \donttest{ +library(lattice)
#> Warning: package 'lattice' was built under R version 3.5.3
+# get CA630 data as daily averages +x <- fetchHenry(project='CA630', gran = 'day')
#> computing un-biased soil temperature summaries
#> |======================================================================| 100%
#> 32 sensors loaded (3.72 Mb transferred)
+# inspect data gaps +levelplot(factor(!is.na(sensor_value)) ~ doy * factor(year) | name, +data=x$soiltemp, col.regions=c('grey', 'RoyalBlue'), cuts=1, +colorkey=FALSE, as.table=TRUE, scales=list(alternating=3), +par.strip.text=list(cex=0.75), strip=strip.custom(bg='yellow'), +xlab='Julian Day', ylab='Year')
# } +
+
+ +
+ + +
+ + +
+

Site built with pkgdown 1.4.1.

+
+ +
+
+ + + + + + + + diff --git a/docs/reference/fetchKSSL-1.png b/docs/reference/fetchKSSL-1.png new file mode 100644 index 00000000..fa9a7603 Binary files /dev/null and b/docs/reference/fetchKSSL-1.png differ diff --git a/docs/reference/fetchKSSL-2.png b/docs/reference/fetchKSSL-2.png new file mode 100644 index 00000000..72c1ad29 Binary files /dev/null and b/docs/reference/fetchKSSL-2.png differ diff --git a/docs/reference/fetchKSSL-3.png b/docs/reference/fetchKSSL-3.png new file mode 100644 index 00000000..a4045a89 Binary files /dev/null and b/docs/reference/fetchKSSL-3.png differ diff --git a/docs/reference/fetchKSSL.html b/docs/reference/fetchKSSL.html index 958dcb32..02400574 100644 --- a/docs/reference/fetchKSSL.html +++ b/docs/reference/fetchKSSL.html @@ -8,21 +8,25 @@ Fetch KSSL Data — fetchKSSL • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,14 +109,12 @@

Fetch KSSL Data

-

Get soil characterization and morphologic data via BBOX, MLRA, or series name query, from the KSSL database.

-
fetchKSSL(series=NULL, bbox=NULL, mlra=NULL, pedlabsampnum=NULL,
 pedon_id=NULL, pedon_key=NULL, returnMorphologicData=FALSE, simplifyColors=FALSE)
- +

Arguments

@@ -120,7 +124,7 @@

Arg

- + @@ -147,7 +151,7 @@

Arg

bbox

a bounding box in WGS84 geographic coordinates e.g. c(-120, 37, -122, 38)

a bounding box in WGS84 geographic coordinates e.g. c(-120, 37, -122, 38)

mlra

simplify colors (from morphologic data) and join with horizon data

- +

Details

This is an experimental interface to a subset of the most commonly used data from a snapshot of KSSL (lab characterization) and NASIS (morphologic) data. The snapshots were last updated September 2018 (KSSL / NASIS).

@@ -156,106 +160,87 @@

Details

Setting simplifyColors=TRUE will automatically flatten the soil color data and join to horizon level attributes.

Function arguments (series, mlra, etc.) are NOT vectorized: the first element of a vector will be used when supplied as a filter. See the fetchKSSL tutorial for ideas on how to iterate over a set of IDs.

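Since filters are not vectorized, iteration over several IDs can be sketched with lapply(). This is an illustrative sketch only: the series names are examples, and combining the results uses aqp::union() (superseded by aqp::combine() in newer aqp versions):

```r
# sketch: query several series one at a time, then combine the
# resulting SoilProfileCollection objects
series.list <- c('auburn', 'amador', 'pentz')
res <- lapply(series.list, function(s) fetchKSSL(series = s))

# combine into a single SoilProfileCollection
# (use aqp::combine(res) with recent aqp versions)
spc <- aqp::union(res)
```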
-

Value

a SoilProfileCollection object when returnMorphologicData is FALSE, otherwise a list.

-

Note

SoilWeb maintains a snapshot of these KSSL and NASIS data. The SoilWeb snapshot was developed using methods described here: https://github.com/dylanbeaudette/process-kssl-snapshot. Please use the link below for the live data.

-

References

http://ncsslabdatamart.sc.egov.usda.gov/

-

See also

-

Examples

-
# NOT RUN {
+    
# \donttest{ # search by series name -s <- fetchKSSL(series='auburn') - +s <- fetchKSSL(series='auburn')
#> 14 pedons loaded (0.06 Mb transferred)
# search by bounding-box # s <- fetchKSSL(bbox=c(-120, 37, -122, 38)) # how many pedons -length(s) - +length(s)
#> [1] 14
# plot -par(mar=c(0,0,0,0)) -plot(s, name='hzn_desgn', max.depth=150) - -## +if(requireNamespace("sp")) { + par(mar=c(0,0,0,0)) + sp::plot(s, name='hzn_desgn', max.depth=150) +}
## ## morphologic data ## -library(soilDB) -library(aqp) -library(plyr) -library(reshape2) - +library(soilDB) +library(aqp)
#> This is aqp 1.19
#> +#> Attaching package: 'aqp'
#> The following object is masked from 'package:base': +#> +#> union
library(plyr)
#> Warning: package 'plyr' was built under R version 3.5.3
library(reshape2)
#> Warning: package 'reshape2' was built under R version 3.5.3
# get lab and morphologic data -s <- fetchKSSL(series='auburn', returnMorphologicData = TRUE) - +s <- fetchKSSL(series='auburn', returnMorphologicData = TRUE)
#> 14 pedons loaded (0.09 Mb transferred)
# extract SPC pedons <- s$SPC ## simplify color data manually -s.colors <- simplifyColorData(s$morph$phcolor, id.var = 'labsampnum', wt='colorpct') - +s.colors <- simplifyColorData(s$morph$phcolor, id.var = 'labsampnum', wt='colorpct')
#> mixing dry colors ... [2 of 53 horizons]
#> Loading required namespace: farver
#> mixing moist colors ... [4 of 52 horizons]
# merge color data into SPC -h <- horizons(pedons) -h <- join(h, s.colors, by='labsampnum', type='left', match='first') -horizons(pedons) <- h +h <- horizons(pedons) +h <- join(h, s.colors, by='labsampnum', type='left', match='first') +horizons(pedons) <- h # check -par(mar=c(0,0,0,0)) -plot(pedons, color='moist_soil_color', print.id=FALSE) - +par(mar=c(0,0,0,0)) +plot(pedons, color='moist_soil_color', print.id=FALSE)
#> unable to guess column containing horizon designations
## automatically simplify color data -s <- fetchKSSL(series='auburn', returnMorphologicData = TRUE, simplifyColors=TRUE) - +s <- fetchKSSL(series='auburn', returnMorphologicData = TRUE, simplifyColors=TRUE)
#> mixing dry colors ... [2 of 53 horizons]
#> mixing moist colors ... [4 of 52 horizons]
#> 14 pedons loaded (0.1 Mb transferred)
# check -par(mar=c(0,0,0,0)) -plot(pedons, color='moist_soil_color', print.id=FALSE) - +par(mar=c(0,0,0,0)) +plot(pedons, color='moist_soil_color', print.id=FALSE)
#> unable to guess column containing horizon designations
# simplify fragment data s.frags <- simplifyFragmentData(s$morph$phfrags, id.var='labsampnum') # merge fragment data into SPC -h <- horizons(pedons) -h <- join(h, s.frags, by='labsampnum', type='left', match='first') -horizons(pedons) <- h +h <- horizons(pedons) +h <- join(h, s.frags, by='labsampnum', type='left', match='first') +horizons(pedons) <- h # check -par(mar=c(0,0,3,0)) -plot(pedons, color='total_frags_pct', print.id=FALSE) - - -# }
+par(mar=c(0,0,3,0)) +plot(pedons, color='total_frags_pct', print.id=FALSE)
#> unable to guess column containing horizon designations
+# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchNASIS-1.png b/docs/reference/fetchNASIS-1.png new file mode 100644 index 00000000..35caeb30 Binary files /dev/null and b/docs/reference/fetchNASIS-1.png differ diff --git a/docs/reference/fetchNASIS.html b/docs/reference/fetchNASIS.html index 691af697..ca216480 100644 --- a/docs/reference/fetchNASIS.html +++ b/docs/reference/fetchNASIS.html @@ -6,23 +6,27 @@ -Fetch commonly used site/pedon/horizon or component data from a local NASIS database. — fetchNASIS • soilDB +Fetch commonly used site/pedon/horizon or component data from NASIS. — fetchNASIS • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - - + + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,52 +97,43 @@ +
- -

Fetch commonly used site/pedon/horizon data or component from a local NASIS database, return as a SoilProfileCollection object.

- +

Fetch commonly used site/pedon/horizon or component data from NASIS, returned as a SoilProfileCollection object.

-
fetchNASIS(from = 'pedons', ...)
-
-fetchNASIS_pedons(SS=TRUE, rmHzErrors=TRUE, nullFragsAreZero=TRUE,
-                  soilColorState='moist', lab=FALSE,
-                  stringsAsFactors = default.stringsAsFactors()
+    
fetchNASIS(from = 'pedons', url = NULL, SS=TRUE, rmHzErrors=TRUE, nullFragsAreZero=TRUE,
+                  soilColorState='moist', lab=FALSE, fill = FALSE,
+                  stringsAsFactors = default.stringsAsFactors()
                   )
-fetchNASIS_components(SS=TRUE, rmHzErrors=TRUE, fill = FALSE,
-                      stringsAsFactors = default.stringsAsFactors()
-                      )
+
 getHzErrorsNASIS(strict=TRUE)
- +

Arguments

- + - - + + - - - - @@ -160,76 +156,108 @@

Arg

- +
from

determines what objects should fetched? ('pedons' | 'components')

determines what objects should be fetched ('pedons' | 'components' | 'pedon_report')

arguments passed to fetchNASIS_pedons() or fetchNASIS_components()

url

string specifying the url for the NASIS pedon_report (default: NULL)

SS

fetch data from the currently loaded selected set in NASIS or from the entire local database (default: TRUE)

drop.unused.levels

logical: indicating whether to drop unused levels in classifying factors. This is useful when a class has a large number of unused levels, which can waste space in tables and figures.

stringsAsFactors

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

fill

(fetchNASIS_components only: include components without horizon data? (default: FALSE)

fetchNASIS(from='components') only: include component records without horizon data in result? (default: FALSE)

strict

how strict should horizon boundaries be checked for consistency: TRUE=more | FALSE=less

- +

Value

a SoilProfileCollection class object

-

Details

-

The value of nullFragsAreZero will have a significant impact on the rock fragment fractions returned by fetchNASIS. Set nullFragsAreZero = FALSE in those cases where there are many data-gaps and NULL rock fragment values should be interpretated as NULLs. Set nullFragsAreZero = TRUE in those cases where NULL rock fragment values should be interpreted as 0.

+

This function imports data from NASIS into R as a SoilProfileCollection object, an S4 class defined by the aqp R package. It flattens NASIS's pedon and component tables, including their various child tables, into several more easily manageable data frames. Primarily these functions access the local NASIS database using an ODBC connection. However, using the fetchNASIS() argument from = "pedon_report", data can be read from the NASIS Report 'fetchNASIS', as either a txt file or url. The primary purpose of fetchNASIS(from = "pedon_report") is to facilitate importing datasets larger than 8,000 pedons/components.

+

The value of nullFragsAreZero will have a significant impact on the rock fragment fractions returned by fetchNASIS. Set nullFragsAreZero = FALSE in those cases where there are many data-gaps and NULL rock fragment values should be interpreted as NULLs. Set nullFragsAreZero = TRUE in those cases where NULL rock fragment values should be interpreted as 0.

This function attempts to do most of the boilerplate work when extracting site/pedon/horizon or component data from a local NASIS database. Pedons that are missing horizon data, or have errors in their horizonation, are excluded from the returned object; however, their IDs are printed on the console. Pedons with combination horizons (e.g. B/C) are erroneously marked as errors due to the way in which they are stored in NASIS as two overlapping horizon records.

See getHzErrorsNASIS for a simple approach to identifying pedons with problematic horizonation.

See the NASIS component tutorial and NASIS pedon tutorial for more information.
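A minimal sketch contrasting the two settings of nullFragsAreZero, and checking for horizonation problems. This is hypothetical: it assumes a working 'nasis_local' ODBC connection with pedon data in the selected set, and the `total_frags_pct` column name used in the examples:

```r
# sketch only: requires a local NASIS database + 'nasis_local' ODBC connection
library(soilDB)

# NULL rock fragment volumes interpreted as NA (data gaps preserved)
f.na   <- fetchNASIS(from = 'pedons', nullFragsAreZero = FALSE)

# NULL rock fragment volumes interpreted as 0
f.zero <- fetchNASIS(from = 'pedons', nullFragsAreZero = TRUE)

# compare summaries of total rock fragment volume
summary(f.na$total_frags_pct)
summary(f.zero$total_frags_pct)

# list pedons with inconsistent horizon depths
bad <- getHzErrorsNASIS(strict = TRUE)
head(bad)
```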

- -

Note

- -

This function currently works only on Windows, and requires a 'nasis_local' ODBC connection.

-

Examples

-
# NOT RUN {
-# query depends on some pedon data, queried against the national database
-# note that you must setup this connection ahead of time
-f <- fetchNASIS(from = 'pedons')
-
-# plot only those profiles with densic contact
-plot(f[which(f$densic.contact), ], name='hzname')
-
-# get basic component data from local NASIS, after performing a 
-# DMU-* query against the national database
-fc <- fetchNASIS(from = 'components')
-# }
+
# \donttest{ +# check required packages +if(require("aqp") & requireNamespace("RODBC")) { + + # test that NASIS db connection is set up + # note that you must set up this connection ahead of time + # see inst/doc/setup_ODBC_local_NASIS.pdf + if(any(grepl(names(RODBC::odbcDataSources()), pattern="nasis_local"))) { + + ## 1. fetchNASIS(from='pedon') NASIS setup + # query depends on some pedon data in your selected set + + f <- try(fetchNASIS(from = 'pedons')) + # note: wrap in try() to capture error in case of empty selected set + + # plot only those profiles with densic contact + if(!inherits(f,'try-error')) { + + # which pedons have densic.contact==TRUE + idx <- which(f$densic.contact) + + # if there are any pedons with densic contacts, plot them + if(length(idx)) + plot(f[idx, ], name='hzname') + + } else { message(f[1]) } + + ## 2. fetchNASIS(from='component') NASIS setup: + # perform a DMU-* query against the national database + + fc <- try(fetchNASIS(from = 'components')) + # note: wrap in try() to capture error in case of empty selected set + + ## 3. fetchNASIS(from='pedon_report') NASIS setup: + # run the 11-IND NASIS report 'fetchNASIS' against the national database + # the result will automatically be opened and saved as fetchNASIS.txt + # in NASIS Temp folder + + # the fetchNASIS.txt file is read by fetchNASIS(from = 'pedon_report') + # alternate: run offline against national db and supply `url` argument + try(f <- fetchNASIS(from = 'pedon_report')) + # note: wrap in try() to capture error in case of empty selected set + } +}
#> Loading required namespace: RODBC
#> multiple horizontal datums present, consider using WGS84 coordinates (x_std, y_std)
#> NOTICE: multiple `labsampnum` values / horizons; see pedon IDs: +#> 1981IL115059,1990IL203027,77KY047001,C1805P01-1,S2003IL111005,S2006SC085002,S2011MI117102,V1985-VA023-309,V1985-VA023-311,V1985-VA023-312,V1985-VA023-314,V1985-VA023-317,V1985-VA023-319,V1985-VA161-328,V1985-VA161-332,V1985-VA161-334,V1985-VA163-321,V1985-VA163-322,V1985-VA163-326,V1985-VA173-337,V1985-VA173-339,V1985-VA173-342,V1985-VA173-343,V1985-VA173-345,V1985-VA173-346,V1985-VA191-349,V1985-VA191-350,V1985-VA191-356,V1985-VA197-341,V1985-VA197-360,V1985-VA197-361,V1985-VA770-329,V1985-VA770-333,V1985-VA775-335
#> mixing dry colors ... [10 of 387 horizons]
#> mixing moist colors ... [319 of 3436 horizons]
#> Warning: some records are missing rock fragment volume, these have been removed
#> -> QC: some fragsize_h values == 76mm, may be mis-classified as cobbles [91 / 3505 records]
#> Warning: some records are missing artifact volume, these have been removed
#> Warning: all records are missing artifact volume (NULL). buffering result with NA. will be converted to zero if nullFragsAreZero = TRUE.
#> replacing missing lower horizon depths with top depth + 1cm ... [20 horizons]
#> top/bottom depths equal, adding 1cm to bottom depth ... [11 horizons]
#> -> QC: sites without pedons: use `get('sites.missing.pedons', envir=soilDB.env)` for related usersiteid values
#> -> QC: duplicate pedons: use `get('dup.pedon.ids', envir=soilDB.env)` for related peiid values
#> -> QC: horizon errors detected, use `get('bad.pedon.ids', envir=soilDB.env)` for related userpedonid values or `get('bad.horizons', envir=soilDB.env)` for related horizon designations
#> -> QC: pedons missing bottom hz depths: use `get('missing.bottom.depths', envir=soilDB.env)` for related pedon IDs
#> -> QC: equal hz top and bottom depths: use `get('top.bottom.equal', envir=soilDB.env)` for related pedon IDs
#> Warning: No horizon data in NASIS component query result.
#> Error in (function (classes, fdef, mtable) : +#> unable to find an inherited method for function 'site<-' for signature '"data.frame"' +#> Error in .fetchNASIS_report(url = url, rmHzErrors = rmHzErrors, nullFragsAreZero = nullFragsAreZero, : +#> the temp file C:/ProgramData/USDA/NASIS/Temp/fetchNASIS.txt +#> doesn't exist, please run the fetchNASIS report from NASIS
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchNASISLabData.html b/docs/reference/fetchNASISLabData.html index c9e64f58..1ea1972c 100644 --- a/docs/reference/fetchNASISLabData.html +++ b/docs/reference/fetchNASISLabData.html @@ -8,21 +8,25 @@ Fetch lab data used site/horizon data from a PedonPC database. — fetchNASISLabData • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,54 +109,72 @@

Fetch KSSL laboratory pedon/horizon data from a local NASIS database.

-

Fetch KSSL laboratory pedon/horizon layer data from a local NASIS database, returned as a SoilProfileCollection object.

-
-
fetchNASISLabData()
- -

Details

+
fetchNASISLabData(SS = TRUE)
+ +

Arguments

+ + + + + + +
SS

fetch data from the currently loaded selected set in NASIS or from the entire local database (default: TRUE)

-

This function currently works only on Windows, and requires a 'nasis_local' ODBC connection.

-

Value

a SoilProfileCollection class object

- +

Details

+ +

This function currently works only on Windows, and requires a 'nasis_local' ODBC connection.

Note

This function attempts to do most of the boilerplate work when extracting KSSL laboratory site/horizon data from a local NASIS database. Lab pedons that have errors in their horizonation are excluded from the returned object; however, their IDs are printed on the console. See getHzErrorsNASIS for a simple approach to identifying pedons with problematic horizonation.

-

See also

-

Examples

-
# NOT RUN {
-# query depends on some lab data, queried against the national database
-# note that you must setup this connection ahead of time
-# see inst/doc/setup_ODBC_local_NASIS.pdf
-f <- fetchNASISLabData()
-
-# plot only those profiles with densic contact
-#plot(f[which(f$densic.contact), ], name='hzname')
-
-# }
+
# \donttest{ + # check required packages + if(require(aqp) & requireNamespace("RODBC")) { + + # test that NASIS db connection is set up + # note that you must setup this connection ahead of time + # see inst/doc/setup_ODBC_local_NASIS.pdf + if(any(grepl(names(RODBC::odbcDataSources()), pattern="nasis_local"))) { + + # query depends on some lab data, queried against the national database + f <- try(fetchNASISLabData()) + # note: wrap in try in case no lab data in selected set + + # plot only those profiles with densic contact + if(!inherits(f,'try-error')) { + + # which pedons have densic.contact==TRUE + idx <- which(f$densic.contact) + + # if there are any pedons with densic contacts, plot them + if(length(idx)) + plot(f[idx, ], name='hzname') + + } else { message(f[1]) } + } + }
#> Error in fetchNASISLabData() : +#> Selected set is missing either the Pedon or Layer NCSS Lab Data table, please load and try again :)
#> Error in fetchNASISLabData() : +#> Selected set is missing either the Pedon or Layer NCSS Lab Data table, please load and try again :)
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchNASISWebReport-1.png b/docs/reference/fetchNASISWebReport-1.png new file mode 100644 index 00000000..cd01ac06 Binary files /dev/null and b/docs/reference/fetchNASISWebReport-1.png differ diff --git a/docs/reference/fetchNASISWebReport-2.png b/docs/reference/fetchNASISWebReport-2.png new file mode 100644 index 00000000..01e034bf Binary files /dev/null and b/docs/reference/fetchNASISWebReport-2.png differ diff --git a/docs/reference/fetchNASISWebReport-3.png b/docs/reference/fetchNASISWebReport-3.png new file mode 100644 index 00000000..f767d556 Binary files /dev/null and b/docs/reference/fetchNASISWebReport-3.png differ diff --git a/docs/reference/fetchNASISWebReport.html b/docs/reference/fetchNASISWebReport.html new file mode 100644 index 00000000..87e462a1 --- /dev/null +++ b/docs/reference/fetchNASISWebReport.html @@ -0,0 +1,309 @@ + + + + + + + + +Extract component tables from a the NASIS Web Reports — fetchNASISWebReport • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

Get, format, impute, and return component tables.

+
+ +
fetchNASISWebReport(projectname, rmHzErrors = FALSE, fill = FALSE,
+                    stringsAsFactors = default.stringsAsFactors()
+                    )
+get_progress_from_NASISWebReport(mlrassoarea, fiscalyear, projecttypename)
+get_project_from_NASISWebReport(mlrassoarea, fiscalyear)
+get_project_correlation_from_NASISWebReport(mlrassoarea, fiscalyear, projectname)
+get_projectmapunit_from_NASISWebReport(projectname,
+                                       stringsAsFactors = default.stringsAsFactors()
+                                       )
+get_projectmapunit2_from_NASISWebReport(mlrassoarea, fiscalyear, projectname,
+                                        stringsAsFactors = default.stringsAsFactors()
+                                        )
+get_legend_from_NASISWebReport(areasymbol,
+                               droplevels = TRUE,
+                               stringsAsFactors = default.stringsAsFactors()
+                               )
+get_mapunit_from_NASISWebReport(areasymbol,
+                                droplevels = TRUE,
+                                stringsAsFactors = default.stringsAsFactors()
+                                )
+get_component_from_NASISWebReport(projectname,
+                                  stringsAsFactors = default.stringsAsFactors()
+                                  )
+get_chorizon_from_NASISWebReport(projectname, fill = FALSE,
+                                 stringsAsFactors = default.stringsAsFactors()
+                                 )
+get_cosoilmoist_from_NASISWebReport(projectname, impute = TRUE,
+                                    stringsAsFactors = default.stringsAsFactors()
+                                    )
+get_sitesoilmoist_from_NASISWebReport(usiteid)
+ +

Arguments

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
projectname

text string vector of project names to be inserted into a SQL WHERE clause (default: NA)

mlrassoarea

text string value identifying the MLRA soil survey office area symbol inserted into a SQL WHERE clause (default: NA)

fiscalyear

text string value identifying the fiscal year inserted into a SQL WHERE clause (default: NA)

projecttypename

text string value identifying the project type name inserted into a SQL WHERE clause (default: NA)

areasymbol

text string value identifying the area symbol (e.g. "IN001" or "IN%") inserted into a SQL WHERE clause (default: NA)

usiteid

text string value identifying the user site id inserted into a SQL WHERE clause (default: NA)

impute

replace missing (i.e. NULL) values with "Not_Populated" for categorical data, the "RV" for numeric data, or 201 cm if the "RV" is also NULL (default: TRUE)

fill

should rows with missing component ids be removed? (default: FALSE)

rmHzErrors

should pedons with horizonation errors be removed from the results? (default: FALSE)

stringsAsFactors

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

droplevels

logical: indicating whether to drop unused levels in classifying factors. This is useful when a class has a large number of unused levels, which can waste space in tables and figures.

+ +

Value

+ +

A data.frame or list with the results.

+ +

Examples

+
# \donttest{ + + +if ( + require("aqp") & + require("ggplot2") & + require("gridExtra") +) { + # query soil components by projectname + test = fetchNASISWebReport( + "EVAL - MLRA 111A - Ross silt loam, 0 to 2 percent slopes, frequently flooded" + ) + test = test$spc + + # profile plot + plot(test) + + # convert the data for depth plot + clay_slice = horizons(slice(test, 0:200 ~ claytotal_l + claytotal_r + claytotal_h)) + names(clay_slice) <- gsub("claytotal_", "", names(clay_slice)) + + om_slice = horizons(slice(test, 0:200 ~ om_l + om_r + om_h)) + names(om_slice) = gsub("om_", "", names(om_slice)) + + test2 = rbind(data.frame(clay_slice, var = "clay"), + data.frame(om_slice, var = "om") + ) + + h = merge(test2, site(test)[c("dmuiid", "coiid", "compname", "comppct_r")], + by = "coiid", + all.x = TRUE + ) + + # depth plot of clay content by soil component + gg_comp <- function(x) { + ggplot(x) + + geom_line(aes(y = r, x = hzdept_r)) + + geom_line(aes(y = r, x = hzdept_r)) + + geom_ribbon(aes(ymin = l, ymax = h, x = hzdept_r), alpha = 0.2) + + xlim(200, 0) + + xlab("depth (cm)") + + facet_grid(var ~ dmuiid + paste(compname, comppct_r)) + + coord_flip() + } + g1 <- gg_comp(subset(h, var == "clay")) + g2 <- gg_comp(subset(h, var == "om")) + + grid.arrange(g1, g2) + + + # query cosoilmoist (e.g. water table data) by mukey + # NA depths are interpreted as (???) 
with impute=TRUE argument + x <- get_cosoilmoist_from_NASISWebReport( + "EVAL - MLRA 111A - Ross silt loam, 0 to 2 percent slopes, frequently flooded" + ) + + ggplot(x, aes(x = as.integer(month), y = dept_r, lty = status)) + + geom_rect(aes(xmin = as.integer(month), xmax = as.integer(month) + 1, + ymin = 0, ymax = max(x$depb_r), + fill = flodfreqcl)) + + geom_line(cex = 1) + + geom_point() + + geom_ribbon(aes(ymin = dept_l, ymax = dept_h), alpha = 0.2) + + ylim(max(x$depb_r), 0) + + xlab("month") + ylab("depth (cm)") + + scale_x_continuous(breaks = 1:12, labels = month.abb, name="Month") + + facet_wrap(~ paste0(compname, ' (', comppct_r , ')')) + + ggtitle(paste0(x$nationalmusym[1], + ': Water Table Levels from Component Soil Moisture Month Data')) + + +}
#> Loading required package: ggplot2
#> Warning: package 'ggplot2' was built under R version 3.5.3
#> Loading required package: gridExtra
#> Warning: package 'gridExtra' was built under R version 3.5.3
#> Loading required namespace: rvest
#> getting project 'EVAL - MLRA 111A - Ross silt loam, 0 to 2 percent slopes, frequently flooded' from NasisReportsWebSite
#> guessing horizon designations are stored in `hzname`
#> Warning: Removed 49 rows containing missing values (geom_path).
#> Warning: Removed 49 rows containing missing values (geom_path).
#> Warning: Removed 49 rows containing missing values (geom_path).
#> Warning: Removed 49 rows containing missing values (geom_path).
+ + +# } +
+
+ +
+ + +
+ + +
+

Site built with pkgdown 1.4.1.

+
+ +
+
+ + + + + + + + diff --git a/docs/reference/fetchOSD-1.png b/docs/reference/fetchOSD-1.png new file mode 100644 index 00000000..92d07d1b Binary files /dev/null and b/docs/reference/fetchOSD-1.png differ diff --git a/docs/reference/fetchOSD-2.png b/docs/reference/fetchOSD-2.png new file mode 100644 index 00000000..5339a1f9 Binary files /dev/null and b/docs/reference/fetchOSD-2.png differ diff --git a/docs/reference/fetchOSD.html b/docs/reference/fetchOSD.html index 2cc25b1d..6a61fe68 100644 --- a/docs/reference/fetchOSD.html +++ b/docs/reference/fetchOSD.html @@ -8,21 +8,25 @@ Fetch Data by Soil Series Name — fetchOSD • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Fetch Data by Soil Series Name

-

This function fetches a variety of data associated with named soil series, extracted from the USDA-NRCS Official Series Description text files and detailed soil survey (SSURGO). These data are periodically updated and made available via SoilWeb.

-
fetchOSD(soils, colorState = 'moist', extended=FALSE)
- +

Arguments

@@ -126,7 +130,7 @@

Arg

if TRUE additional soil series summary data are returned, see details

- +

Details

The standard set of "site" and "horizon" data are returned as a SoilProfileCollection object (extended=FALSE). The "extended" suite of summary data can be requested by setting extended=TRUE. The resulting object will be a list with the following elements:

@@ -141,30 +145,27 @@

Details
mlra

empirical MLRA membership values, derived from the current SSURGO snapshot

climate

experimental climate summaries from PRISM stack

metadata

metadata associated with SoilWeb cached summaries

- -

Further details pending.

- + + + +

Further details pending.
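In the meantime, the extended elements can be inspected individually. The sketch below is illustrative (requires network access to SoilWeb); element names follow the list above and the str() output in the Examples:

```r
library(soilDB)

# request the extended suite of summaries for two series
x <- fetchOSD(c('amador', 'cecil'), extended = TRUE)

# basic morphology + taxonomy as a SoilProfileCollection
x$SPC

# competing series and experimental climate summaries
head(x$competing)
head(x$climate.annual)
```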

Value

a SoilProfileCollection object containing basic soil morphology and taxonomic information.

-

References

USDA-NRCS OSD search tools: http://www.nrcs.usda.gov/wps/portal/nrcs/detailfull/soils/home/?cid=nrcs142p2_053587

-

Note

SoilWeb maintains a snapshot of the Official Series Description data. Please use the link above for the live data.

-

See also

-

Examples

-
# NOT RUN {
+    
# \donttest{ # soils of interest -s.list <- c('musick', 'cecil', 'drummer', 'amador', 'pentz', +s.list <- c('musick', 'cecil', 'drummer', 'amador', 'pentz', 'reiff', 'san joaquin', 'montpellier', 'grangeville', 'pollasky', 'ramona') # fetch and convert data into an SPC @@ -173,35 +174,40 @@

Examp # plot profiles # moist soil colors -par(mar=c(0,0,0,0), mfrow=c(2,1)) -plot(s.moist, name='hzname', cex.names=0.85, axis.line.offset=-4) -plot(s.dry, name='hzname', cex.names=0.85, axis.line.offset=-4) - -# extended mode: return a list with SPC + summary tables -x <- fetchOSD(s.list, extended = TRUE, colorState = 'dry') - -par(mar=c(0,0,1,1)) -plot(x$SPC) - -str(x, 1) - -# }

+if(require("aqp")) { + + par(mar=c(0,0,0,0), mfrow=c(2,1)) + plot(s.moist, name='hzname', cex.names=0.85, axis.line.offset=-4) + plot(s.dry, name='hzname', cex.names=0.85, axis.line.offset=-4) + + # extended mode: return a list with SPC + summary tables + x <- fetchOSD(s.list, extended = TRUE, colorState = 'dry') + + par(mar=c(0,0,1,1)) + plot(x$SPC) +}
#> guessing horizon designations are stored in `hzname`
str(x, 1)
#> List of 11 +#> $ SPC :Formal class 'SoilProfileCollection' [package "aqp"] with 11 slots +#> $ competing :'data.frame': 84 obs. of 3 variables: +#> $ geomcomp :'data.frame': 11 obs. of 9 variables: +#> $ hillpos :'data.frame': 11 obs. of 8 variables: +#> $ mtnpos :'data.frame': 1 obs. of 9 variables: +#> $ pmkind :'data.frame': 18 obs. of 5 variables: +#> $ pmorigin :'data.frame': 33 obs. of 5 variables: +#> $ mlra :'data.frame': 52 obs. of 4 variables: +#> $ climate.annual :'data.frame': 88 obs. of 12 variables: +#> $ climate.monthly :'data.frame': 264 obs. of 14 variables: +#> $ soilweb.metadata:'data.frame': 13 obs. of 2 variables:
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchPedonPC.html b/docs/reference/fetchPedonPC.html index 0e01b1c8..7810a8bf 100644 --- a/docs/reference/fetchPedonPC.html +++ b/docs/reference/fetchPedonPC.html @@ -8,21 +8,25 @@ Fetch commonly used site/horizon data from a PedonPC v.5 database. — fetchPedonPC • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,14 +109,12 @@

Fetch commonly used site/horizon data from a PedonPC v.5 database.

-

Fetch commonly used site/horizon data from a version 5.x PedonPC database, returned as a SoilProfileCollection object.

-
fetchPedonPC(dsn)
 getHzErrorsPedonPC(dsn, strict=TRUE)
- +

Arguments

@@ -123,49 +127,48 @@

Arg

should horizonation be strictly enforced? (default: TRUE)

- +

Details

This function currently works only on Windows.

-

Value

a SoilProfileCollection class object

-

Note

This function attempts to do most of the boilerplate work when extracting site/horizon data from a PedonPC or local NASIS database. Pedons that have errors in their horizonation are excluded from the returned object; however, their IDs are printed on the console. See getHzErrorsPedonPC for a simple approach to identifying pedons with problematic horizonation. Records from the 'taxhistory' table are selected based on 1) most recent record, or 2) record with the least amount of missing data.

-

See also

-

Examples

-
# NOT RUN {
-# path to local PedonPC back-end DB
-dsn <- "S:/Service_Center/NRCS/pedon/pedon.accdb"
-
-# get routinely used soil data SoilProfileCollection object
-f <- fetchPedonPC(dsn)
-
-# plot only those profiles with densic contact
-plot(f[which(f$densic.contact), ], name='hzname')
-# }
+
# \donttest{ +if(require(aqp)) { + # path to local PedonPC back-end DB + dsn <- "S:/Service_Center/NRCS/pedon/pedon.accdb" + + if(file.exists(dsn)) { + # get routinely used soil data SoilProfileCollection object + f <- fetchPedonPC(dsn) + + # determine which profiles have densic contacts + idx <- which(f$densic.contact) + + # plot only those profiles with densic contact + if(length(idx)) + plot(f[idx, ], name='hzname') + } +} +# }
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchRaCA-1.png b/docs/reference/fetchRaCA-1.png new file mode 100644 index 00000000..2a4fed58 Binary files /dev/null and b/docs/reference/fetchRaCA-1.png differ diff --git a/docs/reference/fetchRaCA.html b/docs/reference/fetchRaCA.html index b0a60902..9e2b5275 100644 --- a/docs/reference/fetchRaCA.html +++ b/docs/reference/fetchRaCA.html @@ -8,21 +8,25 @@ Fetch KSSL Data (EXPERIMENTAL) — fetchRaCA • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Fetch KSSL Data (EXPERIMENTAL)

-

Get Rapid Carbon Assessment (RaCA) data via state, geographic bounding-box, RaCA site ID, or series query from the SoilWeb system.

-
fetchRaCA(series = NULL, bbox = NULL, state = NULL, rcasiteid = NULL, get.vnir = FALSE)
- +

Arguments

@@ -119,7 +123,7 @@

Arg

- + @@ -134,7 +138,7 @@

Arg

bbox

a bounding box in WGS84 geographic coordinates e.g. c(-120, 37, -122, 38), constrained to a 5-degree block

a bounding box in WGS84 geographic coordinates e.g. c(-120, 37, -122, 38), constrained to a 5-degree block

state

boolean, should associated VNIR spectra be downloaded? (see details)

- +

Value

@@ -144,58 +148,58 @@

Value

stock:

a data.frame object containing carbon quantities (stocks) at standardized depths

sample:

a data.frame object containing sample-level bulk density and soil organic carbon values

spectra:

a numeric matrix containing VNIR reflectance spectra from 350--2500 nm

-
+ + -

Details

The VNIR spectra associated with RaCA data are quite large [each gzip-compressed VNIR spectra record is about 6.6kb], so requests for these data are disabled by default. Note that VNIR spectra can only be queried by soil series or geographic BBOX.
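A sketch of an explicit VNIR request (requires network access; per the constraint above, only series or bounding-box queries support spectra):

```r
library(soilDB)

# explicitly request VNIR spectra: a larger download
s <- fetchRaCA(series = 'auburn', get.vnir = TRUE)

# spectra are returned as a numeric matrix, 350--2500 nm
if(!is.null(s$spectra))
  dim(s$spectra)
```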

-

References

http://www.nrcs.usda.gov/wps/portal/nrcs/detail/soils/survey/?cid=nrcs142p2_054164 fetchRaCA() Tutorial

-

See also

-

Examples

-
# NOT RUN {
-  # search by series name
-  s <- fetchRaCA(series='auburn')
-
-  # search by bounding-box
-  # s <- fetchRaCA(bbox=c(-120, 37, -122, 38))
-
-  # check structure
-  str(s, 1)
-
-  # extract pedons
-  p <- s$pedons
-
-  # how many pedons
-  length(p)
-
-  # plot 
-  par(mar=c(0,0,0,0))
-  plot(p, name='hzn_desgn', max.depth=150)
-# }
+
# \donttest{ +if(require(aqp)) { + # search by series name + s <- fetchRaCA(series='auburn') + + # search by bounding-box + # s <- fetchRaCA(bbox=c(-120, 37, -122, 38)) + + # check structure + str(s, 1) + + # extract pedons + p <- s$pedons + + # how many pedons + length(p) + + # plot + par(mar=c(0,0,0,0)) + plot(p, name='hzn_desgn', max.depth=150) +}
#> Site coordinates have been truncated to 2 decimal places, contact the National Soil Survey Center for more detailed coordinates.
#> Carbon concentration and stock values are probably wrong, or at least suspect. USE WITH CAUTION.
#> 4 RaCA sites loaded (0.06 Mb transferred)
#> List of 6 +#> $ pedons :Formal class 'SoilProfileCollection' [package "aqp"] with 11 slots +#> $ trees : NULL +#> $ veg :'data.frame': 4 obs. of 6 variables: +#> $ stock :'data.frame': 20 obs. of 13 variables: +#> $ sample :'data.frame': 76 obs. of 16 variables: +#> $ spectra: NULL
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchSCAN.html b/docs/reference/fetchSCAN.html index 2e0fa49b..07eaf128 100644 --- a/docs/reference/fetchSCAN.html +++ b/docs/reference/fetchSCAN.html @@ -8,21 +8,25 @@ Fetch SCAN Data — fetchSCAN • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,9 +109,7 @@

Fetch SCAN Data

-

Query soil/climate data from USDA-NRCS SCAN Stations (experimental)

-
# get SCAN data
@@ -116,7 +120,7 @@ 

Fetch SCAN Data

# get site metadata for one or more sites SCAN_site_metadata(site.code)
- +

Arguments

@@ -137,50 +141,128 @@

Arg

list of SCAN request parameters, for backwards-compatibility only

- +

Details

See the fetchSCAN tutorial for details. These functions require the `httr` and `rvest` libraries.
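Since `httr` and `rvest` are suggested rather than hard dependencies, a defensive call might first check that they are installed. This is a sketch only, and requires network access:

```r
library(soilDB)

# verify suggested packages are available before querying
if(requireNamespace('httr', quietly = TRUE) &&
   requireNamespace('rvest', quietly = TRUE)) {
  # site metadata for a single SCAN station
  m <- SCAN_site_metadata(site.code = 2072)
}
```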

-

Note

SCAN_sensor_metadata() is known to crash on 32bit R / libraries (Windows).

-

Value

a data.frame object

-

References

https://www.wcc.nrcs.usda.gov/index.html

-

Examples

-
# NOT RUN {
+    
# \donttest{ # get data: new interface -x <- fetchSCAN(site.code=c(356, 2072), year=c(2015, 2016)) -str(x) - +x <- fetchSCAN(site.code=c(356, 2072), year=c(2015, 2016))
#> 19336 records (0.83 Mb transferred)
str(x)
#> List of 14 +#> $ metadata:'data.frame': 2 obs. of 12 variables: +#> ..$ Name : chr [1:2] "Blue Lakes" "Eros Data Center" +#> ..$ Site : num [1:2] 356 2072 +#> ..$ State : chr [1:2] "California" "South Dakota" +#> ..$ Network : chr [1:2] "SNOTEL" "SCAN" +#> ..$ County : chr [1:2] "Alpine" "Minnehaha" +#> ..$ Elevation_ft : num [1:2] 8067 1602 +#> ..$ Latitude : num [1:2] 38.6 43.7 +#> ..$ Longitude : num [1:2] -119.9 -96.6 +#> ..$ HUC : chr [1:2] "180400120101" "101702031402" +#> ..$ climstanm : chr [1:2] NA "Eros Data Center" +#> ..$ upedonid : chr [1:2] NA "S2003SD099001" +#> ..$ pedlabsampnum: chr [1:2] NA "03N0688" +#> $ SMS :'data.frame': 5752 obs. of 7 variables: +#> ..$ Site : int [1:5752] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:5752], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:5752] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:5752] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:5752] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ depth : num [1:5752] 5 5 5 5 5 5 5 5 5 5 ... +#> ..$ sensor.id : Factor w/ 5 levels "SMS.I_2","SMS.I_8",..: 1 1 1 1 1 1 1 1 1 1 ... +#> $ STO :'data.frame': 5843 obs. of 7 variables: +#> ..$ Site : int [1:5843] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:5843], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:5843] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:5843] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:5843] 1.4 1.4 1.3 1.3 1.3 1.3 1.3 1.4 1.4 1.4 ... +#> ..$ depth : num [1:5843] 5 5 5 5 5 5 5 5 5 5 ... +#> ..$ sensor.id : Factor w/ 5 levels "STO.I_2","STO.I_8",..: 1 1 1 1 1 1 1 1 1 1 ... +#> $ SAL :'data.frame': 0 obs. of 0 variables +#> $ TAVG :'data.frame': 1460 obs. of 7 variables: +#> ..$ Site : int [1:1460] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:1460], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:1460] 2015 2015 2015 2015 2015 ... 
+#> ..$ water_day : int [1:1460] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:1460] -13.1 -6.3 -4.6 -1.5 0.5 2 3 2.2 1.2 2.2 ... +#> ..$ depth : num [1:1460] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "TAVG.D": 1 1 1 1 1 1 1 1 1 1 ... +#> $ TMIN :'data.frame': 1460 obs. of 7 variables: +#> ..$ Site : int [1:1460] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:1460], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:1460] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:1460] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:1460] -17 -10.9 -12.5 -8.1 -5.2 -4.2 -2.2 -3.8 -5.1 -1.4 ... +#> ..$ depth : num [1:1460] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "TMIN.D": 1 1 1 1 1 1 1 1 1 1 ... +#> $ TMAX :'data.frame': 1460 obs. of 7 variables: +#> ..$ Site : int [1:1460] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:1460], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:1460] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:1460] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:1460] -8.9 2 7 9.2 8.1 11.3 13.8 12.5 10.3 6.4 ... +#> ..$ depth : num [1:1460] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "TMAX.D": 1 1 1 1 1 1 1 1 1 1 ... +#> $ PRCP :'data.frame': 581 obs. of 7 variables: +#> ..$ Site : int [1:581] 2072 2072 2072 2072 2072 2072 2072 2072 2072 2072 ... +#> ..$ Date : Date[1:581], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:581] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:581] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:581] 0 0 0 0.08 0 0 0 0 0.23 0 ... +#> ..$ depth : num [1:581] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "PRCP.D": 1 1 1 1 1 1 1 1 1 1 ... +#> $ PREC :'data.frame': 1316 obs. of 7 variables: +#> ..$ Site : int [1:1316] 356 356 356 356 356 356 356 356 356 356 ... 
+#> ..$ Date : Date[1:1316], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:1316] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:1316] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:1316] 9.1 9.1 9.1 9.1 9.1 9.1 9.1 9.1 9.1 9.1 ... +#> ..$ depth : num [1:1316] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "PREC.I": 1 1 1 1 1 1 1 1 1 1 ... +#> $ SNWD :'data.frame': 731 obs. of 7 variables: +#> ..$ Site : int [1:731] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:731], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:731] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:731] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : int [1:731] 21 20 20 20 19 18 18 18 17 17 ... +#> ..$ depth : num [1:731] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "SNWD.I": 1 1 1 1 1 1 1 1 1 1 ... +#> $ WTEQ :'data.frame': 731 obs. of 7 variables: +#> ..$ Site : int [1:731] 356 356 356 356 356 356 356 356 356 356 ... +#> ..$ Date : Date[1:731], format: "2015-01-01" "2015-01-02" ... +#> ..$ water_year: num [1:731] 2015 2015 2015 2015 2015 ... +#> ..$ water_day : int [1:731] 93 94 95 96 97 98 99 100 101 102 ... +#> ..$ value : num [1:731] 6.1 6.1 6.1 6.1 6.1 6.1 6.1 6.1 6.1 6.1 ... +#> ..$ depth : num [1:731] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ sensor.id : Factor w/ 1 level "WTEQ.I": 1 1 1 1 1 1 1 1 1 1 ... +#> $ WDIRV :'data.frame': 0 obs. of 0 variables +#> $ WSPDV :'data.frame': 0 obs. of 0 variables +#> $ LRADT :'data.frame': 0 obs. of 0 variables
# get sensor metadata -m <- SCAN_sensor_metadata(site.code=c(356, 2072)) +m <- SCAN_sensor_metadata(site.code=c(356, 2072)) # get site metadata -m <- SCAN_site_metadata(site.code=c(356, 2072)) -# }
+m <- SCAN_site_metadata(site.code=c(356, 2072)) +# }
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchSDA_component-1.png b/docs/reference/fetchSDA_component-1.png new file mode 100644 index 00000000..bf29fe23 Binary files /dev/null and b/docs/reference/fetchSDA_component-1.png differ diff --git a/docs/reference/fetchSDA_component-2.png b/docs/reference/fetchSDA_component-2.png new file mode 100644 index 00000000..9cc70882 Binary files /dev/null and b/docs/reference/fetchSDA_component-2.png differ diff --git a/docs/reference/fetchSDA_component-3.png b/docs/reference/fetchSDA_component-3.png new file mode 100644 index 00000000..98f770f6 Binary files /dev/null and b/docs/reference/fetchSDA_component-3.png differ diff --git a/docs/reference/fetchSDA_component.html b/docs/reference/fetchSDA_component.html index 2bff9c44..95534bad 100644 --- a/docs/reference/fetchSDA_component.html +++ b/docs/reference/fetchSDA_component.html @@ -6,23 +6,27 @@ -Extract component tables from Soil Data Access — fetchSDA_component • soilDB +Download and Flatten Data from Soil Data Access — fetchSDA • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - - + + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,50 +97,47 @@ +
- -

Get, format, impute, and return component tables.

- +

Functions to download and flatten commonly used tables from Soil Data Access, and create soil profile collection objects (SPC).

-
fetchSDA_component(WHERE = NULL, duplicates = FALSE, childs = TRUE,
-                   nullFragsAreZero = TRUE, rmHzErrors = FALSE,
-                   drop.unused.levels = TRUE,
-                   stringsAsFactors = default.stringsAsFactors()
-                   )
+    
fetchSDA(WHERE = NULL, duplicates = FALSE, childs = TRUE,
+         nullFragsAreZero = TRUE, rmHzErrors = FALSE,
+         droplevels = TRUE,
+         stringsAsFactors = default.stringsAsFactors()
+         )
 
 get_mapunit_from_SDA(WHERE = NULL,
-                     drop.unused.levels = TRUE,
-                     stringsAsFactors = default.stringsAsFactors()
+                     droplevels = TRUE,
+                     stringsAsFactors = default.stringsAsFactors()
                      )
 
 get_component_from_SDA(WHERE = NULL, duplicates = FALSE, childs = TRUE,
-                       drop.unused.levels = TRUE,
-                       stringsAsFactors = default.stringsAsFactors()
+                       droplevels = TRUE,
+                       stringsAsFactors = default.stringsAsFactors()
                        )
 
 get_chorizon_from_SDA(WHERE = NULL, duplicates = FALSE, childs = TRUE,
                       nullFragsAreZero = TRUE,
-                      drop.unused.levels = TRUE,
-                      stringsAsFactors = default.stringsAsFactors()
+                      droplevels = TRUE,
+                      stringsAsFactors = default.stringsAsFactors()
                       )
 
 get_cosoilmoist_from_SDA(WHERE = NULL, duplicates = FALSE, impute = TRUE,
-                         stringsAsFactors = default.stringsAsFactors()
-                         )
+                         stringsAsFactors = default.stringsAsFactors()
+                         )
-get_cosoilmoist_from_NASIS(impute = TRUE, stringsAsFactors = default.stringsAsFactors())
-

Arguments

@@ -145,7 +147,7 @@

Arg

- + @@ -164,7 +166,7 @@

Arg

- + @@ -172,129 +174,140 @@

Arg

duplicates

logical; if TRUE duplicate nationalmusym are returned

logical; if TRUE a record is returned for each unique mukey (may be many per nationalmusym)

childs

should pedons with horizonation errors be removed from the results? (default: FALSE)

drop.unused.levelsdroplevels

logical: indicating whether to drop unused levels in classifying factors. This is useful when a factor has a large number of unused levels, which can waste space in tables and figures.

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Details

-

The SDA functions can get and fetch data with an internet connection and a WHERE clause.

+

These functions return data from Soil Data Access via a simple text string formatted as an SQL WHERE clause (e.g. WHERE = "areasymbol = 'IN001'"). All functions are SQL queries that wrap around SDA_query() and format the data for analysis.

+

Beware: SDA includes data for both SSURGO and STATSGO2. The areasymbol for STATSGO2 is 'US'. Therefore, if only SSURGO data are desired, set WHERE = "areasymbol != 'US'".

If the duplicates argument is set to TRUE, duplicate components are returned. This is not necessary with data returned from NASIS, which stores one unique national map unit. SDA stores duplicate national map units, one for each legend in which they occur.
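For illustration, a minimal sketch (assuming network access to SDA) combining these points — excluding the STATSGO2 legend and controlling how duplicate national map units are returned:

```r
library(soilDB)

# query components for a series, excluding the STATSGO2 legend (areasymbol 'US')
co <- get_component_from_SDA(WHERE = "compname = 'Miami' AND areasymbol != 'US'")

# duplicates = FALSE (the default) returns one record per nationalmusym;
# duplicates = TRUE returns a record for every mukey a component occurs in
co.dups <- get_component_from_SDA(WHERE = "compname = 'Miami' AND areasymbol != 'US'",
                                  duplicates = TRUE)
```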

-

The function get_cosoilmoist_from_NASIS_db() works only on Windows, and requires a 'nasis_local' ODBC connection. See the NASIS ODBC Setup tutorial for instructions.

-

The value of nullFragsAreZero will have a significant impact on the rock fragment fractions returned by fetchSDA_component. Set nullFragsAreZero = FALSE in those cases where there are many data-gaps and NULL rock fragment values should be interpretated as NULLs. Set nullFragsAreZero = TRUE in those cases where NULL rock fragment values should be interpreted as 0.

- +

The value of nullFragsAreZero will have a significant impact on the rock fragment fractions returned by fetchSDA. Set nullFragsAreZero = FALSE in those cases where there are many data-gaps and NULL rock fragment values should be interpreted as NULLs. Set nullFragsAreZero = TRUE in those cases where NULL rock fragment values should be interpreted as 0.

Value

-

A dataframe or list with the results.

- +

A dataframe or soil profile collection object.

+

See also

+ +

Examples

-
# NOT RUN {
-library(soilDB)
-library(ggplot2)
-library(gridExtra)
-
-# query soil components by areasymbol and musym
-test = fetchSDA_component(WHERE = "areasymbol = 'IN005' AND musym = 'MnpB2'")
-
-# profile plot
-plot(test)
-
-# convert the data for depth plot
-clay_slice = horizons(slice(test, 0:200 ~ claytotal_l + claytotal_r + claytotal_h))
-names(clay_slice) <- gsub("claytotal_", "", names(clay_slice))
-
-om_slice = horizons(slice(test, 0:200 ~ om_l + om_r + om_h))
-names(om_slice) = gsub("om_", "", names(om_slice))
-
-test2 = rbind(data.frame(clay_slice, var = "clay"),
-              data.frame(om_slice, var = "om")
-              )
-
-h = merge(test2, site(test)[c("nationalmusym", "cokey", "compname", "comppct_r")],
-          by = "cokey",
-          all.x = TRUE
-          )
-
-# depth plot of clay content by soil component
-gg_comp <- function(x) {
-  ggplot(x) +
-  geom_line(aes(y = r, x = hzdept_r)) +
-  geom_line(aes(y = r, x = hzdept_r)) +
-  geom_ribbon(aes(ymin = l, ymax = h, x = hzdept_r), alpha = 0.2) +
-  xlim(200, 0) +
-  xlab("depth (cm)") +
-  facet_grid(var ~ nationalmusym + paste(compname, comppct_r)) +
-  coord_flip()
+    
# \donttest{ + + +if ( + require(aqp) & + require("ggplot2") & + require("gridExtra") & + require("viridis") +) { + + # query soil components by areasymbol and musym + test = fetchSDA(WHERE = "areasymbol = 'IN005' AND musym = 'MnpB2'") + + + # profile plot + plot(test) + + + # convert the data for depth plot + clay_slice = horizons(slice(test, 0:200 ~ claytotal_l + claytotal_r + claytotal_h)) + names(clay_slice) <- gsub("claytotal_", "", names(clay_slice)) + + om_slice = horizons(slice(test, 0:200 ~ om_l + om_r + om_h)) + names(om_slice) = gsub("om_", "", names(om_slice)) + + test2 = rbind(data.frame(clay_slice, var = "clay"), + data.frame(om_slice, var = "om") + ) + + h = merge(test2, site(test)[c("nationalmusym", "cokey", "compname", "comppct_r")], + by = "cokey", + all.x = TRUE + ) + + # depth plot of clay content by soil component + gg_comp <- function(x) { + ggplot(x) + + geom_line(aes(y = r, x = hzdept_r)) + + geom_line(aes(y = r, x = hzdept_r)) + + geom_ribbon(aes(ymin = l, ymax = h, x = hzdept_r), alpha = 0.2) + + xlim(200, 0) + + xlab("depth (cm)") + + facet_grid(var ~ nationalmusym + paste(compname, comppct_r)) + + coord_flip() } -g1 <- gg_comp(subset(h, var == "clay")) -g2 <- gg_comp(subset(h, var == "om")) - -grid.arrange(g1, g2) - - -# query cosoilmoist (e.g. water table data) by mukey -# NA depths are interpreted as (???) 
with impute=TRUE argument -x <- get_cosoilmoist_from_SDA(WHERE = "mukey = '1395352'", impute = TRUE) - -ggplot(x, aes(x = as.integer(month), y = dept_r, lty = status)) + - geom_rect(aes(xmin = as.integer(month), xmax = as.integer(month) + 1, - ymin = 0, ymax = max(x$depb_r), - fill = flodfreqcl)) + - geom_line(cex = 1) + - geom_point() + - geom_ribbon(aes(ymin = dept_l, ymax = dept_h), alpha = 0.2) + - ylim(max(x$depb_r), 0) + - xlab("month") + ylab("depth (cm)") + - scale_x_continuous(breaks = 1:12, labels = month.abb, name="Month") + - facet_wrap(~ paste0(compname, ' (', comppct_r , ')')) + - ggtitle(paste0(x$nationalmusym[1], - ': Water Table Levels from Component Soil Moisture Month Data')) - - - -# query all Miami major components -s <- get_component_from_SDA(WHERE = "compname = 'Miami' AND majcompflag = 'Yes'") - -# landform vs 3-D morphometry -test <- { - subset(s, ! is.na(landform) | ! is.na(geompos)) ->.; - split(., .$drainagecl, drop = TRUE) ->.; - lapply(., function(x) { - test = data.frame() - test = as.data.frame(table(x$landform, x$geompos)) - test$compname = x$compname[1] - test$drainagecl = x$drainagecl[1] - names(test)[1:2] <- c("landform", "geompos") - return(test) + g1 <- gg_comp(subset(h, var == "clay")) + g2 <- gg_comp(subset(h, var == "om")) + + grid.arrange(g1, g2) + + + # query cosoilmoist (e.g. 
water table data) by mukey + x <- get_cosoilmoist_from_SDA(WHERE = "mukey = '1395352'") + + ggplot(x, aes(x = as.integer(month), y = dept_r, lty = status)) + + geom_rect(aes(xmin = as.integer(month), xmax = as.integer(month) + 1, + ymin = 0, ymax = max(x$depb_r), + fill = flodfreqcl)) + + geom_line(cex = 1) + + geom_point() + + geom_ribbon(aes(ymin = dept_l, ymax = dept_h), alpha = 0.2) + + ylim(max(x$depb_r), 0) + + xlab("month") + ylab("depth (cm)") + + scale_x_continuous(breaks = 1:12, labels = month.abb, name="Month") + + facet_wrap(~ paste0(compname, ' (', comppct_r , ')')) + + ggtitle(paste0(x$nationalmusym[1], + ': Water Table Levels from Component Soil Moisture Month Data')) + + + + # query all Miami major components + s <- get_component_from_SDA(WHERE = "compname = 'Miami' \n + AND majcompflag = 'Yes' AND areasymbol != 'US'") + + + # landform vs 3-D morphometry + test <- { + subset(s, ! is.na(landform) | ! is.na(geompos)) ->.; + split(., .$drainagecl, drop = TRUE) ->.; + lapply(., function(x) { + test = data.frame() + test = as.data.frame(table(x$landform, x$geompos)) + test$compname = x$compname[1] + test$drainagecl = x$drainagecl[1] + names(test)[1:2] <- c("landform", "geompos") + return(test) + }) ->.; + do.call("rbind", .) ->.; + .[.$Freq > 0, ] ->.; + within(., { + landform = reorder(factor(landform), Freq, max) + geompos = reorder(factor(geompos), Freq, max) + geompos = factor(geompos, levels = rev(levels(geompos))) }) ->.; - do.call("rbind", .) 
->.; - .[.$Freq > 0, ] ->.; - within(., { - landform = reorder(factor(landform), Freq, max) - geompos = reorder(factor(geompos), Freq, max) - geompos = factor(geompos, levels = rev(levels(geompos))) - }) ->.; } -test$Freq2 <- cut(test$Freq, - breaks = c(0, 5, 10, 25, 50, 100, 150), - labels = c("<5", "5-10", "10-25", "25-50", "50-100", "100-150") - ) -ggplot(test, aes(x = geompos, y = landform, fill = Freq2)) + - geom_tile(alpha = 0.5) + facet_wrap(~ paste0(compname, "\n", drainagecl)) + - viridis::scale_fill_viridis(discrete = T) + - theme(aspect.ratio = 1, axis.text.x = element_text(angle = 45, hjust = 1, vjust = 1)) + - ggtitle("Landform vs 3-D Morphometry for Miami Major Components on SDA") - - -# }
+ test$Freq2 <- cut(test$Freq, + breaks = c(0, 5, 10, 25, 50, 100, 150), + labels = c("<5", "5-10", "10-25", "25-50", "50-100", "100-150") + ) + ggplot(test, aes(x = geompos, y = landform, fill = Freq2)) + + geom_tile(alpha = 0.5) + facet_wrap(~ paste0(compname, "\n", drainagecl)) + + scale_fill_viridis(discrete = TRUE) + + theme(aspect.ratio = 1, axis.text.x = element_text(angle = 45, hjust = 1, vjust = 1)) + + ggtitle("Landform vs 3-D Morphometry for Miami Major Components on SDA") + + +}
#> Loading required package: viridis
#> Warning: package 'viridis' was built under R version 3.5.3
#> Loading required package: viridisLite
#> Warning: package 'viridisLite' was built under R version 3.5.3
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> guessing horizon designations are stored in `hzname`
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
+ + +# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/fetchSDA_spatial-1.png b/docs/reference/fetchSDA_spatial-1.png new file mode 100644 index 00000000..cebec5ec Binary files /dev/null and b/docs/reference/fetchSDA_spatial-1.png differ diff --git a/docs/reference/fetchSDA_spatial.html b/docs/reference/fetchSDA_spatial.html new file mode 100644 index 00000000..faf74c0b --- /dev/null +++ b/docs/reference/fetchSDA_spatial.html @@ -0,0 +1,210 @@ + + + + + + + + +Query SDA and Return Spatial Data — fetchSDA_spatial • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

This is a high-level fetch method that facilitates making spatial queries to Soil Data Access (SDA) based on `mukey` or `nationalmusym`. A typical SDA spatial query is made, returning geometry and key identifying information about the mapunit. Additional columns from the mapunit table can be included using the `add.fields` argument.

+

This function automatically "chunks" the input vector of mapunit identifiers (using `soilDB::makeChunks`) to minimize the likelihood of exceeding the SDA data request size. The number of chunks varies with the `chunk.size` setting and the length of your input vector. If you are working with many mapunits and/or large extents, you may need to decrease `chunk.size` in order to produce more chunks.
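As a rough sketch of the chunking step (assuming, as hinted above, that `makeChunks` returns a chunk index for each element of the input vector — each group then becomes one SDA request):

```r
library(soilDB)

# hypothetical vector of mapunit keys
mukeys <- c("2924882", "2924883", "2924884", "2924885", "2924885")

# assign a chunk index to each identifier
chunk.idx <- makeChunks(mukeys, size = 2)

# inspect how identifiers would be grouped into separate requests
split(mukeys, chunk.idx)
```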

+
+ +
fetchSDA_spatial(
+  x,
+  by.col = "mukey",
+  method = "feature",
+  add.fields = NULL,
+  chunk.size = 10
+)
+ +

Arguments

+ + + + + + + + + + + + + + + + + + + + + + +
x

A vector of MUKEYs or national mapunit symbols.

by.col

Column name containing mapunit identifier ("mukey" or "nmusym"); default: "mukey"

method

geometry result type: 'feature' returns polygons, 'bbox' returns the bounding box of each polygon, and 'point' returns a single point within each polygon.

add.fields

Column names from `mapunit` table to add to result. Must specify table name prefix as either `G` or `mapunit`.

chunk.size

How many queries should the spatial request be divided into? Necessary for large extents. Default: 10

+ +

Value

+ +

A SpatialPolygonsDataFrame corresponding to SDA spatial data for all MUKEYs / nmusyms requested. The default result contains MupolygonWktWgs84-derived geometry with an attribute table containing `gid`, `mukey`, and `nationalmusym`; additional fields in the result are specified with `add.fields`.

+ +

Examples

+
# \donttest{ +# get spatial data for a single mukey +single.mukey <- fetchSDA_spatial(x = "2924882") + +# demonstrate fetching full extent (multi-mukey) of national musym +full.extent.nmusym <- fetchSDA_spatial(x = "2x8l5", by = "nmusym") + +# compare extent of nmusym to single mukey within it +if(require(sp)) { + plot(full.extent.nmusym, col = "RED",border=0) + plot(single.mukey, add = TRUE, col = "BLUE", border=0) +}
#> Loading required package: sp
#> Warning: package 'sp' was built under R version 3.5.3
+# demo adding a field (`muname`) to attribute table of result +head(fetchSDA_spatial(x = "2x8l5", by="nmusym", add.fields="muname"))
#> class : SpatialPolygonsDataFrame +#> features : 6 +#> extent : -121.034, -120.9596, 38.01706, 38.24938 (xmin, xmax, ymin, ymax) +#> crs : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0 +#> variables : 4 +#> names : gid, mukey, nationalmusym, muname +#> min values : 1, 462101, 2x8l5, Pentz-Bellota complex, 2 to 15 percent slopes +#> max values : 6, 462101, 2x8l5, Pentz-Bellota complex, 2 to 15 percent slopes
# } +
+
+ +
+ + +
+ + +
+

Site built with pkgdown 1.4.1.

+
+ +
+
+ + + + + + + + diff --git a/docs/reference/get_colors_from_NASIS_db.html b/docs/reference/get_colors_from_NASIS_db.html index 698e4a4f..8b7357bd 100644 --- a/docs/reference/get_colors_from_NASIS_db.html +++ b/docs/reference/get_colors_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract Soil Color Data from a local NASIS Database — get_colors_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Soil Color Data from a local NASIS Database

-

Get, format, mix, and return color data from a NASIS database.

-
get_colors_from_NASIS_db(SS = TRUE)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

fetch data from Selected Set in NASIS or from the entire local database (default: TRUE)

- +

Details

This function currently works only on Windows.

-

Value

A dataframe with the results.

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_colors_from_pedon_db.html b/docs/reference/get_colors_from_pedon_db.html index 876f75bb..4f9cf38b 100644 --- a/docs/reference/get_colors_from_pedon_db.html +++ b/docs/reference/get_colors_from_pedon_db.html @@ -8,21 +8,25 @@ Extract Soil Color Data from a PedonPC Database — get_colors_from_pedon_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Soil Color Data from a PedonPC Database

-

Get, format, mix, and return color data from a PedonPC database.

-
get_colors_from_pedon_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The path to a 'pedon.mdb' database.

- +

Details

This function currently works only on Windows.

-

Value

A dataframe with the results.

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_comonth_from_NASIS_db.html b/docs/reference/get_comonth_from_NASIS_db.html index bec295b7..8044b1d1 100644 --- a/docs/reference/get_comonth_from_NASIS_db.html +++ b/docs/reference/get_comonth_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract component month data from a local NASIS Database — get_comonth_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,15 +109,13 @@

Extract component month data from a local NASIS Database

-

Extract component month data from a local NASIS Database.

-
get_comonth_from_NASIS_db(SS = TRUE, fill = FALSE,
-                          stringsAsFactors = default.stringsAsFactors()
+                          stringsAsFactors = default.stringsAsFactors()
                           )
- +

Arguments

@@ -128,41 +132,37 @@

Arg

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Details

This function currently works only on Windows.

-

Value

A list with the results.

-

See also

- - +

Examples

-
# NOT RUN {
+    
# \donttest{ # query text note data -cm <- get_comonth_from_NASIS_db() - +cm <- try(get_comonth_from_NASIS_db())
#> Error in `$<-.data.frame`(`*tmp*`, "month", value = NA_character_) : +#> replacement has 1 row, data has 0
# show structure of component month data -str(cm) - -# }
+str(cm)
#> 'try-error' chr "Error in `$<-.data.frame`(`*tmp*`, \"month\", value = NA_character_) : \n replacement has 1 row, data has 0\n" +#> - attr(*, "condition")=List of 2 +#> ..$ message: chr "replacement has 1 row, data has 0" +#> ..$ call : language `$<-.data.frame`(`*tmp*`, "month", value = NA_character_) +#> ..- attr(*, "class")= chr [1:3] "simpleError" "error" "condition"
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_component_data_from_NASIS_db.html b/docs/reference/get_component_data_from_NASIS_db.html index 2fd41eae..070d2b97 100644 --- a/docs/reference/get_component_data_from_NASIS_db.html +++ b/docs/reference/get_component_data_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract component data from a local NASIS Database — get_component_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,12 @@

Extract component data from a local NASIS Database

-

Extract component data from a local NASIS Database.

-
-
get_component_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
- +
get_component_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
+get_component_restrictions_from_NASIS_db(SS = TRUE)
+

Arguments

@@ -122,41 +127,90 @@

Arg

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Details

This function currently works only on Windows.

-

Value

A list with the results.

-

See also

- - +

Examples

-
# NOT RUN {
+    
# \donttest{ # query text note data -fc <- get_component_data_from_NASIS_db() +fc <- try(get_component_data_from_NASIS_db()) # show structure of component data returned -str(fc) - -# }
+str(fc)
#> 'data.frame': 0 obs. of 57 variables: +#> $ dmudesc : chr +#> $ compname : chr +#> $ comppct_r : int +#> $ compkind : int +#> $ majcompflag : chr +#> $ localphase : chr +#> $ drainagecl : int +#> $ hydricrating : int +#> $ elev_l : num +#> $ elev_r : num +#> $ elev_h : num +#> $ slope_l : num +#> $ slope_r : num +#> $ slope_h : num +#> $ aspectccwise : int +#> $ aspectrep : int +#> $ aspectcwise : int +#> $ map_l : int +#> $ map_r : int +#> $ map_h : int +#> $ maat_l : num +#> $ maat_r : num +#> $ maat_h : num +#> $ mast_r : num +#> $ reannualprecip_r: int +#> $ ffd_l : int +#> $ ffd_r : int +#> $ ffd_h : int +#> $ tfact : int +#> $ wei : int +#> $ weg : int +#> $ nirrcapcl : int +#> $ nirrcapscl : int +#> $ nirrcapunit : int +#> $ irrcapcl : int +#> $ irrcapscl : int +#> $ irrcapunit : int +#> $ frostact : int +#> $ hydricrating.1 : int +#> $ hydgrp : int +#> $ corcon : int +#> $ corsteel : int +#> $ taxclname : chr +#> $ taxorder : int +#> $ taxsuborder : int +#> $ taxgrtgroup : int +#> $ taxsubgrp : int +#> $ taxpartsize : int +#> $ taxpartsizemod : int +#> $ taxceactcl : int +#> $ taxreaction : int +#> $ taxtempcl : int +#> $ taxmoistscl : int +#> $ taxtempregime : int +#> $ soiltaxedition : int +#> $ coiid : int +#> $ dmuiid : int
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_cosoilmoist_from_NASIS.html b/docs/reference/get_cosoilmoist_from_NASIS.html new file mode 100644 index 00000000..3c721279 --- /dev/null +++ b/docs/reference/get_cosoilmoist_from_NASIS.html @@ -0,0 +1,192 @@ + + + + + + + + +Read and Flatten the Component Soil Moisture Tables — get_cosoilmoist_from_NASIS • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

Read and flatten the component soil moisture month tables from a local NASIS Database.

+
+ +
get_cosoilmoist_from_NASIS(impute = TRUE, stringsAsFactors = default.stringsAsFactors())
+ +

Arguments

+ + + + + + + + + + +
impute

replace missing (i.e. NULL) values with "Not_Populated" for categorical data, with the "RV" for numeric data, or with 201 cm if the "RV" is also NULL (default: TRUE)

stringsAsFactors

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

+ +

Value

+ +

A dataframe.

+

Details

+ +

The component soil moisture tables within NASIS house monthly data on flooding, ponding, and soil moisture status. The soil moisture status is used to specify the water table depth for components (e.g. status == "Moist").
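For example, the representative water table depth for each component during months with a "Moist" status could be sketched as (assuming a local NASIS connection with populated component soil moisture tables; column names follow the example output below):

```r
library(soilDB)

# read and flatten the component soil moisture month tables
x <- get_cosoilmoist_from_NASIS()

# subset records where the soil moisture status indicates a water table
wt <- subset(x, status == "Moist")

# shallowest representative water table depth by component and month
aggregate(dept_r ~ compname + month, data = wt, FUN = min)
```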

+

Note

+ +

This function currently works only on Windows.

+

See also

+ + + +

Examples

+
# \donttest{ +# load cosoilmoist (e.g. water table data) +test <- try(get_cosoilmoist_from_NASIS()) + +# inspect +if(!inherits(test, 'try-error')) { + head(test) +}
#> [1] dmuiid coiid compname comppct_r drainagecl +#> [6] month flodfreqcl pondfreqcl cosoilmoistiid dept_l +#> [11] dept_r dept_h depb_l depb_r depb_h +#> [16] status +#> <0 rows> (or 0-length row.names)
# }
+
+ +
+ + +
+ + +
+

Site built with pkgdown 1.4.1.

+
+ +
+
+ + + + + + + + diff --git a/docs/reference/get_extended_data_from_NASIS.html b/docs/reference/get_extended_data_from_NASIS.html index e396fa5e..d473faef 100644 --- a/docs/reference/get_extended_data_from_NASIS.html +++ b/docs/reference/get_extended_data_from_NASIS.html @@ -8,21 +8,25 @@ Extract accessory tables and summaries from a local NASIS Database — get_extended_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,15 +109,13 @@

Extract accessory tables and summaries from a local NASIS Database

-

Extract accessory tables and summaries from a local NASIS Database.

-
get_extended_data_from_NASIS_db(SS = TRUE, nullFragsAreZero = TRUE,
-                                stringsAsFactors = default.stringsAsFactors()
+                                stringsAsFactors = default.stringsAsFactors()
                                 )
- +

Arguments

@@ -128,40 +132,218 @@

Arg

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Details

This function currently works only on Windows.

-

Value

A list with the results.

-

See also

-

Examples

-
# NOT RUN {
+    
# \donttest{ # query extended data -e <- get_extended_data_from_NASIS_db() - +e <- try(get_extended_data_from_NASIS_db())
#> Warning: some records are missing rock fragment volume, these have been removed
#> -> QC: some fragsize_h values == 76mm, may be mis-classified as cobbles [91 / 3505 records]
#> Warning: some records are missing artifact volume, these have been removed
#> Warning: all records are missing artifact volume (NULL). buffering result with NA. will be converted to zero if nullFragsAreZero = TRUE.
# show contents of extended data -str(e) -# }
+str(e)
#> List of 15 +#> $ ecositehistory :'data.frame': 0 obs. of 5 variables: +#> ..$ siteiid : int(0) +#> ..$ ecositeid : chr(0) +#> ..$ ecositenm : chr(0) +#> ..$ ecositecorrdate: chr(0) +#> ..$ es_classifier : chr(0) +#> $ diagnostic :'data.frame': 750 obs. of 4 variables: +#> ..$ peiid : int [1:750] 35404 35404 35404 75620 75620 75857 75857 75857 76262 109251 ... +#> ..$ featkind: Factor w/ 84 levels "anthropic epipedon",..: 20 23 32 23 32 23 32 50 32 23 ... +#> ..$ featdept: int [1:750] 0 23 36 0 23 0 41 119 18 0 ... +#> ..$ featdepb: int [1:750] 36 203 178 23 183 18 165 165 183 18 ... +#> $ diagHzBoolean :'data.frame': 293 obs. of 21 variables: +#> ..$ peiid : int [1:293] 35404 75620 75857 76262 109251 111458 113566 115627 121340 122973 ... +#> ..$ mollic.epipedon : logi [1:293] TRUE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ ochric.epipedon : logi [1:293] TRUE TRUE TRUE FALSE TRUE TRUE ... +#> ..$ argillic.horizon : logi [1:293] TRUE TRUE TRUE TRUE TRUE TRUE ... +#> ..$ mottles.with.chroma.2.or.less : logi [1:293] FALSE FALSE TRUE FALSE FALSE FALSE ... +#> ..$ kandic.horizon : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ fragipan : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ fragic.soil.properties : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ lithologic.discontinuity : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ densic.contact : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ cambic.horizon : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ densic.materials : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ endosaturation : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ aquic.conditions : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ paralithic.contact : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ reduced.matrix : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... 
+#> ..$ redox.concentrations : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ lithic.contact : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ redox.depletions.with.chroma.2.or.less : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ free.carbonates : logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> ..$ strongly.contrasting.particle.size.class: logi [1:293] FALSE FALSE FALSE FALSE FALSE FALSE ... +#> $ restriction :'data.frame': 69 obs. of 8 variables: +#> ..$ peiid : int [1:69] 114228 206244 215676 215684 216189 216552 217527 217533 217649 217672 ... +#> ..$ resdept : int [1:69] NA 40 173 103 182 150 145 163 165 167 ... +#> ..$ resdepb : int [1:69] NA 80 203 203 203 203 203 203 203 203 ... +#> ..$ resthk_l: int [1:69] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ resthk_r: int [1:69] NA NA 30 100 100 100 100 100 100 100 ... +#> ..$ resthk_h: int [1:69] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ reskind : Factor w/ 21 levels "abrupt textural change",..: NA 3 1 1 1 1 1 1 1 1 ... +#> ..$ reshard : Factor w/ 14 levels "noncemented",..: NA NA 1 1 1 1 1 1 1 1 ... +#> $ frag_summary :'data.frame': 3427 obs. of 18 variables: +#> ..$ phiid : int [1:3427] 160463 160464 160465 160466 160467 160468 160487 160488 160489 160490 ... +#> ..$ fine_gravel : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ gravel : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ cobbles : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ stones : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ boulders : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ channers : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ flagstones : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ parafine_gravel : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paragravel : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paracobbles : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ parastones : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paraboulders : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... 
+#> ..$ parachanners : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paraflagstones : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ unspecified : num [1:3427] 0 0 0 0 0 0 0 10 10 10 ... +#> ..$ total_frags_pct_nopf: num [1:3427] 0 0 0 0 0 0 0 10 10 10 ... +#> ..$ total_frags_pct : num [1:3427] 0 0 0 0 0 0 0 10 10 10 ... +#> $ frag_summary_v2 :'data.frame': 3440 obs. of 18 variables: +#> ..$ phiid : int [1:3440] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ fine_gravel : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ gravel : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ cobbles : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ stones : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ boulders : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ channers : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ flagstones : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ parafine_gravel : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paragravel : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paracobbles : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ parastones : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paraboulders : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ parachanners : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ paraflagstones : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ unspecified : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ total_frags_pct_nopf: num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ total_frags_pct : num [1:3440] 0 0 0 0 0 0 0 0 0 0 ... +#> $ art_summary :'data.frame': 3427 obs. of 14 variables: +#> ..$ phiid : int [1:3427] 160463 160464 160465 160466 160467 160468 160487 160488 160489 160490 ... +#> ..$ art_fgr : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_gr : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_cb : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_st : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_by : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_ch : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ art_fl : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... 
+#> ..$ art_unspecified : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ total_art_pct : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ huartvol_cohesive : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ huartvol_penetrable: num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ huartvol_innocuous : num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ huartvol_persistent: num [1:3427] 0 0 0 0 0 0 0 0 0 0 ... +#> $ surf_frag_summary:'data.frame': 611 obs. of 10 variables: +#> ..$ peiid : int [1:611] 35313 35317 35318 35404 36266 36331 36332 37061 38225 38226 ... +#> ..$ surface_fgravel : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_gravel : num [1:611] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ surface_cobbles : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_stones : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_boulders : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_channers : num [1:611] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ surface_flagstones : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_paragravel : logi [1:611] NA NA NA NA NA NA ... +#> ..$ surface_paracobbles: logi [1:611] NA NA NA NA NA NA ... +#> $ texmodifier :'data.frame': 3624 obs. of 5 variables: +#> ..$ peiid : int [1:3624] 35313 35313 35313 35313 35313 35313 35313 35313 35317 35317 ... +#> ..$ phiid : int [1:3624] 160463 160464 160465 160466 160467 160468 160468 160468 160491 160491 ... +#> ..$ phtiid: int [1:3624] 162001 162002 162003 162004 162005 162006 162007 162008 162023 162024 ... +#> ..$ seqnum: int [1:3624] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ texmod: Factor w/ 93 levels "by","byv","byx",..: NA NA NA NA NA NA NA NA NA NA ... +#> $ geomorph :'data.frame': 712 obs. of 7 variables: +#> ..$ peiid : int [1:712] 35404 36266 36331 36332 36332 37061 37061 38225 38225 38226 ... +#> ..$ geomfmod : chr [1:712] NA NA NA NA ... +#> ..$ geomfname : chr [1:712] "outwash plain" "outwash plain" "outwash plain" "ground moraine" ... +#> ..$ geomfeatid : int [1:712] NA NA NA NA NA NA NA NA NA NA ... 
+#> ..$ existsonfeat: int [1:712] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ geomfiidref : int [1:712] 86 86 86 55 335 334 86 86 334 334 ... +#> ..$ geomftname : chr [1:712] "landform" "landform" "landform" "landform" ... +#> $ taxhistory :'data.frame': 1064 obs. of 20 variables: +#> ..$ peiid : int [1:1064] 35313 35313 35317 35317 35318 35318 35404 36266 36266 36331 ... +#> ..$ classdate : POSIXct[1:1064], format: "2000-04-05 00:00:00" "2012-04-06 17:11:07" ... +#> ..$ classifier : chr [1:1064] NA NA NA NA ... +#> ..$ classtype : chr [1:1064] "sampled as" "correlated" "correlated" "sampled as" ... +#> ..$ taxonname : chr [1:1064] "Drummer" "Drummer" "Drummer" "Drummer" ... +#> ..$ localphase : chr [1:1064] NA NA NA NA ... +#> ..$ taxonkind : chr [1:1064] NA "series" "series" NA ... +#> ..$ seriesstatus : chr [1:1064] NA NA NA NA ... +#> ..$ taxpartsize : chr [1:1064] NA "fine-silty" NA NA ... +#> ..$ taxorder : chr [1:1064] NA NA NA NA ... +#> ..$ taxsuborder : chr [1:1064] NA NA NA NA ... +#> ..$ taxgrtgroup : chr [1:1064] NA NA NA NA ... +#> ..$ taxsubgrp : chr [1:1064] NA NA NA NA ... +#> ..$ soiltaxedition: chr [1:1064] NA NA NA NA ... +#> ..$ osdtypelocflag: int [1:1064] 0 0 0 0 0 0 0 0 0 0 ... +#> ..$ taxmoistcl : chr [1:1064] NA NA NA NA ... +#> ..$ taxtempregime : chr [1:1064] NA NA NA NA ... +#> ..$ taxfamother : chr [1:1064] NA NA NA NA ... +#> ..$ psctopdepth : int [1:1064] NA NA NA NA NA NA 36 NA NA 25 ... +#> ..$ pscbotdepth : int [1:1064] NA NA NA NA NA NA 86 NA NA 102 ... +#> $ photo :'data.frame': 0 obs. of 4 variables: +#> ..$ siteiid : int(0) +#> ..$ recdate : chr(0) +#> ..$ textcat : chr(0) +#> ..$ imagepath: chr(0) +#> $ pm :'data.frame': 544 obs. of 10 variables: +#> ..$ siteiid : int [1:544] 35458 35458 37117 38301 38301 38284 38592 38296 38296 38297 ... +#> ..$ seqnum : int [1:544] 1 2 1 1 2 1 NA 1 2 1 ... +#> ..$ pmorder : int [1:544] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ pmdept : int [1:544] NA NA NA NA NA NA NA NA NA NA ... 
+#> ..$ pmdepb : int [1:544] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ pmmodifier : Factor w/ 13 levels "clayey","coarse-loamy",..: NA NA NA NA NA NA NA NA NA NA ... +#> ..$ pmgenmod : chr [1:544] NA NA NA NA ... +#> ..$ pmkind : Factor w/ 187 levels "sandstone","sandstone-noncalcareous",..: 102 108 102 102 108 102 108 102 108 102 ... +#> ..$ pmorigin : Factor w/ 162 levels "sandstone, unspecified",..: 73 73 NA NA NA NA NA NA NA NA ... +#> ..$ pmweathering: Factor w/ 3 levels "moderate","slight",..: NA NA NA NA NA NA NA NA NA NA ... +#> $ struct :'data.frame': 3359 obs. of 6 variables: +#> ..$ phiid : int [1:3359] 160463 160463 160464 160465 160466 160466 160467 160467 160468 160487 ... +#> ..$ structgrade : Factor w/ 7 levels "weak","moderate",..: 2 2 2 2 2 2 2 2 1 2 ... +#> ..$ structsize : Factor w/ 16 levels "coarse","coarse and very coarse",..: 6 3 3 6 6 6 6 6 6 6 ... +#> ..$ structtype : Factor w/ 14 levels "angular blocky",..: 1 4 4 9 8 1 8 1 8 9 ... +#> ..$ structid : int [1:3359] 1 2 NA NA 1 2 1 2 NA 1 ... +#> ..$ structpartsto: int [1:3359] 2 NA NA NA 2 NA 2 NA NA 2 ... +#> $ hzdesgn :'data.frame': 3427 obs. of 21 variables: +#> ..$ phiid : int [1:3427] 160463 160464 160465 160466 160467 160468 160487 160488 160489 160490 ... +#> ..$ seqnum : int [1:3427] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ hzname : chr [1:3427] "Ap" "A" "BA" "Bg1" ... +#> ..$ hzdept : int [1:3427] 0 20 33 43 66 89 0 25 48 71 ... +#> ..$ hzdepb : int [1:3427] 20 33 43 66 89 152 25 48 71 107 ... +#> ..$ desgndisc : int [1:3427] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ desgnmaster : Factor w/ 67 levels "O","A","E","B",..: 2 2 4 4 4 4 2 4 4 4 ... +#> ..$ desgnmasterprime: Factor w/ 5 levels "'","''","'''",..: NA NA NA NA NA NA NA NA NA NA ... +#> ..$ desgnvert : int [1:3427] NA NA NA NA NA NA NA NA NA NA ... +#> ..$ t : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ p : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ a : logi [1:3427] NA NA NA NA NA NA ... 
+#> ..$ c : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ d : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ x : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ g : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ e : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ w : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ r : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ i : logi [1:3427] NA NA NA NA NA NA ... +#> ..$ b : logi [1:3427] NA NA NA NA NA NA ...
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_extended_data_from_pedon_db.html b/docs/reference/get_extended_data_from_pedon_db.html index 2443d7f9..48d9f128 100644 --- a/docs/reference/get_extended_data_from_pedon_db.html +++ b/docs/reference/get_extended_data_from_pedon_db.html @@ -8,21 +8,25 @@ Extract accessory tables and summaries from a local pedonPC Database — get_extended_data_from_pedon_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract accessory tables and summaries from a local pedonPC Database

-

Extract accessory tables and summaries from a local pedonPC Database.

-
get_extended_data_from_pedon_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The path to a 'pedon.mdb' database.

- +

Details

This function currently works only on Windows.

-

Value

A list of accessory tables and summaries, one data.frame per table.
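A minimal usage sketch follows. The database path is hypothetical; a local PedonPC 'pedon.mdb' file and a Windows ODBC driver for Access databases are assumed.

```r
library(soilDB)

# path to a local PedonPC database (hypothetical location)
dsn <- "C:/data/pedon.mdb"

# wrap in try() in case the database or ODBC driver is unavailable
e <- try(get_extended_data_from_pedon_db(dsn))

if (!inherits(e, "try-error")) {
  # the result is a list of accessory tables; inspect top-level structure
  str(e, max.level = 1)
}
```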

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_hz_data_from_NASIS_db.html b/docs/reference/get_hz_data_from_NASIS_db.html index 6ef51092..fe55e31a 100644 --- a/docs/reference/get_hz_data_from_NASIS_db.html +++ b/docs/reference/get_hz_data_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract Horizon Data from a local NASIS Database — get_hz_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Horizon Data from a local NASIS Database

-

Get horizon-level data from a local NASIS database.

-
-
get_hz_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
- +
get_hz_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
+

Arguments

@@ -122,57 +126,53 @@

Arg

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Details

This function currently works only on Windows.

-

Value

A dataframe.

-

Note

NULL total rock fragment values are assumed to represent an _absence_ of rock fragments, and are set to 0.
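A minimal usage sketch, assuming a working 'nasis_local' ODBC connection on Windows. The column names referenced below are assumptions based on the horizon listing shown elsewhere in these docs.

```r
library(soilDB)

# horizon-level records from the NASIS selected set
h <- try(get_hz_data_from_NASIS_db(SS = TRUE))

if (!inherits(h, "try-error")) {
  # quick sanity checks: one row per horizon, depths in cm
  nrow(h)
  head(h[, c("peiid", "phiid", "hzname", "hzdept", "hzdepb")])
}
```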

-

See also

get_hz_data_from_NASIS_db, get_site_data_from_NASIS_db

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_hz_data_from_pedon_db.html b/docs/reference/get_hz_data_from_pedon_db.html index 8e3c46bf..1aeb189a 100644 --- a/docs/reference/get_hz_data_from_pedon_db.html +++ b/docs/reference/get_hz_data_from_pedon_db.html @@ -8,21 +8,25 @@ Extract Horizon Data from a PedonPC Database — get_hz_data_from_pedon_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Horizon Data from a PedonPC Database

-

Get horizon-level data from a PedonPC database.

-
get_hz_data_from_pedon_db(dsn)
- +

Arguments

@@ -118,57 +122,53 @@

Arg

The path to a 'pedon.mdb' database.

- +

Details

This function currently works only on Windows.

-

Value

A dataframe.

-

Note

NULL total rock fragment values are assumed to represent an _absence_ of rock fragments, and are set to 0.
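A minimal usage sketch, assuming a local PedonPC 'pedon.mdb' file (the path below is hypothetical) and a Windows ODBC driver for Access databases.

```r
library(soilDB)

# hypothetical path to a local PedonPC database
h <- try(get_hz_data_from_pedon_db(dsn = "C:/data/pedon.mdb"))

if (!inherits(h, "try-error")) {
  # one row per horizon; check depth columns for missing values
  summary(h)
}
```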

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_lablayer_data_from_NASIS_db.html b/docs/reference/get_lablayer_data_from_NASIS_db.html index 1848d3ca..0622e334 100644 --- a/docs/reference/get_lablayer_data_from_NASIS_db.html +++ b/docs/reference/get_lablayer_data_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract lab pedon layer data from a local NASIS Database — get_lablayer_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,62 +109,66 @@

Extract lab pedon layer data from a local NASIS Database

-

Get lab pedon layer-level (horizon-level) data from a local NASIS database.

-
-
get_lablayer_data_from_NASIS_db()
- +
get_lablayer_data_from_NASIS_db(SS = TRUE)
+ +

Arguments

+ + + + + + +
SS

fetch data from the currently loaded selected set in NASIS or from the entire local database (default: TRUE)

+

Value

A dataframe.

-

Details

This function currently works only on Windows, and requires a 'nasis_local' ODBC connection.

-

Note

This function queries KSSL laboratory site/horizon data from the lab layer data table of a local NASIS database.
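A minimal usage sketch, assuming a working 'nasis_local' ODBC connection on Windows and lab data loaded into the selected set.

```r
library(soilDB)

# KSSL lab layer (horizon) data from the NASIS selected set
ll <- try(get_lablayer_data_from_NASIS_db(SS = TRUE))

if (!inherits(ll, "try-error")) {
  # one row per lab layer; inspect the available analytes
  str(ll, max.level = 1)
}
```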

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_labpedon_data_from_NASIS_db.html b/docs/reference/get_labpedon_data_from_NASIS_db.html index 1c5d024c..fa80ef2c 100644 --- a/docs/reference/get_labpedon_data_from_NASIS_db.html +++ b/docs/reference/get_labpedon_data_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract lab pedon data from a local NASIS Database — get_labpedon_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,62 +109,66 @@

Extract lab pedon data from a local NASIS Database

-

Get lab pedon-level data from a local NASIS database.

-
-
get_labpedon_data_from_NASIS_db()
- +
get_labpedon_data_from_NASIS_db(SS = TRUE)
+ +

Arguments

+ + + + + + +
SS

fetch data from the currently loaded selected set in NASIS or from the entire local database (default: TRUE)

+

Value

A dataframe.

-

Details

This function currently works only on Windows, and requires a 'nasis_local' ODBC connection.

-

Note

This function queries KSSL laboratory site/horizon data from the lab pedon data table of a local NASIS database.
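A minimal sketch combining lab pedon and lab layer records, assuming a working 'nasis_local' ODBC connection on Windows. The join columns ("labpeiid" / "labpeiidref") are assumptions about the shared lab pedon ID, not confirmed by this page.

```r
library(soilDB)

# lab pedon-level records from the selected set
lp <- try(get_labpedon_data_from_NASIS_db(SS = TRUE))
# matching lab layer (horizon) records
ll <- try(get_lablayer_data_from_NASIS_db(SS = TRUE))

if (!inherits(lp, "try-error") && !inherits(ll, "try-error")) {
  # join on the shared lab pedon ID; column names here are assumed
  x <- merge(lp, ll, by.x = "labpeiid", by.y = "labpeiidref")
}
```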

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_site_data_from_NASIS_db.html b/docs/reference/get_site_data_from_NASIS_db.html index fe553e00..900887ae 100644 --- a/docs/reference/get_site_data_from_NASIS_db.html +++ b/docs/reference/get_site_data_from_NASIS_db.html @@ -1,237 +1,245 @@ - - - - - - - - -Extract Site Data from a local NASIS Database — get_site_data_from_NASIS_db • soilDB - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- - - -
- -
-
- - -
- -

Get site-level data from a local NASIS database.

- -
- -
get_site_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
- -

Arguments

- - - - - - - - - - -
SS

fetch data from Selected Set in NASIS or from the entire local database (default: TRUE)

stringsAsFactors

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- -

Value

- -

A dataframe.

- -

Details

- -

When multiple "site bedrock" entries are present, only the shallowest is returned by this function.

- -

Note

- -

This function currently works only on Windows.

- -

See also

- - - - -

Examples

-
-
# NOT RUN { -## Example: export / convert DMS coordinates from NASIS and save to DD import file - -# load required libraries -library(soilDB) -library(rgdal) -library(plyr) - -# get site data from NASIS -s <- get_site_data_from_NASIS_db() - -# keep only those pedons with real coordinates -good.idx <- which(!is.na(s$x)) -s <- s[good.idx, ] - -# investigate multiple datums: -get_site_data_from_NASIS_db - -## this is not universally appropriate! -# assume missing is NAD83 -s$horizdatnm[is.na(s$horizdatnm)] <- 'NAD83' - -# check: OK -table(s$horizdatnm, useNA='always') - -# convert to NAD83 -old.coords <- cbind(s$x, s$y) - -# add temp column for projection information, and fill with proj4 style info -s$proj4 <- rep(NA, times=nrow(s)) -s$proj4 <- paste('+proj=longlat +datum=', s$horizdatnm, sep='') - -# iterate over pedons, and convert to WGS84 -new.coords <- ddply(s, 'siteiid', - .progress='text', .fun=function(i) { - coordinates(i) <- ~ x + y - proj4string(i) <- CRS(i$proj4) - i.t <- spTransform(i, CRS('+proj=longlat +datum=WGS84')) - i.c <- as.matrix(coordinates(i.t)) - return(data.frame(x.new=i.c[, 1], y.new=i.c[, 2])) - }) - -# merge in new coordinates -s <- join(s, new.coords) - -# any changes? -summary(sqrt(apply((s[, c('x', 'y')] - s[, c('x.new', 'y.new')])^2, 1, sum))) - -# save to update file for use with "Import of Standard WGS84 Georeference" calculation in NASIS -# note that this defines the coordinate source as "GPS", hence the last column of '1's. -std.coordinates.update.data <- unique(cbind(s[, c('siteiid', 'y.new', 'x.new')], 1)) -# save to file -write.table(std.coordinates.update.data, -file='c:/data/sgeoref.txt', col.names=FALSE, row.names=FALSE, sep='|') -# }
-
- -
- -
- - -
-

Site built with pkgdown 1.3.0.

-
-
-
- - - - - - + + + + + + + + +Extract Site Data from a local NASIS Database — get_site_data_from_NASIS_db • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

Get site-level data from a local NASIS database.

+
+ +
get_site_data_from_NASIS_db(SS = TRUE, stringsAsFactors = default.stringsAsFactors())
+ +

Arguments

+ + + + + + + + + + +
SS

fetch data from Selected Set in NASIS or from the entire local database (default: TRUE)

stringsAsFactors

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

+ +

Value

+ +

A dataframe.

+

Details

+ +

When multiple "site bedrock" entries are present, only the shallowest is returned by this function.

+

Note

+ +

This function currently works only on Windows.

+

See also

+ + + +

Examples

+
# \donttest{ +## Example: export / convert DMS coordinates from NASIS and save to DD import file + +# load required libraries +if(require(aqp) & + require(soilDB) & + require(rgdal) & + require(plyr)) { + +# get site data from NASIS +s <- try(get_site_data_from_NASIS_db()) + +if(!inherits(s, 'try-error')) { + # keep only those pedons with real coordinates + good.idx <- which(!is.na(s$x)) + s <- s[good.idx, ] + + ## this is not universally appropriate! + # assume missing is NAD83 + s$horizdatnm[is.na(s$horizdatnm)] <- 'NAD83' + + # check: OK + table(s$horizdatnm, useNA='always') + + # convert to NAD83 + old.coords <- cbind(s$x, s$y) + + if(nrow(s)) { + # add temp column for projection information, and fill with proj4 style info + s$proj4 <- rep(NA, times=nrow(s)) + s$proj4 <- paste('+proj=longlat +datum=', s$horizdatnm, sep='') + + # iterate over pedons, and convert to WGS84 + new.coords <- ddply(s, 'siteiid', + .progress='text', .fun=function(i) { + coordinates(i) <- ~ x + y + proj4string(i) <- CRS(i$proj4) + i.t <- spTransform(i, CRS('+proj=longlat +datum=WGS84')) + i.c <- as.matrix(coordinates(i.t)) + return(data.frame(x.new=i.c[, 1], y.new=i.c[, 2])) + }) + + # merge in new coordinates + s <- join(s, new.coords) + + # any changes? + summary(sqrt(apply((s[, c('x', 'y')] - s[, c('x.new', 'y.new')])^2, 1, sum))) + + # save to update file for use with "Import of Standard WGS84 Georeference" calculation + # in NASIS note that this defines the coordinate source as "GPS", hence the last + # column of '1's. + std.coordinates.update.data <- unique(cbind(s[, c('siteiid', 'y.new', 'x.new')], 1)) + # save to file + write.table(std.coordinates.update.data, + file='c:/data/sgeoref.txt', col.names=FALSE, + row.names=FALSE, sep='|') + } +}}# }
#> Loading required package: rgdal
#> Warning: package 'rgdal' was built under R version 3.5.3
#> rgdal: version: 1.4-8, (SVN revision 845) +#> Geospatial Data Abstraction Library extensions to R successfully loaded +#> Loaded GDAL runtime: GDAL 2.2.3, released 2017/11/20 +#> Path to GDAL shared files: C:/Users/Dylan.Beaudette/Documents/R/win-library/3.5/rgdal/gdal +#> GDAL binary built with GEOS: TRUE +#> Loaded PROJ.4 runtime: Rel. 4.9.3, 15 August 2016, [PJ_VERSION: 493] +#> Path to PROJ.4 shared files: C:/Users/Dylan.Beaudette/Documents/R/win-library/3.5/rgdal/proj +#> Linking to sp version: 1.3-2
#> multiple horizontal datums present, consider using WGS84 coordinates (x_std, y_std)
#> | | | 0% | | | 1% | |= | 1% | |= | 2% | |== | 2% | |== | 3% | |== | 4% | |=== | 4%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |=== | 5% | |==== | 5%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |==== | 6%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |===== | 6%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |===== | 7%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |===== | 8% | |====== | 8% | |====== | 9% | |======= | 9% | |======= | 10% | |======= | 11% | |======== | 11% | |======== | 12% | |========= | 12%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========= | 13%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========= | 14% | |========== | 14% | |========== | 15% | |=========== | 15% | |=========== | 16% | |============ | 16% | |============ | 17% | |============ | 18% | |============= | 18% | |============= | 19%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |============== | 19% | |============== | 20% | |============== | 21% | |=============== | 21% | |=============== | 22% | |================ | 22%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |================ | 23% | |================ | 24% | |================= | 24% | |================= | 25% | |================== | 25% | |================== | 26%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |=================== | 26%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |=================== | 27%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |=================== | 28% | |==================== | 28% | |==================== | 29%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |===================== | 29% | |===================== | 30% | |===================== | 31%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |====================== | 31%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |====================== | 32% | |======================= | 32% | |======================= | 33%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================= | 34%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================== | 34%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================== | 35%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========================= | 35% | |========================= | 36% | |========================== | 36% | |========================== | 37% | |========================== | 38% | |=========================== | 38% | |=========================== | 39% | |============================ | 39% | |============================ | 40% | |============================ | 41% | |============================= | 41% | |============================= | 42% | |============================== | 42% | |============================== | 43% | |============================== | 44% | |=============================== | 44% | |=============================== | 45% | |================================ | 45% | |================================ | 46% | |================================= | 46% | |================================= | 47% | |================================= | 48% | |================================== | 48% | |================================== | 49% | |=================================== | 49% | |=================================== | 50% | |=================================== | 51% | |==================================== | 51% | |==================================== | 52% | |===================================== | 52% | |===================================== | 53%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |===================================== | 54% | |====================================== | 54% | |====================================== | 55% | |======================================= | 55%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================================= | 56%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================================== | 56%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================================== | 57%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================================== | 58%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========================================= | 58% | |========================================= | 59% | |========================================== | 59% | |========================================== | 60% | |========================================== | 61% | |=========================================== | 61% | |=========================================== | 62%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |============================================ | 62%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |============================================ | 63%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |============================================ | 64% | |============================================= | 64% | |============================================= | 65% | |============================================== | 65% | |============================================== | 66% | |=============================================== | 66% | |=============================================== | 67% | |=============================================== | 68%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |================================================ | 68%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |================================================ | 69% | |================================================= | 69% | |================================================= | 70% | |================================================= | 71% | |================================================== | 71% | |================================================== | 72% | |=================================================== | 72% | |=================================================== | 73% | |=================================================== | 74% | |==================================================== | 74% | |==================================================== | 75% | |===================================================== | 75% | |===================================================== | 76% | |====================================================== | 76% | |====================================================== | 77% | |====================================================== | 78% | |======================================================= | 78% | |======================================================= | 79%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |======================================================== | 79% | |======================================================== | 80% | |======================================================== | 81% | |========================================================= | 81% | |========================================================= | 82% | |========================================================== | 82%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========================================================== | 83%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |========================================================== | 84% | |=========================================================== | 84% | |=========================================================== | 85% | |============================================================ | 85% | |============================================================ | 86% | |============================================================= | 86% | |============================================================= | 87% | |============================================================= | 88% | |============================================================== | 88% | |============================================================== | 89% | |=============================================================== | 89% | |=============================================================== | 90% | |=============================================================== | 91%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |================================================================ | 91% | |================================================================ | 92%
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> Warning: the condition has length > 1 and only the first element will be used
#> | |================================================================= | 92% | |================================================================= | 93% | |================================================================= | 94% | |================================================================== | 94% | |================================================================== | 95% | |=================================================================== | 95% | |=================================================================== | 96% | |==================================================================== | 96% | |==================================================================== | 97% | |==================================================================== | 98% | |===================================================================== | 98% | |===================================================================== | 99% | |======================================================================| 99% | |======================================================================| 100%
#> Joining by: siteiid
#> Warning: cannot open file 'c:/data/sgeoref.txt': No such file or directory
#> Error in file(file, ifelse(append, "a", "w")): cannot open the connection
+
+ +
+ + +
+ + +
+

Site built with pkgdown 1.4.1.

+
+ +
+
+ + + + + + + + diff --git a/docs/reference/get_site_data_from_pedon_db.html b/docs/reference/get_site_data_from_pedon_db.html index 0854f48a..78b6d429 100644 --- a/docs/reference/get_site_data_from_pedon_db.html +++ b/docs/reference/get_site_data_from_pedon_db.html @@ -8,21 +8,25 @@ Extract Site Data from a PedonPC Database — get_site_data_from_pedon_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Site Data from a PedonPC Database

-

Get site-level data from a PedonPC database.

-
get_site_data_from_pedon_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The path to a 'pedon.mdb' database.

- +

Value

A dataframe.

-

Note

This function currently works only on Windows.
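A minimal usage sketch; the `pedon.mdb` path below is hypothetical, and a Windows machine with the Microsoft Access ODBC driver is assumed:

```r
# hypothetical path to a PedonPC back-end database
dsn <- "C:/data/pedon.mdb"

# wrap in try() so a missing database or driver fails gracefully
s <- try(get_site_data_from_pedon_db(dsn))

# inspect the site-level records if the query succeeded
if (!inherits(s, "try-error")) {
  str(s)
}
```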

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_soilseries_from_NASIS.html b/docs/reference/get_soilseries_from_NASIS.html index 7d8842b9..6fcd6d0e 100644 --- a/docs/reference/get_soilseries_from_NASIS.html +++ b/docs/reference/get_soilseries_from_NASIS.html @@ -8,21 +8,25 @@ Get records from the Soil Classification (SC) database — get_soilseries_from_NASIS • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,15 +109,13 @@

Get records from the Soil Classification (SC) database

-

These functions return records from the Soil Classification database, either from the local NASIS database (all series) or via web report (named series only).

-
-
get_soilseries_from_NASIS(stringsAsFactors = default.stringsAsFactors())
+    
get_soilseries_from_NASIS(stringsAsFactors = default.stringsAsFactors())
 get_soilseries_from_NASISWebReport(soils,
-stringsAsFactors = default.stringsAsFactors())
- +stringsAsFactors = default.stringsAsFactors())
+

Arguments

@@ -124,39 +128,41 @@

Arg

logical: should character vectors be converted to factors? This argument is passed to the uncode() function. It does not convert those vectors that have been set outside of uncode() (i.e. hard coded). The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

- +

Value

A data.frame.
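A hedged sketch of both access paths; the series names are illustrative, and the local call assumes a configured NASIS ODBC connection on Windows:

```r
# all series, from the local NASIS SC database
sc <- try(get_soilseries_from_NASIS())

# named series only, via web report (no local NASIS required)
sc.sub <- try(get_soilseries_from_NASISWebReport(soils = c("Amador", "Pardee")))

# inspect the returned classification records
if (!inherits(sc.sub, "try-error")) {
  str(sc.sub)
}
```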

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_text_notes_from_NASIS_db.html b/docs/reference/get_text_notes_from_NASIS_db.html index e92535de..d1952a47 100644 --- a/docs/reference/get_text_notes_from_NASIS_db.html +++ b/docs/reference/get_text_notes_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract text note data from a local NASIS Database — get_text_notes_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract text note data from a local NASIS Database

-

Extract text note data from a local NASIS Database.

-
get_text_notes_from_NASIS_db(SS = TRUE, fixLineEndings = TRUE)
- +

Arguments

@@ -122,44 +126,103 @@

Arg

convert line endings from "\r\n" to "\n"

- +

Details

This function currently works only on Windows.

-

Value

A list with the results.

-

See also

-

Examples

-
# NOT RUN {
+    
# \donttest{ # query text note data -t <- get_text_notes_from_NASIS_db() +t <- try(get_text_notes_from_NASIS_db()) # show contents text note data, includes: siteobs, site, pedon, horizon level text notes data. -str(t) - +str(t)
#> List of 5 +#> $ pedon_text :'data.frame': 409 obs. of 8 variables: +#> ..$ recdate : POSIXct[1:409], format: "2008-04-04 00:00:00" "2012-04-06 00:00:00" ... +#> ..$ recauthor : chr [1:409] "LGH" "Tonie Endres" "Tonie Endres" "LGH" ... +#> ..$ pedontextkind: Factor w/ 9 levels "pedon note, formatted",..: 2 3 3 2 3 2 6 6 2 2 ... +#> ..$ textcat : chr [1:409] "editnote" NA NA "editnote" ... +#> ..$ textsubcat : chr [1:409] NA NA NA NA ... +#> ..$ textentry : chr [1:409] "Site and pedon number may be different than number given to original description." "Ownership changed from Illinois 108A_108B Shared to 11-04 Aurora MLRA PO. This will need to be checked to deter"| __truncated__ "Ownership changed from Illinois 108A_108B Shared to 11-04 Aurora MLRA PO. This will need to be checked to deter"| __truncated__ "Site and pedon number may be different than the number given to original description." ... +#> ..$ peiid : int [1:409] 35313 35313 35317 35317 35318 35318 35404 35404 36266 36266 ... +#> ..$ petextiid : int [1:409] 206095 511532 511536 206116 511537 206099 67762 67893 203656 68661 ... +#> $ site_text :'data.frame': 511 obs. of 8 variables: +#> ..$ recdate : POSIXct[1:511], format: "2012-04-06 00:00:00" "2012-04-06 00:00:00" ... +#> ..$ recauthor : chr [1:511] "Tonie Endres" "Tonie Endres" "Tonie Endres" NA ... +#> ..$ sitetextkind: Factor w/ 7 levels "site note, formatted",..: 3 3 3 6 5 6 5 2 3 5 ... +#> ..$ textcat : chr [1:511] NA NA NA "Map Unit Symbol/Name" ... +#> ..$ textsubcat : chr [1:511] NA NA NA NA ... +#> ..$ textentry : chr [1:511] "Ownership changed from Illinois 108A_108B Shared to 11-04 Aurora MLRA PO. This will need to be checked to deter"| __truncated__ "Ownership changed from Illinois 108A_108B Shared to 11-04 Aurora MLRA PO. This will need to be checked to deter"| __truncated__ "Ownership changed from Illinois 108A_108B Shared to 11-04 Aurora MLRA PO. 
This will need to be checked to deter"| __truncated__ "Map Unit Symbol: Wp\nMap Unit Name: Westland silty clay loam" ... +#> ..$ siteiid : int [1:511] 35362 35366 35367 35458 35458 36322 36322 36322 36322 36387 ... +#> ..$ sitetextiid : int [1:511] 495376 495380 495381 69688 69875 71081 71282 276574 495499 71163 ... +#> $ siteobs_text:'data.frame': 540 obs. of 8 variables: +#> ..$ recdate : POSIXct[1:540], format: NA NA ... +#> ..$ recauthor : chr [1:540] NA NA NA NA ... +#> ..$ siteobstextkind: Factor w/ 8 levels "site observation, formatted",..: NA NA NA NA NA NA NA NA NA NA ... +#> ..$ textcat : chr [1:540] NA NA NA NA ... +#> ..$ textsubcat : chr [1:540] NA NA NA NA ... +#> ..$ textentry : chr [1:540] NA NA NA NA ... +#> ..$ site_id : int [1:540] 35362 35366 35367 35458 36322 36387 36388 37117 38282 38283 ... +#> ..$ siteobstextiid : int [1:540] NA NA NA NA NA NA NA NA NA NA ... +#> $ horizon_text:'data.frame': 579 obs. of 8 variables: +#> ..$ recdate : POSIXct[1:579], format: "2008-03-25" "2008-02-06" ... +#> ..$ recauthor : chr [1:579] "LGH" "LGH" "LGH" NA ... +#> ..$ phorizontextkind: Factor w/ 6 levels "horizon note, formatted",..: 2 3 2 2 2 2 2 2 2 2 ... +#> ..$ textcat : chr [1:579] "editnote" "editnote" "editnote" NA ... +#> ..$ textsubcat : chr [1:579] NA NA NA NA ... +#> ..$ textentry : chr [1:579] "Calculated textural modifier and class in pedon horizon." "Calculated textural modifier and class in pedon horizon." "Calculated textural modifier and class in pedon horizon." "median penetrometer reading of 0.5" ... +#> ..$ phiid : int [1:579] 166850 166896 167033 177763 177764 177793 177796 177797 177855 177856 ... +#> ..$ phtextiid : int [1:579] 248751 244731 245520 18716 18717 18718 18719 18720 18721 18722 ... +#> $ photo_links :'data.frame': 0 obs. 
of 8 variables: +#> ..$ recdate : chr(0) +#> ..$ recauthor : chr(0) +#> ..$ siteobstextkind: Factor w/ 8 levels "site observation, formatted",..: +#> ..$ textcat : chr(0) +#> ..$ textsubcat : chr(0) +#> ..$ textentry : chr(0) +#> ..$ site_id : int(0) +#> ..$ siteobstextiid : int(0)
# view text categories for site text notes -table(t$site_text$textcat) - -# }
+if(!inherits(t, 'try-error')) + table(t$site_text$textcat)
#> +#> associated soils assocsoils +#> 70 10 11 +#> Caption edit notes editnote +#> 4 1 7 +#> edits GA259 GA639 +#> 4 2 1 +#> IL143 landform Landform +#> 2 14 12 +#> location Location Map Unit Symbol/Name +#> 21 21 36 +#> mini profile not converted noteid +#> 1 3 6 +#> old pedon number Pedon Description Physiography +#> 1 1 1 +#> quad correction RaCA Site ID +#> 6 2 1 +#> slope soil macro fauna Update +#> 19 1 1 +#> US VA023 VA173 +#> 1 4 1 +#> vegetation +#> 3
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_data_from_NASIS_db.html b/docs/reference/get_veg_data_from_NASIS_db.html index e44d78a2..e154e9b1 100644 --- a/docs/reference/get_veg_data_from_NASIS_db.html +++ b/docs/reference/get_veg_data_from_NASIS_db.html @@ -8,21 +8,25 @@ Extract veg data from a local NASIS Database — get_veg_data_from_NASIS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract veg data from a local NASIS Database

-

Extract veg data from a local NASIS Database.

-
get_veg_data_from_NASIS_db(SS = TRUE)
- +

Arguments

@@ -118,36 +122,34 @@

Arg

get data from the currently loaded Selected Set in NASIS or from the entire local database (default: TRUE)

- +

Details

This function currently works only on Windows.

-

Value

A list with the results.

-

Examples

-
# NOT RUN {
+    
# \donttest{ # query vegetation data -v <- get_veg_from_NASIS_db() - +v <- try(get_veg_data_from_NASIS_db())
# show contents of the veg data returned -str(v) - - -# }
+str(v)
+# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_from_AK_Site.html b/docs/reference/get_veg_from_AK_Site.html index a8b20876..a7378afe 100644 --- a/docs/reference/get_veg_from_AK_Site.html +++ b/docs/reference/get_veg_from_AK_Site.html @@ -8,21 +8,25 @@ Retrieve Vegetation Data from an AK Site Database — get_veg_from_AK_Site • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Retrieve Vegetation Data from an AK Site Database

-

Retrieve Vegetation Data from an AK Site Database

-
get_veg_from_AK_Site(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

file path to the AK Site Access database

- +

Value

A dataframe with vegetation data in long format, linked to site ID.

-

Note

This function currently works only on Windows.
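A short sketch, assuming Windows and a hypothetical database path:

```r
# hypothetical path to an AK Site Access database
v <- try(get_veg_from_AK_Site("C:/data/AK_Site.mdb"))

# long-format vegetation records, linked to site ID
if (!inherits(v, "try-error")) {
  head(v)
}
```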

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_from_MT_veg_db.html b/docs/reference/get_veg_from_MT_veg_db.html index b1833eb6..81448f8f 100644 --- a/docs/reference/get_veg_from_MT_veg_db.html +++ b/docs/reference/get_veg_from_MT_veg_db.html @@ -8,21 +8,25 @@ Extract Site and Plot-level Data from a Montana RangeDB database — get_veg_from_MT_veg_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract Site and Plot-level Data from a Montana RangeDB database

-

Get Site and Plot-level data from a Montana RangeDB database.

-
get_veg_from_MT_veg_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The name of the Montana RangeDB front-end database connection (see details).

- +

Details

This function currently works only on Windows.

-

Value

A dataframe.
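A sketch using the related Montana RangeDB getters together; the front-end connection name is hypothetical, and Windows/ODBC is assumed:

```r
# hypothetical Montana RangeDB front-end database connection name
dsn <- "mt_rangedb"

# site/plot, species, and cover composition tables
veg.site    <- try(get_veg_from_MT_veg_db(dsn))
veg.species <- try(get_veg_species_from_MT_veg_db(dsn))
veg.other   <- try(get_veg_other_from_MT_veg_db(dsn))

if (!inherits(veg.site, "try-error")) {
  str(veg.site)
}
```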

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_from_NPS_PLOTS_db.html b/docs/reference/get_veg_from_NPS_PLOTS_db.html index e5294acf..17853bc1 100644 --- a/docs/reference/get_veg_from_NPS_PLOTS_db.html +++ b/docs/reference/get_veg_from_NPS_PLOTS_db.html @@ -8,21 +8,25 @@ Retrieve Vegetation Data from an NPS PLOTS Database — get_veg_from_NPS_PLOTS_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Retrieve Vegetation Data from an NPS PLOTS Database

-

Extracts species, stratum, and cover vegetation data from a backend NPS PLOTS database. Currently works with any Microsoft Access database in the .mdb file format.

-
get_veg_from_NPS_PLOTS_db(dsn)
- +

Arguments

@@ -118,45 +122,45 @@

Arg

file path to the NPS PLOTS access database on your system.

- +

Value

A dataframe with vegetation data in a long format with linkage to NRCS soil pedon data via the site_id key field.

-

Note

This function currently only works on Windows.
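A minimal sketch, assuming Windows and a hypothetical .mdb path:

```r
# hypothetical path to an NPS PLOTS database
v <- try(get_veg_from_NPS_PLOTS_db("C:/data/NPS_PLOTS.mdb"))

# long-format records; the site_id field links to NRCS pedon data
if (!inherits(v, "try-error")) {
  head(v)
}
```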

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_other_from_MT_veg_db.html b/docs/reference/get_veg_other_from_MT_veg_db.html index 76647f10..3ba985f9 100644 --- a/docs/reference/get_veg_other_from_MT_veg_db.html +++ b/docs/reference/get_veg_other_from_MT_veg_db.html @@ -8,21 +8,25 @@ Extract cover composition data from a Montana RangeDB database — get_veg_other_from_MT_veg_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract cover composition data from a Montana RangeDB database

-

Get cover composition data from a Montana RangeDB database.

-
get_veg_other_from_MT_veg_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The name of the Montana RangeDB front-end database connection (see details).

- +

Details

This function currently works only on Windows.

-

Value

A dataframe.

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/get_veg_species_from_MT_veg_db.html b/docs/reference/get_veg_species_from_MT_veg_db.html index b057a9ac..f24ca67e 100644 --- a/docs/reference/get_veg_species_from_MT_veg_db.html +++ b/docs/reference/get_veg_species_from_MT_veg_db.html @@ -8,21 +8,25 @@ Extract species-level Data from a Montana RangeDB database — get_veg_species_from_MT_veg_db • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Extract species-level Data from a Montana RangeDB database

-

Get species-level data from a Montana RangeDB database.

-
get_veg_species_from_MT_veg_db(dsn)
- +

Arguments

@@ -118,51 +122,49 @@

Arg

The name of the Montana RangeDB front-end database connection (see details).

- +

Details

This function currently works only on Windows.

-

Value

A dataframe.

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/index.html b/docs/reference/index.html index 8721118b..8b7408ba 100644 --- a/docs/reference/index.html +++ b/docs/reference/index.html @@ -8,21 +8,25 @@ Function reference • soilDB + + - + + - - + + + @@ -30,10 +34,12 @@ + + @@ -44,6 +50,7 @@ + @@ -60,7 +67,7 @@ soilDB - 2.3.9 + 2.5 @@ -68,7 +75,7 @@ - @@ -89,6 +95,7 @@ +
@@ -133,21 +140,21 @@

fetchNASISWebReport() get_progress_from_NASISWebReport() get_project_from_NASISWebReport() get_project_correlation_from_NASISWebReport() get_projectmapunit_from_NASISWebReport() get_projectmapunit2_from_NASISWebReport() get_legend_from_NASISWebReport() get_mapunit_from_NASISWebReport() get_component_from_NASISWebReport() get_chorizon_from_NASISWebReport() get_cosoilmoist_from_NASISWebReport() get_sitesoilmoist_from_NASISWebReport()

+

fetchNASIS() getHzErrorsNASIS()

-

Extract component tables from the NASIS Web Reports

+

Fetch commonly used site/pedon/horizon or component data from NASIS.

-

fetchNASIS() fetchNASIS_pedons() fetchNASIS_components() getHzErrorsNASIS()

+

fetchNASISLabData()

-

Fetch commonly used site/pedon/horizon or component data from a local NASIS database.

+

Fetch commonly used lab site/horizon data from a local NASIS database.

-

fetchNASISLabData()

+

fetchNASISWebReport() get_progress_from_NASISWebReport() get_project_from_NASISWebReport() get_project_correlation_from_NASISWebReport() get_projectmapunit_from_NASISWebReport() get_projectmapunit2_from_NASISWebReport() get_legend_from_NASISWebReport() get_mapunit_from_NASISWebReport() get_component_from_NASISWebReport() get_chorizon_from_NASISWebReport() get_cosoilmoist_from_NASISWebReport() get_sitesoilmoist_from_NASISWebReport()

-

Fetch commonly used lab site/horizon data from a local NASIS database.

+

Extract component tables from the NASIS Web Reports

@@ -175,9 +182,15 @@

fetchSDA_component() get_mapunit_from_SDA() get_component_from_SDA() get_chorizon_from_SDA() get_cosoilmoist_from_SDA() get_cosoilmoist_from_NASIS()

+

fetchSDA() get_mapunit_from_SDA() get_component_from_SDA() get_chorizon_from_SDA() get_cosoilmoist_from_SDA()

+ +

Download and Flatten Data from Soil Data Access

+ + + +

fetchSDA_spatial()

-

Extract component tables from Soil Data Access

+

Query SDA and Return Spatial Data

@@ -199,11 +212,17 @@

get_component_data_from_NASIS_db()

+

get_component_data_from_NASIS_db() get_component_restrictions_from_NASIS_db()

Extract component data from a local NASIS Database

+ +

get_cosoilmoist_from_NASIS()

+ +

Read and Flatten the Component Soil Moisture Tables

+ +

get_extended_data_from_NASIS_db()

@@ -240,12 +259,6 @@

get_phlabresults_data_from_NASIS_db()

- -

Extract phlabresults table from a local NASIS Database

- -

get_site_data_from_NASIS_db()

@@ -355,7 +368,7 @@

seriesExtent() seriesExtentAsGmap()

+

seriesExtent()

Get/Display Soil Series Extent

@@ -406,6 +419,12 @@

us_ss_timeline

Timeline of US Published Soil Surveys

+ + + +

waterDayYear()

+ +

Compute Water Day and Year

@@ -419,19 +438,23 @@

Contents

+ + + diff --git a/docs/reference/loafercreek-1.png b/docs/reference/loafercreek-1.png new file mode 100644 index 00000000..76589850 Binary files /dev/null and b/docs/reference/loafercreek-1.png differ diff --git a/docs/reference/loafercreek-2.png b/docs/reference/loafercreek-2.png new file mode 100644 index 00000000..73895a8a Binary files /dev/null and b/docs/reference/loafercreek-2.png differ diff --git a/docs/reference/loafercreek-3.png b/docs/reference/loafercreek-3.png new file mode 100644 index 00000000..a5485e58 Binary files /dev/null and b/docs/reference/loafercreek-3.png differ diff --git a/docs/reference/loafercreek-4.png b/docs/reference/loafercreek-4.png new file mode 100644 index 00000000..7938a3f3 Binary files /dev/null and b/docs/reference/loafercreek-4.png differ diff --git a/docs/reference/loafercreek-5.png b/docs/reference/loafercreek-5.png new file mode 100644 index 00000000..ab122e63 Binary files /dev/null and b/docs/reference/loafercreek-5.png differ diff --git a/docs/reference/loafercreek.html b/docs/reference/loafercreek.html index 65151cc7..dbf5542b 100644 --- a/docs/reference/loafercreek.html +++ b/docs/reference/loafercreek.html @@ -8,21 +8,25 @@ Example <code>SoilProfilecollection</code> Objects Returned by <code>fetchNASIS</code>. — loafercreek • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,89 +109,93 @@

Example SoilProfileCollection Objects Returned by fetchNASIS

-

Several examples of soil profile collections returned by fetchNASIS(from='pedons') as SoilProfileCollection objects.

-
-
data(loafercreek)
-data(gopheridge)
-data(mineralKing)
- +
data(loafercreek)
+data(gopheridge)
+data(mineralKing)
+ +

Examples

-
# NOT RUN {
+    
# \donttest{ +if(require("aqp")) { # load example dataset -data("gopheridge") + data("gopheridge") -# what kind of object is this? -class(gopheridge) + # what kind of object is this? + class(gopheridge) -# how many profiles? -length(gopheridge) + # how many profiles? + length(gopheridge) -# there are 60 profiles, this calls for a split plot -par(mar=c(0,0,0,0), mfrow=c(2,1)) + # there are 60 profiles, this calls for a split plot + par(mar=c(0,0,0,0), mfrow=c(2,1)) -# plot soil colors -plot(gopheridge[1:30, ], name='hzname', color='soil_color') -plot(gopheridge[31:60, ], name='hzname', color='soil_color') + # plot soil colors + plot(gopheridge[1:30, ], name='hzname', color='soil_color') + plot(gopheridge[31:60, ], name='hzname', color='soil_color') -# need a larger top margin for legend -par(mar=c(0,0,4,0), mfrow=c(2,1)) -# generate colors based on clay content -plot(gopheridge[1:30, ], name='hzname', color='clay') -plot(gopheridge[31:60, ], name='hzname', color='clay') + # need a larger top margin for legend + par(mar=c(0,0,4,0), mfrow=c(2,1)) + # generate colors based on clay content + plot(gopheridge[1:30, ], name='hzname', color='clay') + plot(gopheridge[31:60, ], name='hzname', color='clay') -# single row and no labels -par(mar=c(0,0,0,0), mfrow=c(1,1)) -# plot soils sorted by depth to contact -plot(gopheridge, name='', print.id=FALSE, plot.order=order(gopheridge$bedrckdepth)) + # single row and no labels + par(mar=c(0,0,0,0), mfrow=c(1,1)) + # plot soils sorted by depth to contact + plot(gopheridge, name='', print.id=FALSE, plot.order=order(gopheridge$bedrckdepth)) -# plot first 10 profiles -plot(gopheridge[1:10, ], name='hzname', color='soil_color', label='pedon_id', id.style='side') + # plot first 10 profiles + plot(gopheridge[1:10, ], name='hzname', color='soil_color', label='pedon_id', id.style='side') -# add rock fragment data to plot: -addVolumeFraction(gopheridge[1:10, ], colname='total_frags_pct') + # add rock fragment data to plot: + 
addVolumeFraction(gopheridge[1:10, ], colname='total_frags_pct') -# add diagnostic horizons -addDiagnosticBracket(gopheridge[1:10, ], kind='argillic horizon', col='red', offset=-0.4) + # add diagnostic horizons + addDiagnosticBracket(gopheridge[1:10, ], kind='argillic horizon', col='red', offset=-0.4) -## loafercreek -data("loafercreek") -# plot first 10 profiles -plot(loafercreek[1:10, ], name='hzname', color='soil_color', label='pedon_id', id.style='side') + ## loafercreek + data("loafercreek") + # plot first 10 profiles + plot(loafercreek[1:10, ], name='hzname', color='soil_color', label='pedon_id', id.style='side') -# add rock fragment data to plot: -addVolumeFraction(loafercreek[1:10, ], colname='total_frags_pct') + # add rock fragment data to plot: + addVolumeFraction(loafercreek[1:10, ], colname='total_frags_pct') -# add diagnostic horizons -addDiagnosticBracket(loafercreek[1:10, ], kind='argillic horizon', col='red', offset=-0.4) -# }
+ # add diagnostic horizons + addDiagnosticBracket(loafercreek[1:10, ], kind='argillic horizon', col='red', offset=-0.4) +}
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/mapunit_geom_by_ll_bbox-1.png b/docs/reference/mapunit_geom_by_ll_bbox-1.png new file mode 100644 index 00000000..4422e535 Binary files /dev/null and b/docs/reference/mapunit_geom_by_ll_bbox-1.png differ diff --git a/docs/reference/mapunit_geom_by_ll_bbox.html b/docs/reference/mapunit_geom_by_ll_bbox.html index b3ad7f23..ff16280b 100644 --- a/docs/reference/mapunit_geom_by_ll_bbox.html +++ b/docs/reference/mapunit_geom_by_ll_bbox.html @@ -8,21 +8,25 @@ Fetch Map Unit Geometry from SDA — mapunit_geom_by_ll_bbox • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Fetch Map Unit Geometry from SDA

-

Fetch map unit geometry from the SDA website by WGS84 bounding box.

-
mapunit_geom_by_ll_bbox(bbox, source = 'sda')
- +

Arguments

@@ -122,62 +126,62 @@

Arg

the source database, currently limited to soil data access (SDA)

- +

Details

The SDA website can be found at http://sdmdataaccess.nrcs.usda.gov. See examples for bounding box formatting.

-

Value

A SpatialPolygonsDataFrame of map unit polygons, in WGS84 (long,lat) coordinates.

-

References

http://casoilresource.lawr.ucdavis.edu/

-

Note

It appears that SDA does not actually return the spatial intersection of map unit polygons and the bounding box; rather, it returns those polygons that overlap the bounding box. This function requires the `rgdal` package.
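Because the result overlaps (rather than intersects) the bounding box, a post-hoc clip may be useful. The sketch below only builds a SpatialPolygons object from the bbox vector used in the examples; it assumes the `sp` package, and the clip itself would require something like `rgeos::gIntersection()` or an `sf` equivalent (neither is called here).

```r
# sketch: build a SpatialPolygons object from the query bbox,
# suitable for a post-hoc clip of the SDA result
# (assumes the sp package; 'b' follows the bbox format used in the examples)
library(sp)

b <- c(-120.54, 38.61, -120.41, 38.70)

# bbox corners as a closed ring: (xmin,ymin) -> (xmax,ymin) -> (xmax,ymax) -> (xmin,ymax)
m <- matrix(b[c(1, 2, 3, 2, 3, 4, 1, 4, 1, 2)], ncol = 2, byrow = TRUE)

bbox.poly <- SpatialPolygons(
  list(Polygons(list(Polygon(m)), ID = 'bbox')),
  proj4string = CRS('+proj=longlat +datum=WGS84')
)

# a clip could then be performed with, e.g., rgeos::gIntersection(x, bbox.poly)
```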

-

Examples

-
# fetch map unit geometry from a bounding-box: -# -# +------------- (-120.41, 38.70) -# | | -# | | -# (-120.54, 38.61) --------------+ - -
# NOT RUN { -# basic usage -b <- c(-120.54,38.61,-120.41,38.70) -x <- mapunit_geom_by_ll_bbox(b) # about 20 seconds - -# note that the returned geometry is everything overlapping the bbox -# and not an intersection... why? -plot(x) -rect(b[1], b[2], b[3], b[4], border='red', lwd=2) - - -# get map unit data for matching map unit keys -in.statement <- format_SQL_in_statement(unique(x$MUKEY)) -q <- paste("SELECT mukey, muname FROM mapunit WHERE mukey IN ", in.statement, sep="") -res <- SDA_query(q) -# }
+
# fetch map unit geometry from a bounding-box: +# +# +------------- (-120.41, 38.70) +# | | +# | | +# (-120.54, 38.61) --------------+ + +# \donttest{ +if(require(sp) & require(rgdal)) { + + # basic usage + b <- c(-120.54,38.61,-120.41,38.70) + x <- try(mapunit_geom_by_ll_bbox(b)) # about 20 seconds + + if(!inherits(x,'try-error')) + # note that the returned geometry is everything overlapping the bbox + # and not an intersection... why? + plot(x) + rect(b[1], b[2], b[3], b[4], border='red', lwd=2) + + + # get map unit data for matching map unit keys + in.statement <- format_SQL_in_statement(unique(x$MUKEY)) + q <- paste("SELECT mukey, muname FROM mapunit WHERE mukey IN ", in.statement, sep="") + res <- SDA_query(q) + } else { + message('could not download XML result from SDA') + }
#> OGR data source with driver: GML +#> Source: "C:\Users\Dylan.Beaudette\Documents\RtmpgbqL6F\file341c928233e.gml", layer: "mapunitpoly" +#> with 197 features +#> It has 8 fields
#> empty result set
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/parseWebReport.html b/docs/reference/parseWebReport.html index 97087d32..bd277941 100644 --- a/docs/reference/parseWebReport.html +++ b/docs/reference/parseWebReport.html @@ -8,21 +8,25 @@ Parse contents of a web report, based on suplied arguments. — parseWebReport • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Parse contents of a web report, based on supplied arguments.

-

Parse contents of a web report, based on supplied arguments.

-
parseWebReport(url, args, index = 1)
- +

Arguments

@@ -126,35 +130,30 @@

Arg

Integer index specifying the table to return, or NULL for a list of tables

- +

Details

Report argument names can be inferred by inspection of the HTML source associated with any given web report.

-

Value

A data.frame object in the case of a single integer passed to index, a list object in the case of an integer vector or NULL passed to index.

-

Note

Most web reports are for internal use only.

-

Examples

-
# pending +
# \donttest{ +# pending +# }
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/seriesExtent-1.png b/docs/reference/seriesExtent-1.png new file mode 100644 index 00000000..172e07a3 Binary files /dev/null and b/docs/reference/seriesExtent-1.png differ diff --git a/docs/reference/seriesExtent.html b/docs/reference/seriesExtent.html index 8ce9e970..728e9be4 100644 --- a/docs/reference/seriesExtent.html +++ b/docs/reference/seriesExtent.html @@ -8,21 +8,25 @@ Get/Display Soil Series Extent — seriesExtent • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,14 +109,11 @@

Get/Display Soil Series Extent

-

Get or display the spatial extent of a named soil series using the Series Extent Explorer.

-
-
seriesExtent(s, timeout=60)
-seriesExtentAsGmap(s, timeout=60, exp=1.25)
- +
seriesExtent(s, timeout=60)
+

Arguments

@@ -122,53 +125,40 @@

Arg

- - - -
timeout

time that we are willing to wait for a response, in seconds

exp

expansion factor used to expand Google Maps region

- +

Details

Soil series extent data are downloaded from a static cache of GeoJSON files on SoilWeb servers. Cached data are typically updated annually.

-

Value

-

when calling seriesExtent, a SpatialPolygonsDataFrame object

- +

when calling seriesExtent, a SpatialPolygonsDataFrame object

References

http://casoilresource.lawr.ucdavis.edu/see

-

Note

-

These function require the `rgdal` and `dismo` packages.

- +

This function requires the `rgdal` package.

Examples

-
# NOT RUN {
+    
# \donttest{ # fetch series extent for the 'Amador' soil series s <- seriesExtent('amador') -plot(s) -# fetch then plot the extent of the 'Amador' soil series -seriesExtentAsGmap('amador') -# }
+# plot SpatialPolygonsDataFrame +if(require(sp)) + plot(s)
+# }
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/siblings.html b/docs/reference/siblings.html index 78036876..ab889601 100644 --- a/docs/reference/siblings.html +++ b/docs/reference/siblings.html @@ -8,21 +8,25 @@ Lookup siblings and cousins for a given soil series. — siblings • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Lookup siblings and cousins for a given soil series.

-

Lookup siblings and cousins for a given soil series, from the current fiscal year SSURGO snapshot via SoilWeb.

-
siblings(s, only.major=FALSE, component.data = FALSE, cousins = FALSE)
- +

Arguments

@@ -130,11 +134,10 @@

Arg

logical, should siblings-of-siblings (cousins) be returned?

- +

Details

The siblings of any given soil series are defined as those soil series (major and minor component) that share a parent map unit with the named series (as a major component). Cousins are siblings of siblings. Data are sourced from SoilWeb which maintains a copy of the current SSURGO snapshot.

-

Value

@@ -143,101 +146,94 @@

Value

sib.data

data.frame containing sibling component data

cousins

data.frame containing cousins, major component flag, and number of co-occurrences

cousin.data

data.frame containing cousin component data

- + + -

References

soilDB Soil Series Query Functionality

Related tutorial.

-

See also

OSDquery, siblings, fetchOSD

-

Examples

-
# basic usage +
# \donttest{ +# basic usage x <- siblings('zook') x$sib
#> series sibling majcompflag n -#> 1 zook Olmitz TRUE 14 +#> 1 zook Olmitz TRUE 15 #> 2 zook Vigar TRUE 12 -#> 3 zook Vesser TRUE 10 -#> 4 zook Ely TRUE 10 +#> 3 zook Ely TRUE 10 +#> 4 zook Vesser TRUE 10 #> 5 zook Excello TRUE 9 #> 6 zook Colo TRUE 8 #> 7 zook Nodaway TRUE 5 -#> 8 zook Zoe TRUE 3 -#> 9 zook Mt. Sterling TRUE 3 -#> 10 zook Kezan TRUE 2 -#> 11 zook Clamo TRUE 2 -#> 12 zook Humeston TRUE 1 -#> 13 zook Quiver TRUE 1 +#> 8 zook Mt. Sterling TRUE 3 +#> 9 zook Zoe TRUE 3 +#> 10 zook Clamo TRUE 2 +#> 11 zook Kezan TRUE 2 +#> 12 zook Quiver TRUE 1 +#> 13 zook Humeston TRUE 1 #> 14 zook Klum TRUE 1 #> 15 zook Wabash FALSE 64 -#> 16 zook Colo FALSE 53 +#> 16 zook Colo FALSE 56 #> 17 zook Chequest FALSE 46 -#> 18 zook Nodaway FALSE 43 +#> 18 zook Nodaway FALSE 44 #> 19 zook Humeston FALSE 27 -#> 20 zook Arbela FALSE 26 +#> 20 zook Arbela FALSE 23 #> 21 zook Ackmore FALSE 18 #> 22 zook Bremer FALSE 16 #> 23 zook Dockery FALSE 16 #> 24 zook Lamo FALSE 15 #> 25 zook Landes FALSE 13 -#> 26 zook Shell FALSE 12 -#> 27 zook Napa FALSE 12 -#> 28 zook Olmitz FALSE 11 +#> 26 zook Napa FALSE 12 +#> 27 zook Kennebec FALSE 12 +#> 28 zook Shell FALSE 12 #> 29 zook Coland FALSE 10 -#> 30 zook Kennebec FALSE 9 +#> 30 zook Olmitz FALSE 10 #> 31 zook Sawmill FALSE 9 #> 32 zook Vesser FALSE 7 #> 33 zook Nishna FALSE 5 -#> 34 zook Calco FALSE 2 -#> 35 zook Judson FALSE 2 -#> 36 zook Blackoar FALSE 2 -#> 37 zook Reading FALSE 2 +#> 34 zook Blackoar FALSE 5 +#> 35 zook Chase FALSE 2 +#> 36 zook Judson FALSE 2 +#> 37 zook Calco FALSE 2 #> 38 zook Quiver FALSE 2 -#> 39 zook Chase FALSE 2 -#> 40 zook Ely FALSE 1 -#> 41 zook Muir FALSE 1 -#> 42 zook Eudora FALSE 1 +#> 39 zook Reading FALSE 2 +#> 40 zook Toolesboro FALSE 1 +#> 41 zook Ely FALSE 1 +#> 42 zook Muir FALSE 1 #> 43 zook Clarinda FALSE 1 -#> 44 zook Floris FALSE 1 -#> 45 zook Toolesboro FALSE 1
+#> 44 zook Eudora FALSE 1
# restrict to siblings that are major components # e.g. the most likely siblings x <- siblings('zook', only.major = TRUE) x$sib
#> series sibling majcompflag n -#> 1 zook Olmitz TRUE 14 +#> 1 zook Olmitz TRUE 15 #> 2 zook Vigar TRUE 12 #> 3 zook Vesser TRUE 10 #> 4 zook Ely TRUE 10 #> 5 zook Excello TRUE 9 #> 6 zook Colo TRUE 8 #> 7 zook Nodaway TRUE 5 -#> 8 zook Mt. Sterling TRUE 3 -#> 9 zook Zoe TRUE 3 -#> 10 zook Kezan TRUE 2 -#> 11 zook Clamo TRUE 2 -#> 12 zook Quiver TRUE 1 -#> 13 zook Humeston TRUE 1 -#> 14 zook Klum TRUE 1
+#> 8 zook Zoe TRUE 3 +#> 9 zook Mt. Sterling TRUE 3 +#> 10 zook Clamo TRUE 2 +#> 11 zook Kezan TRUE 2 +#> 12 zook Humeston TRUE 1 +#> 13 zook Quiver TRUE 1 +#> 14 zook Klum TRUE 1
# }
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/simplfyFragmentData.html b/docs/reference/simplfyFragmentData.html index c684f5f7..70221fc8 100644 --- a/docs/reference/simplfyFragmentData.html +++ b/docs/reference/simplfyFragmentData.html @@ -8,21 +8,25 @@ Simplify Coarse Fraction Data — simplifyFragmentData • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,11 @@

Simplify Coarse Fraction Data

-

Simplify multiple coarse fraction (>2mm) records by horizon.

-
simplifyFragmentData(rf, id.var, nullFragsAreZero = TRUE)
- +

Arguments

@@ -126,41 +130,43 @@

Arg

should fragment volumes of NULL be interpreted as 0? (default: TRUE), see details

- +

Details

This function is mainly intended for the processing of NASIS pedon/horizon data which contains multiple coarse fragment descriptions per horizon. simplifyFragmentData will "sieve out" coarse fragments into the USDA classes, split into hard and para- fragments.

The simplifyFragmentData function can be applied to data sources other than NASIS by careful use of the id.var argument. However, rf must contain coarse fragment volumes in the column "fragvol", fragment size (mm) in columns "fragsize_l", "fragsize_r", "fragsize_h", and fragment cementation class in "fraghard".

There are examples in the KSSL data tutorial.
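The "sieving" idea can be sketched in base R with cut(). The class breaks and labels below are illustrative only; the function itself uses the full set of USDA size classes and also splits hard vs. para- fragments by cementation.

```r
# sketch: assign fragment sizes (mm) to illustrative USDA-style size classes
frag.size <- c(15, 90, 300, 700)

cls <- cut(
  frag.size,
  breaks = c(2, 75, 250, 600, 10000),
  labels = c('gravel', 'cobbles', 'stones', 'boulders')
)

# one class label per fragment record
data.frame(frag.size, cls)
```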

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/simplifyColorData.html b/docs/reference/simplifyColorData.html index 6d9a935c..b477cbf9 100644 --- a/docs/reference/simplifyColorData.html +++ b/docs/reference/simplifyColorData.html @@ -8,21 +8,25 @@ Simplify Color Data by ID — simplifyColorData • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,14 +109,12 @@

Simplify Color Data by ID

-

Simplify multiple Munsell color observations associated with each horizon.

-
simplifyColorData(d, id.var = "phiid", ...)
-mix_and_clean_colors(x, wt='pct', colorSpace='LAB', backTransform=FALSE)
- +mix_and_clean_colors(x, wt='pct', backTransform=FALSE)
+

Arguments

@@ -123,7 +127,7 @@

Arg

- + @@ -134,51 +138,50 @@

Arg

- - - - - +

character vector with the name of the column containing an ID that is unique among all horizons in d

...

further arguments passed on to mix_and_clean_colors(), see details

wt

a character vector with the name of the column containing color weights for mixing

colorSpace

a character vector with the name of color space in which mixing is performed ("LAB" or "sRGB")

backTransform

logical, should the mixed sRGB representation of soil color be transformed to closest Munsell chips?

logical, should the mixed sRGB representation of soil color be transformed to closest Munsell chips? This is performed by aqp::rgb2Munsell

- +

Details

-

This function is mainly intended for the processing of NASIS pedon/horizon data which may or may not contain multiple colors per horizon/moisture status combination. simplifyColorData will "mix" multiple colors associated with horizons in d, according to IDs specified by id.var, using "weights" (area percentages) specified by the wt argument to mix_and_clean_colors. Mixing is performed in the CIE LAB color space by default.

+

This function is mainly intended for the processing of NASIS pedon/horizon data which may or may not contain multiple colors per horizon/moisture status combination. simplifyColorData will "mix" multiple colors associated with horizons in d, according to IDs specified by id.var, using "weights" (area percentages) specified by the wt argument to mix_and_clean_colors.

+

Note that this function doesn't actually simulate the mixture of pigments on a surface; rather, "mixing" is approximated via a weighted average in the CIELAB colorspace.

The simplifyColorData function can be applied to data sources other than NASIS by careful use of the id.var and wt arguments. However, d must contain Munsell colors split into columns named "colorhue", "colorvalue", and "colorchroma". In addition, the moisture state ("Dry" or "Moist") must be specified in a column named "colormoistst".

The mix_and_clean_colors function can be applied to arbitrary data sources as long as x contains sRGB coordinates in columns named "r", "g", and "b". This function should be applied to chunks of rows within which color mixtures make sense.

There are examples in the KSSL data tutorial and the soil color mixing tutorial.
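The weighted-average "mixing" in CIELAB can be sketched with base R's convertColor(); this is a simplified stand-in for mix_and_clean_colors(), not the function itself, and the colors and weights are made up for illustration.

```r
# sketch: "mix" two sRGB colors via a weighted average in CIELAB
# sRGB coordinates are in the 0-1 range, as in columns "r", "g", "b"
rgb.in <- matrix(
  c(0.40, 0.30, 0.20,   # color 1
    0.60, 0.50, 0.40),  # color 2
  ncol = 3, byrow = TRUE
)
wt <- c(70, 30)  # area percentages, as in the 'pct' column

# convert to CIELAB, compute the weighted average of each coordinate
lab <- convertColor(rgb.in, from = 'sRGB', to = 'Lab')
mixed.lab <- colSums(lab * wt) / sum(wt)

# back to sRGB for display
mixed.rgb <- convertColor(matrix(mixed.lab, ncol = 3), from = 'Lab', to = 'sRGB')
```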

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/soilDB-package.html b/docs/reference/soilDB-package.html index b72de11a..ccabfefb 100644 --- a/docs/reference/soilDB-package.html +++ b/docs/reference/soilDB-package.html @@ -8,21 +8,25 @@ Soil Database Interface — soilDB-package • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,49 +109,48 @@

Soil Database Interface

-

This package provides methods for extracting soils information from local PedonPC and AK Site databases (MS Access format), local NASIS databases (MS SQL Server), and the SDA web service. Currently USDA-NCSS data sources are supported; however, there are plans to develop interfaces to outside systems such as the Global Soil Mapping project.

-
- + +

Details

It can be difficult to locate all of the dependencies required for sending/processing SOAP requests, especially on UNIX-like operating systems. Windows binary packages for the dependencies can be found here. See fetchPedonPC for a simple wrapper function that should suffice for typical site/pedon/hz queries. An introduction to the soilDB package can be found here.

-

See also

- +
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/uncode.html b/docs/reference/uncode.html index e58a7243..6cdbeb34 100644 --- a/docs/reference/uncode.html +++ b/docs/reference/uncode.html @@ -8,21 +8,25 @@ Convert coded values returned from NASIS and SDA queries to factors — uncode • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,17 +109,15 @@

Convert coded values returned from NASIS and SDA queries to factors

-

These functions convert the coded values returned from NASIS or SDA to factors (e.g. 1 = Alfisols) using the metadata tables from NASIS. For SDA the metadata is pulled from a static snapshot in the soilDB package (/data/metadata.rda).

-
uncode(df, invert = FALSE, db = "NASIS",
-       drop.unused.levels = FALSE,
-       stringsAsFactors = default.stringsAsFactors()
+       droplevels = FALSE,
+       stringsAsFactors = default.stringsAsFactors()
        )
 code(df, ...)
- +

Arguments

@@ -130,7 +134,7 @@

Arg

- + @@ -138,38 +142,40 @@

Arg

- +

label specifying the soil database the data is coming from, which indicates whether or not to query metadata from local NASIS database ("NASIS") or use soilDB-local snapshot ("LIMS" or "SDA")

drop.unused.levelsdroplevels

logical: indicating whether to drop unused levels in classifying factors. This is useful when a factor has a large number of unused levels, which can waste space in tables and figures.

logical: should character vectors be converted to factors? The 'factory-fresh' default is TRUE, but this can be changed by setting options(stringsAsFactors = FALSE)

...

arguments passed on to uncode

- +

Details

-

These functions convert the coded values returned from NASIS into their plaintext representation. The converted values from NASIS, or sourced from SDA, are upgraded to specifically-leveled factors using the metadata tables from NASIS. For SDA the metadata is pulled from a static snapshot in the soilDB package.

- +

These functions convert the coded values returned from NASIS into their plain text representation, duplicating the functionality of the CODELABEL function found in NASIS. They are primarily intended to be used internally by other soilDB R functions, in order to minimize the need to convert values manually.

+

The function works by iterating through the column names in a data frame and looking up whether they match any of the ColumnPhysicalNames found in the metadata domain tables. If matches are found, the columns' coded values are converted to their corresponding factor levels. Therefore it is not advisable to reuse column names from NASIS unless the contents match the range of values and format found in NASIS. Otherwise uncode() will convert their values to NA.

+

When data is being imported from NASIS, the metadata tables are sourced directly from NASIS. When data is being imported from SDA or the NASIS Web Reports, the metadata is pulled from a static snapshot in the soilDB package.

+

Beware: the default is to return the values as factors rather than strings. While strings are generally preferable, factors make plotting more convenient. Generally the factor level ordering returned by uncode() follows the natural ordering of categories that would be expected (e.g. sand, silt, clay).
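The core coded-value-to-factor idea can be sketched with a small lookup table. The codes and labels below are hypothetical; the real metadata comes from NASIS or the soilDB snapshot.

```r
# sketch: map coded values to factor levels via a lookup table
# (hypothetical codes/labels; real ones come from the NASIS metadata tables)
lookup <- data.frame(code = 1:3, label = c('sand', 'silt', 'clay'))

# coded values as might be returned by a query
x <- c(3, 1, 2, 2)

# factor levels follow the ordering defined by the lookup table
f <- factor(lookup$label[match(x, lookup$code)], levels = lookup$label)
f
```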

Value

-

A dataframe with the results.

- +

A data frame with the results.

Examples

-
# NOT RUN {
-# query component by nationalmusym
-comp = fetchSDA_component(WHERE = "nationalmusym = '2vzcp'")
-s = site(comp$spc)
-s = uncode(s, NASIS = FALSE)
-levels(s$taxorder)
-# }
+
# \donttest{ +if(require(aqp)) { + # query component by nationalmusym + comp <- fetchSDA(WHERE = "nationalmusym = '2vzcp'") + s <- site(comp) + + # use SDA uncoding domain via db argument + s <- uncode(s, db="SDA") + levels(s$taxorder) +}
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> single result set, returning a data.frame
#> NULL
# } +
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/us_ss_timeline-1.png b/docs/reference/us_ss_timeline-1.png new file mode 100644 index 00000000..3b88d019 Binary files /dev/null and b/docs/reference/us_ss_timeline-1.png differ diff --git a/docs/reference/us_ss_timeline.html b/docs/reference/us_ss_timeline.html index 002e5fdd..340c7e3f 100644 --- a/docs/reference/us_ss_timeline.html +++ b/docs/reference/us_ss_timeline.html @@ -8,21 +8,25 @@ Timeline of US Published Soil Surveys — us_ss_timeline • soilDB + + - + + - - + + + @@ -30,13 +34,14 @@ - + + @@ -47,6 +52,7 @@ + @@ -63,7 +69,7 @@ soilDB - 2.3.9 + 2.5 @@ -71,7 +77,7 @@ - @@ -92,6 +97,7 @@ +
@@ -103,13 +109,12 @@

Timeline of US Published Soil Surveys

-

This dataset contains the year in which each US Soil Survey was published.

-
-
data("us_ss_timeline")
- +
data("us_ss_timeline")
+ +

Format

A data frame with 5209 observations on the following 5 variables.

@@ -117,93 +122,150 @@

Formatyear

year of publication, a numeric vector

pdf

does a pdf exists, a logical vector

state

State abbreviation, a character vector

-

- + + +

Details

These data were web scraped from the NRCS Soils Website. The scraping procedure and an example plot are included in the examples section below.

-

Source

https://www.nrcs.usda.gov/wps/portal/nrcs/soilsurvey/soils/survey/state/

-

Examples

-
# NOT RUN {
-library(XML)
-library(RCurl)
-library(ggplot2)
+    
# \donttest{ -data(state) -st <- paste0(c(state.abb, "PR", "DC", "VI", "PB")) +if ( + require("XML") & + require("RCurl") & + require("ggplot2") & + require("gridExtra") +) { + +data(state) +st <- paste0(c(state.abb, "PR", "DC", "VI", "PB")) us_ss_timeline <- { - lapply(st, function(x) { - cat("getting", x, "\n") - url <- getURL(paste0( - "https://www.nrcs.usda.gov/wps/portal/nrcs/surveylist/soils/survey/state/?stateId=", x) + lapply(st, function(x) { + cat("getting", x, "\n") + url <- getURL(paste0( + "https://www.nrcs.usda.gov/wps/portal/nrcs/surveylist/soils/survey/state/?stateId=", x) ) - df <- readHTMLTable(url, which = 22, stringsAsFactors = FALSE) + df <- readHTMLTable(url, which = 22, stringsAsFactors = FALSE) df$state <- x - return(df) - }) ->.; - do.call("rbind", .) ->.; - names(.) <- c("ssa", "year", "pdf", "wss", "state") - .[.$year != "current", ] ->.; - } -us_ss_timeline <- within(us_ss_timeline, { - ssa = sapply(ssa, function(x) strsplit(x, "\r")[[1]][1]) - year = as.numeric(year) - pdf = ifelse(pdf == "Yes", TRUE, FALSE) + return(df) + }) ->.; + do.call("rbind", .) ->.; + names(.) 
<- c("ssa", "year", "pdf", "wss", "state") + .[!grepl(.$year, pattern="current"), ] ->.; +} +us_ss_timeline <- within(us_ss_timeline, { + ssa = sapply(ssa, function(x) strsplit(x, "\r")[[1]][1]) + year = as.numeric(substr(year, 3,6)) + pdf = ifelse(pdf == "Yes", TRUE, FALSE) wss = NULL - }) +}) -test <- as.data.frame(table(us_ss_timeline$year), stringsAsFactors = FALSE) +test <- as.data.frame(table(us_ss_timeline$year), stringsAsFactors = FALSE) -g1 <- ggplot(test, aes(x = as.numeric(Var1), y = Freq)) + +g1 <- ggplot(data = test, aes(x = Var1, y = Freq)) + geom_histogram(stat = "identity") + xlab("Year") + ylab("Count") + theme(aspect.ratio = 1) + ggtitle("Number of Published \n US Soil Surveys by Year") -g2 <- ggplot(test, aes(x = as.numeric(Var1), y = cumsum(Freq))) + +g2 <- ggplot(test, aes(x = Var1, y = cumsum(Freq))) + geom_histogram(stat = "identity") + xlab("Year") + ylab("Count") + theme(aspect.ratio = 1) + ggtitle("Cumulative Number of Published \n US Soil Surveys by Year") -gridExtra::grid.arrange(g1, g2, ncol = 2) -# }
+grid.arrange(g1, g2, ncol = 2) + +}# }
#> Loading required package: XML
#> Warning: package 'XML' was built under R version 3.5.3
#> Loading required package: RCurl
#> Warning: package 'RCurl' was built under R version 3.5.2
#> Loading required package: bitops
#> Warning: package 'bitops' was built under R version 3.5.2
#> getting AL +#> getting AK +#> getting AZ +#> getting AR +#> getting CA +#> getting CO +#> getting CT +#> getting DE +#> getting FL +#> getting GA +#> getting HI +#> getting ID +#> getting IL +#> getting IN +#> getting IA +#> getting KS +#> getting KY +#> getting LA +#> getting ME +#> getting MD +#> getting MA +#> getting MI +#> getting MN +#> getting MS +#> getting MO +#> getting MT +#> getting NE +#> getting NV +#> getting NH +#> getting NJ +#> getting NM +#> getting NY +#> getting NC +#> getting ND +#> getting OH +#> getting OK +#> getting OR +#> getting PA +#> getting RI +#> getting SC +#> getting SD +#> getting TN +#> getting TX +#> getting UT +#> getting VT +#> getting VA +#> getting WA +#> getting WV +#> getting WI +#> getting WY +#> getting PR +#> getting DC +#> getting VI +#> getting PB
#> Warning: Ignoring unknown parameters: binwidth, bins, pad
#> Warning: Ignoring unknown parameters: binwidth, bins, pad
+
-

Site built with pkgdown 1.3.0.

+

Site built with pkgdown 1.4.1.

+
+ + diff --git a/docs/reference/waterDayYear.html b/docs/reference/waterDayYear.html new file mode 100644 index 00000000..28179a26 --- /dev/null +++ b/docs/reference/waterDayYear.html @@ -0,0 +1,184 @@ + + + + + + + + +Compute Water Day and Year — waterDayYear • soilDB + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + + + +
+ +
+
+ + +
+

Compute "water" day and year, based on the end of the typical or legal dry season. This is September 30 in California.

+
+ +
waterDayYear(d, end = "09-30")
+ +

Arguments

+ + + + + + + + + + +
d

anything that can be safely converted to POSIXlt

end

"MM-DD" notation for end of water year

+ +

Details

+ +

This function doesn't know about leap years; results in leap years are probably worth checking.
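The computation can be sketched in base R with Date arithmetic, which does account for leap years. This is a sketch only, not the package implementation, with the "MM-DD" year end hard-coded to the 09-30 default.

```r
# sketch: water year and water day for a single date, with a 09-30 year end
d <- as.Date('2019-01-01')

yr <- as.integer(format(d, '%Y'))

# dates after this calendar year's end date belong to the next water year
end.this <- as.Date(paste0(yr, '-09-30'))
wy <- if (d > end.this) yr + 1 else yr

# water day 1 is the day after the previous water year's end (Oct 1)
wd <- as.integer(d - as.Date(paste0(wy - 1, '-09-30')))

c(wy = wy, wd = wd)  # 2019, 93
```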

+

Value

+ +

A data.frame object with the following

+
wy

the "water year"

+
wd

the "water day"

+ +

References

+ +

Ideas borrowed from: +https://github.com/USGS-R/dataRetrieval/issues/246 and +https://stackoverflow.com/questions/48123049/create-day-index-based-on-water-year

+ +

Examples

+
# try it +waterDayYear('2019-01-01')
#> wy wd +#> 1 2019 93
+
+ +
+ + + +
+ + + + + + + +