Title: | Interface to the 'Daymet' Web Services |
---|---|
Description: | Programmatic interface to the 'Daymet' web services (<http://daymet.ornl.gov>). Allows for easy downloads of 'Daymet' climate data directly to your R workspace or your computer. Routines for both single pixel data downloads and gridded (netCDF) data are provided. |
Authors: | Koen Hufkens [aut, cre], BlueGreen Labs [cph, fnd] |
Maintainer: | Koen Hufkens <[email protected]> |
License: | AGPL-3 |
Version: | 1.7.1 |
Built: | 2024-12-30 03:35:02 UTC |
Source: | https://github.com/bluegreen-labs/daymetr |
Function to count the number of days in a given time period that meet a given set of criteria. This can be used to extract indices such as Growing Degree Days (tmin > 0), or days with precipitation (prcp != 0).
calc_nd( file, start_doy = 1, end_doy = 365, criteria, value, internal = FALSE, path = tempdir() )
file | path of a file containing the daily gridded Daymet data
start_doy | numeric day-of-year at which counting should begin (default = 1)
end_doy | numeric day-of-year at which counting should end (default = 365)
criteria | logical expression (">=", ">", "<=", "<", "==", "!=") to evaluate
value | the value that the criteria is evaluated against
internal | logical; return the data to the R workspace (TRUE) or write it to disk (FALSE) (default = FALSE)
path | path to which to write data to disk (default = tempdir())
A raster object in the R workspace, or a file on disk, with summary statistics for every pixel that meets the predefined criteria. Output files, if written to disk, will be named nd_YYYY.tif (with YYYY the year of the processed tile or NCSS netCDF file).
## Not run: 
# download daily gridded data
# using default settings (data written to tempdir())
download_daymet_ncss()

# read in the Daymet file and report back the number
# of days in a year with a minimum temperature lower
# than 15 degrees C
r <- calc_nd(file.path(tempdir(), "tmin_daily_1980_ncss.nc"),
             criteria = "<",
             value = 15,
             internal = TRUE)

# plot the output
terra::plot(r)
## End(Not run)
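As a further illustration, a minimal sketch counting days with a minimum temperature above 0 degrees C, reusing the tmin_daily_1980_ncss.nc file downloaded above; the DOY 91-273 window is illustrative only, not part of the package defaults.

# count days with tmin > 0 deg C between DOY 91 and 273
# (illustrative window; adjust to your own season of interest)
warm_days <- calc_nd(
  file.path(tempdir(), "tmin_daily_1980_ncss.nc"),
  start_doy = 91,
  end_doy = 273,
  criteria = ">",
  value = 0,
  internal = TRUE
)

# map the resulting day counts
terra::plot(warm_days)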
Aggregates daily Daymet data by time interval to create convenient seasonal datasets for data exploration or modelling.
daymet_grid_agg( file, int = "seasonal", fun = "mean", internal = FALSE, path = tempdir() )
file | the name of the file to be processed; use daily gridded Daymet data
int | interval to aggregate by; options are "monthly", "seasonal" or "annual". Seasons are defined as the astronomical seasons between solstices and equinoxes (default = "seasonal")
fun | function used to aggregate the data; generic R functions can be used, "mean" and "sum" are suggested; na.rm = TRUE by default (default = "mean")
internal | logical; if FALSE, write the output to a geotiff file using the Daymet file format protocol (default = FALSE)
path | path to a directory where output files should be written; used only if internal = FALSE (default = tempdir())
Aggregated daily Daymet data, written to disk as a geotiff file or returned to the workspace as a raster stack.
## Not run: 
# This code calculates the average minimum temperature by
# season for a subset region.

# download the default ncss subset for 1980
# (daily tmin values only); this works on tiles as well
download_daymet_ncss()

# run the aggregation
daymet_grid_agg(
  file = file.path(tempdir(), "tmin_daily_1980_ncss.nc"),
  int = "seasonal",
  fun = "mean"
)
## End(Not run)
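A comparable sketch for monthly precipitation totals; it assumes the downloaded file follows the prcp_daily_1980_ncss.nc naming pattern seen in the tmin examples, which may differ in your setup.

# download daily precipitation for the default bounding box
download_daymet_ncss(param = "prcp", path = tempdir())

# aggregate daily values to monthly totals and keep
# the result in the workspace
prcp_monthly <- daymet_grid_agg(
  file = file.path(tempdir(), "prcp_daily_1980_ncss.nc"),
  int = "monthly",
  fun = "sum",
  internal = TRUE
)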
Returns an offset dataset with data running from the offset DOY in the previous year to the offset DOY in the current year. Two years of data (730 data layers) are required for this function to work. The output serves as input for further data processing and/or ecosystem modelling efforts.
daymet_grid_offset(data, offset = 264)
data | rasterStack or rasterBrick of 730 layers (2 consecutive years)
offset | offset of the time series in DOY (default = 264, i.e. 21 September)
## Not run: 
my_subset <- daymet_grid_offset(mystack, offset = 264)
## End(Not run)
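Because the usage above is terse, the sketch below shows how a 730-layer input could be assembled from two consecutive years of gridded tmin data. It assumes the tmin_daily_<year>_ncss.nc naming pattern used elsewhere in this documentation and that a terra raster stack is accepted in place of a rasterStack/rasterBrick; adjust if your setup differs.

library(daymetr)
library(terra)

# download two consecutive years of daily tmin for the default region
download_daymet_ncss(start = 1980, end = 1981, param = "tmin",
                     path = tempdir())

# combine both years into a single 730-layer stack
tmin_stack <- c(
  terra::rast(file.path(tempdir(), "tmin_daily_1980_ncss.nc")),
  terra::rast(file.path(tempdir(), "tmin_daily_1981_ncss.nc"))
)

# offset the series so it runs from DOY 264 in 1980 to DOY 263 in 1981
tmin_offset <- daymet_grid_offset(tmin_stack, offset = 264)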
Combines minimum and maximum temperature data into a single mean daily temperature (tmean) gridded output (geotiff) for easy post-processing and modelling. Optionally, a raster object is returned to the current workspace.
daymet_grid_tmean(path = tempdir(), product, year, internal = FALSE)
path | full path location of the daymet tiles (default = tempdir())
product | either a tile number or an NCSS product name
year | which year to process
internal | logical; return the data to the R workspace (TRUE) or write it to disk (FALSE) (default = FALSE)
## Not run: 
# This code calculates the mean temperature
# for all daymet tiles in a user provided
# directory. In this example we first
# download tile 11935 for tmin and tmax.

# download a tile
download_daymet_tiles(tiles = 11935,
                      start = 1980,
                      end = 1980,
                      param = c("tmin","tmax"),
                      path = tempdir())

# calculate the mean temperature and export
# the result to the R workspace (internal = TRUE).
# If internal = FALSE, a file tmean_11935_1980.tif
# is written into the source path.
tmean <- daymet_grid_tmean(path = tempdir(),
                           product = 11935,
                           year = 1980,
                           internal = TRUE)
## End(Not run)
Function to download single location 'Daymet' data
download_daymet( site = "Daymet", lat = 36.0133, lon = -84.2625, start = 2000, end = as.numeric(format(Sys.time(), "%Y")) - 2, path = tempdir(), internal = TRUE, silent = FALSE, force = FALSE, simplify = FALSE )
site | the site name
lat | latitude (decimal degrees)
lon | longitude (decimal degrees)
start | start of the range of years over which to download data
end | end of the range of years over which to download data
path | set the path where to save the data if internal = FALSE (default = tempdir())
internal | logical; return the data to the R workspace (TRUE) or write it to a csv file on disk (FALSE) (default = TRUE)
silent | suppress the verbose output (default = FALSE)
force | logical; override the conservative end-year setting (default = FALSE)
simplify | output data as a tibble, logical (default = FALSE)
Daymet data for a point location, returned to the R workspace or written to disk as a csv file.
## Not run: 
# The following commands download and process Daymet data
# for 10 years of the >30 years of data available since 1980.
daymet_data <- download_daymet(
  "testsite_name",
  lat = 36.0133,
  lon = -84.2625,
  start = 2000,
  end = 2010,
  internal = TRUE
)

# We can now quickly calculate and plot
# daily mean temperature. Also, take note of
# the unusual format of the header. This format
# is not altered to keep compatibility
# with other ways of acquiring Daymet data
# through the ORNL DAAC website.

# The command below lists the elements of
# the downloaded nested list.
# This data includes information on the site
# location etc. The true climate data is stored
# in the "data" part of the nested list.
# In this case it can be accessed through
# daymet_data$data. Other attributes include
# for example the tile location (daymet_data$tile),
# the altitude (daymet_data$altitude), etc.
str(daymet_data)

# load the tidyverse (install if necessary)
if(!require("tidyverse")){install.packages("tidyverse")}
library(tidyverse)

# Calculate the mean temperature from the min and
# max temperatures and convert the year and doy
# to a proper date format.
daymet_data$data <- daymet_data$data |>
  mutate(
    tmean = (tmax..deg.c. + tmin..deg.c.)/2,
    date = as.Date(paste(year, yday, sep = "-"), "%Y-%j")
  )

# show a simple graph of the mean temperature
plot(daymet_data$data$date,
     daymet_data$data$tmean,
     xlab = "Date",
     ylab = "mean temperature")

# For other practical examples consult the included
# vignette.
## End(Not run)
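Where tidy data are preferred, simplify = TRUE (documented above) returns the point data as a tibble rather than a nested list. A minimal sketch follows; the exact column layout of the tibble may differ between package versions.

# download the same site as a tidy tibble
daymet_tidy <- download_daymet(
  site = "testsite_name",
  lat = 36.0133,
  lon = -84.2625,
  start = 2000,
  end = 2010,
  internal = TRUE,
  simplify = TRUE
)

# inspect the first rows
head(daymet_tidy)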
This function downloads 'Daymet' data for several single-pixel locations, as specified in a batch file.
download_daymet_batch( file_location = NULL, start = 1980, end = as.numeric(format(Sys.time(), "%Y")) - 1, internal = TRUE, force = FALSE, silent = FALSE, path = tempdir(), simplify = FALSE )
file_location | file with several site locations and coordinates in a comma delimited format: site, latitude, longitude
start | start of the range of years over which to download data
end | end of the range of years over which to download data
internal | TRUE or FALSE; load the data into the workspace or save it to disk
force | logical; override the conservative end-year setting (default = FALSE)
silent | suppress the verbose output (default = FALSE)
path | set the path where to save the data if internal = FALSE (default = tempdir())
simplify | output data as a tibble, logical (default = FALSE)
Daymet data for point locations as a nested list or data written to csv files
## Not run: 
# The download_daymet_batch() routine is a wrapper around
# the download_daymet() function. It queries a file with
# coordinates to easily download a large batch of daymet
# pixel locations. When internal = TRUE, the data is stored
# in a structured list in an R variable. If FALSE, the data
# is written to disk.

# create demo locations (two sites)
locations <- data.frame(site = c("site1", "site2"),
                        lat = rep(36.0133, 2),
                        lon = rep(-84.2625, 2))

# write data to csv file
write.table(locations,
            paste0(tempdir(), "/locations.csv"),
            sep = ",",
            col.names = TRUE,
            row.names = FALSE,
            quote = FALSE)

# download data, will return nested list of daymet data
df_batch <- download_daymet_batch(
  file_location = paste0(tempdir(), "/locations.csv"),
  start = 1980,
  end = 1980,
  internal = TRUE,
  silent = TRUE
)

# For other practical examples consult the included
# vignette.
## End(Not run)
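The same batch call can return a tibble by setting simplify = TRUE (see the simplify argument above). A minimal sketch reusing the locations.csv file created in the example; the layout of the returned tibble may differ between package versions.

# download all sites as one tidy tibble
df_tidy <- download_daymet_batch(
  file_location = file.path(tempdir(), "locations.csv"),
  start = 1980,
  end = 1980,
  internal = TRUE,
  silent = TRUE,
  simplify = TRUE
)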
Function to geographically subset 'Daymet' regions exceeding tile limits
download_daymet_ncss( location = c(34, -82, 33.75, -81.75), start = 1980, end = 1980, param = "tmin", frequency = "daily", mosaic = "na", path = tempdir(), silent = FALSE, force = FALSE, ssl = TRUE )
location | bounding box c(lat, lon, lat, lon) defined by top-left and bottom-right coordinates
start | start of the range of years over which to download data
end | end of the range of years over which to download data
param | climate variable to download: vapour pressure (vp), minimum and maximum temperature (tmin, tmax), snow water equivalent (swe), solar radiation (srad), precipitation (prcp), day length (dayl). Use "ALL" to download all of the listed variables.
frequency | frequency of the data requested (default = "daily", other options are "monthly" or "annual")
mosaic | which tile mosaic to source from (na = North America, hi = Hawaii, pr = Puerto Rico), defaults to "na"
path | directory in which to store the downloaded data (default = tempdir())
silent | suppress the verbose output (default = FALSE)
force | logical (default = FALSE)
ssl | logical (default = TRUE)
netCDF data file of an area circumscribed by the location bounding box
## Not run: 
# The following call allows you to subset gridded
# Daymet data using a bounding box location. This
# is an alternative way to query gridded data. The
# routine is particularly helpful if you need data
# which straddles the boundaries of multiple tiles,
# or a smaller subset of a larger tile. Keep in mind
# that there is a 6 GB upper limit on the output file,
# so querying larger regions will result in an error.
# To download larger areas use the download_daymet_tiles()
# function.

# Download a subset of one or multiple tiles
# into the temporary directory.
download_daymet_ncss(location = c(34, -82, 33.75, -81.75),
                     start = 1980,
                     end = 1980,
                     param = "tmin",
                     path = tempdir())

# For other practical examples consult the included
# vignette.
## End(Not run)
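The frequency argument documented above also allows monthly or annual aggregates to be requested instead of daily data. A minimal sketch for monthly minimum temperature over the same bounding box; note that the resulting file name will differ from the daily example.

# download monthly aggregated minimum temperature
# for the same bounding box
download_daymet_ncss(
  location = c(34, -82, 33.75, -81.75),
  start = 1980,
  end = 1980,
  param = "tmin",
  frequency = "monthly",
  path = tempdir()
)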
Function to batch download gridded 'Daymet' data tiles
download_daymet_tiles( location = c(18.9103, -114.6109), tiles, start = 1980, end = 1980, path = tempdir(), param = "ALL", silent = FALSE, force = FALSE )
location | location of a point c(lat, lon) or a bounding box c(lat, lon, lat, lon) defined by top-left and bottom-right coordinates
tiles | which tiles to download; overrides the geographic constraints
start | start of the range of years over which to download data
end | end of the range of years over which to download data
path | where the downloaded tiles should be stored (default = tempdir())
param | climate variable to download: vapour pressure (vp), minimum and maximum temperature (tmin, tmax), snow water equivalent (swe), solar radiation (srad), precipitation (prcp), day length (dayl). The default setting is "ALL", which downloads all of the listed variables.
silent | suppress the verbose output (default = FALSE)
force | logical (default = FALSE)
downloads netCDF tiles as defined by the Daymet tile grid
## Not run: 
# Download a single tile of minimum temperature
download_daymet_tiles(location = c(18.9103, -114.6109),
                      start = 1980,
                      end = 1980,
                      param = "tmin")

# For other practical examples consult the included
# vignette.
## End(Not run)
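Tiles can also be requested directly by tile number, which overrides the geographic constraint (see the tiles argument above). A minimal sketch reusing tile number 11935 from the daymet_grid_tmean example:

# request a tile by number rather than by coordinates
download_daymet_tiles(
  tiles = 11935,
  start = 1980,
  end = 1980,
  param = c("tmin", "tmax"),
  path = tempdir()
)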
Converts downloaded netCDF (.nc) data to geotiff (.tif) to simplify workflows when the data is to be handled in other software (e.g. QGIS).
nc2tif(path = tempdir(), files = NULL, overwrite = FALSE, silent = FALSE)
path | a character string showing the path to the directory containing the Daymet .nc files (default = tempdir())
files | a character vector containing the names of one or more files to be converted (optional)
overwrite | a logical controlling whether files that already have a .tif counterpart will be overwritten (default = FALSE)
silent | limit verbose output (default = FALSE)
Converted geotiff files of all netCDF data in the provided directory (path).
## Not run: 
# The command below converts all netCDF data in
# the provided path to geotiff files. With
# overwrite = TRUE existing files are overwritten;
# if set to FALSE, existing files are skipped.

# download the data
download_daymet_ncss(param = "tmin",
                     frequency = "annual",
                     path = tempdir(),
                     silent = TRUE)

# convert files from nc to tif
nc2tif(path = tempdir(),
       overwrite = TRUE)

# print converted files
print(list.files(tempdir(), "*.tif"))
## End(Not run)
Reads Single Pixel Daymet data into a nested list or tibble, preserving header data and critical file name information.
read_daymet(file, site, skip_header = FALSE, simplify = TRUE)
file | a Daymet Single Pixel data file
site | a site name
skip_header | do not ingest header meta-data, logical (default = FALSE)
simplify | output tidy data (tibble), logical (default = TRUE)
A nested data structure including site meta-data, the full header and the data as a 'data.frame()'.
## Not run: 
# download the data
download_daymet(
  site = "Daymet",
  start = 1980,
  end = 1980,
  internal = FALSE,
  silent = TRUE
)

# read in the Daymet file
df <- read_daymet(paste0(tempdir(), "/Daymet_1980_1980.csv"))

# print the data structure
print(str(df))
## End(Not run)
Large simple feature collection containing the outlines of all the Daymet tiles available as well as projection information. This data was converted from a shapefile as provided on the Daymet main website.
tile_outlines
Format: a SpatialPolygonsDataFrame with fields for the tile ID number, minimum longitude, maximum longitude, minimum latitude and maximum latitude of each tile.
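A minimal sketch for inspecting and plotting the tile outlines; it assumes the object loads as a simple feature (sf) collection, as stated in the description above, so the sf package is required. Adjust if your installed version returns a different spatial class.

library(daymetr)
library(sf)

# inspect the attribute table of the tile outlines
# (assumes an sf object; see the note above)
head(tile_outlines)

# plot the tile geometries only
plot(sf::st_geometry(tile_outlines))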