As a developer, I want documentation to help me learn how to contribute to the WRES codebase #365

Open
HankHerr-NOAA opened this issue Nov 21, 2024 · 1 comment
Assignees
Labels
documentation Improvements or additions to documentation

Comments

@HankHerr-NOAA
Contributor

Feel free to reword the title as needed. As discussed last week, we need documentation to help guide folks in contributing to the WRES. At a minimum, it should be something high-level to serve as an entry point into understanding the code base. It may not need to be much more than that; it's unclear at this time.

The current developer wiki only discusses editing wikis:

https://github.com/NOAA-OWP/wres/wiki/Instructions-for-Developers

This ticket is about fleshing that out a bit more.

Hank

@HankHerr-NOAA
Contributor Author

HankHerr-NOAA commented Mar 4, 2025

Doing some of the work toward this ticket with a slight rewrite of the README.md. A first draft of my changes is below. This is a bit rushed; I'll take a closer look at it tomorrow and clean it up some more. In the meantime, if anyone sees something clearly missing, please let me know. I think the README should provide starter information; more detailed developer documentation should live elsewhere, perhaps in the wiki, and that material would be written at a later date.

More tomorrow. Have a great one!

Hank

==========


Water Resources Evaluation Service (WRES)

The Water Resources Evaluation Service (WRES) is a comprehensive service for evaluating the quality of model predictions, such as hydrometeorological forecasts. The WRES encapsulates a data-to-statistics evaluation pipeline, including reading data from files or web services in various formats, rescaling data, changing measurement units, filtering data, pairing predictions and observations, allocating pairs to pools based on pooling criteria (e.g., common forecast lead times), computing statistics, and writing the statistics to various formats. It can support relatively small evaluations with few features or time series using an in-memory data model for quick performance, or particularly large evaluations with data for many locations and long periods of time using a backing database, such as Postgres.

As described in the wiki, there are three modes of operation for the WRES: (1) "cluster mode" using a web-service instance; (2) "standalone mode" using a short-running instance; and (3) "standalone mode" using a long-running, local-server instance. This repository supports execution of the WRES through any of those three modes, and the libraries comprising the WRES can be broadly broken down into two categories:

  • The core WRES software. The core WRES is the software that performs an evaluation. It parses an evaluation declaration, acquires and pairs the data, creates the statistics and writes the statistics to formats. It can be run as a standalone, executed from the command line (useful during development), but is typically executed by users through the COWRES web-service front-end.
  • The WRES web service layer. The service software wraps the core WRES inside of a web-service deployment. The service layer includes the following modules: wres-tasker, wres-worker, wres-redis, and wres-broker.

While the service layer is only necessary for cluster mode, the core WRES is required for all modes of operation.
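The categories above correspond to Gradle subprojects in this repository. As a quick way to orient yourself in the code base, the standard Gradle task below (not specific to the WRES) lists the subprojects once the repository has been cloned, as described in the next section:

./gradlew projects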

The sections below describe how to obtain the WRES software, build it locally, and execute it as a standalone, including a small example to exercise the software.

How can I obtain the software?

The WRES software is obtained by cloning this repository. No other repository is needed, and dependencies will be obtained automatically using Gradle.
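For example, using Git over HTTPS:

git clone https://github.com/NOAA-OWP/wres.git
cd wres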

How can I build the software?

Gradle is used to build the software and release packages, and will automatically gather the version of Java and the dependencies needed to build the software. Once the repository is cloned, build and unit test the software by running this command:

./gradlew build

To install the WRES locally for standalone use, including Javadoc, run the following command:

./gradlew check javadoc installDist

This will produce a release distribution in the ./build/install/wres directory relative to the root directory of your repository.
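As a rough sketch, the installed distribution follows the standard Gradle installDist layout, with launch scripts in bin and jars in lib; the exact contents may vary between releases:

build/install/wres/
    bin/    launch scripts, including wres and wres.bat
    lib/    the WRES jars and their third-party dependencies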

How can I run the WRES using a standalone?

To run the WRES, you will need a recent version of the Java Runtime Environment (JRE) installed. To check whether you have an appropriate JRE installed locally, you can examine the result of the following command:

java -version

If this reports a version of 17 or greater, you can execute the WRES. Otherwise, you will need to install an appropriate JRE.
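For example, a suitable installation might report something like the following, where the vendor and build details will differ from one installation to the next:

openjdk version "17.0.9" ...
OpenJDK Runtime Environment ...
OpenJDK 64-Bit Server VM ...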

To execute an evaluation, you can run the following command on a Linux-like operating system from within the ./build/install/wres directory:

bin/wres myEvaluation.yml

On a Windows-like operating system, you can execute the following command:

bin/wres.bat myEvaluation.yml

Here, myEvaluation.yml is the file that declares your evaluation; see the Declaration language wiki for more information.

How can I create a simple example evaluation?

Do the following within the ./build/install/wres directory:

  1. Create a file predictions.csv with the following content:

     value_date,variable_name,location,measurement_unit,value
     1985-06-01T13:00:00Z,streamflow,myLocation,CMS,21.0
     1985-06-01T14:00:00Z,streamflow,myLocation,CMS,22.0

  2. Create a file observations.csv with the following content:

     value_date,variable_name,location,measurement_unit,value
     1985-06-01T13:00:00Z,streamflow,myLocation,CMS,23.0
     1985-06-01T14:00:00Z,streamflow,myLocation,CMS,25.0

  3. Create a file myEvaluation.yml with the following content, adjusting the paths to reference the files you created (if you created the files inside the bin directory, no changes are needed):

     observed: observations.csv
     predicted: predictions.csv

  4. Execute the evaluation as follows:

     bin/wres myEvaluation.yml

By default, the results of the evaluation will be written to the user's temporary directory. The paths to the files should be reported on the console. For example:

Wrote 2 paths to foo.user/temp/wres_evaluation_7woOxSGA-AEvyg3eNSS_j9Jj9Hc

Running Against the Latest Release

To run against the latest official release of the WRES, do the following:

  1. Navigate to the latest release: https://github.com/NOAA-OWP/wres/releases/latest.

  2. Download the core .zip from the assets for that latest release. The file to obtain should follow this naming convention: wres-DATE-VERSION.zip

  3. Unzip the core .zip and navigate into the folder that is created. For example:

cd wres-DATE-VERSION/

  4. Execute your evaluation:

bin/wres myEvaluation.yml
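On a Linux-like operating system, steps 3 and 4 might look like the following in a shell, assuming the zip was downloaded to the current working directory (substitute the actual DATE and VERSION):

unzip wres-DATE-VERSION.zip
cd wres-DATE-VERSION/
bin/wres myEvaluation.yml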

Open source licensing info

  1. TERMS
  2. LICENSE

@HankHerr-NOAA HankHerr-NOAA self-assigned this Mar 4, 2025
@HankHerr-NOAA HankHerr-NOAA modified the milestone: v6.30 Mar 4, 2025