diff --git a/README.md b/README.md index 52f9fb9..b1e4b2d 100644 --- a/README.md +++ b/README.md @@ -21,12 +21,14 @@ ### What is Unity SDS? + Quite simply, an SDS (Science Data System) is an orchestrated set of networked compute and storage resources that is adapted to process science data through a pipeline. As described by [Hua et al. [2022]](#1): > Science Data Systems (SDSes) provide the capability to develop, test, process, and analyze instrument observational data efficiently, systematically, and at large scales. SDSes ingest the raw satellite instrument observations and process them from low‐level instrument values into higher level observational measurement values that compose the science data products. The [Unity SDS](https://github.com/unity-sds) is an implementation of an SDS by the Unity project at NASA Jet Propulsion Laboratory. ### What are triggers? + Trigger events are events that could potentially kick off processing in an SDS. Examples of trigger events are: 1. A raw data file is deposited into a location e.g. an S3 bucket or a local directory. @@ -43,9 +45,10 @@ These are just an initial subset of the different types of trigger events and th Trigger events by themselves don't automatically mean that SDS processing is ready to proceed. That's what evaluators are for. ### What are evaluators? + As described by [Hua et al. [2022]](#1): > A fundamental capability of an SDS is to systematically process science data through a series of data transformations from raw instrument data to geophysical measurements. Data are first made available to the SDS from GDS to be processed to higher level data products. The data transformation steps may utilize ancillary and auxiliary files as well as production rules that stipulate conditions for when each step should be executed. In an SDS, evaluators are functions (irrespective of how they are deployed and called) that perform adaptation-specific evaluation to determine if the next step in the processing pipeline is ready for execution. As an example, the following shows the input-output diagram for the NISAR L-SAR L0B PGE (a.k.a. science algorithm): @@ -72,6 +76,7 @@ The following screenshot shows examples of both of these interfaces: It is the responsibility of the initiator to perform the routing of triggers to their respective evaluators. ### What is the Unity initiator? + The Unity initiator is the set of compute resources that enable the routing of trigger events to their respective evaluators. It is agnostic of the trigger event source and agnostic of the adaptation-specific evaluator code. It is completely driven by configuration (a.k.a. router configuration YAML). The following screenshot shows the current architecture for the initiator: ![initiator](https://github.com/unity-sds/unity-initiator/assets/387300/74f7c2cb-8542-4ad8-9212-e720077373c0) @@ -79,6 +84,7 @@ The Unity initiator is the set of compute resources that enable the routing of t The initiator topic, an SNS topic, is the common interface that all triggers will submit events to. The initiator topic is subscribed to by the initiator SQS queue (complete with dead-letter queue for resiliency) which in turn is subscribed to by the router Lambda function. How the router Lambda routes payloads of the trigger events is defined by the router configuration YAML.
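To make that interface concrete, a trigger only needs to publish a small JSON payload to the initiator topic. Below is a minimal sketch using boto3; the topic ARN is a placeholder (use the `initiator_topic_arn` output by the initiator terraform deployment described later), and the S3 URL is the sample NISAR TLM file used in the examples that follow:

```
import json

import boto3

# Placeholder ARN for illustration only; substitute the initiator_topic_arn
# output by the initiator terraform deployment.
INITIATOR_TOPIC_ARN = "arn:aws:sns:us-west-2:123456789012:initiator_topic"

sns = boto3.client("sns")
sns.publish(
    TopicArn=INITIATOR_TOPIC_ARN,
    # A url-type trigger event: the payload is the location of the newly detected file.
    Message=json.dumps(
        {
            "payload": "s3://test_bucket/prefix/NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.vc29"
        }
    ),
)
```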
The full YAML schema for the router configuration is located [here](src/unity_initiator/resources/routers_schema.yaml). #### How the router works + In the context of trigger events where a new file is detected (payload_type=`url`), the router Lambda extracts the URL of the new file, instantiates a router object and attempts to match it up against a set of regular expressions defined in the router configuration file. Let's consider this minimal router configuration YAML file example: ``` @@ -87,7 +93,7 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{6})\d{3}\.vc(?P\w{2}))$' + - '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{6})\d{3}\.vc(?P\w{2}))$' evaluators: - name: eval_nisar_ingest actions: @@ -105,6 +111,7 @@ initiator_config: ``` and a trigger event payload for a new file that was triggered: + ``` { "payload": "s3://test_bucket/prefix/NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.vc29" @@ -114,13 +121,14 @@ and a trigger event payload for a new file that was triggered: The router will iterate over the set of url configs and attempt to match the URL against its set of regexes. If a match is successful, the router will iterate over the configured evaluators configs and perform the configured action to submit the URL payload to the evaluator interface (either SNS topic or DAG submission). In this case, the router sees that the action is `submit_to_sns_topic` and thus publishes the URL payload (and the regular expression captured groups as `payload_info`) to the SNS topic (`topic_arn`) configured in the action's parameters. In addition to the payload URL and the payload info, the router also includes the `on_success` parameters configured for the action. This will propagate pertinent info to the underlying evaluator code which would be used if evaluation is successful. In this case, if the evaluator successfully evaluates that everything is ready for this input file, it can proceed to submit a DAG run for the `submit_nisar_tlm_ingest` DAG in the underlying SPS.
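To make the hand-off explicit, the message published to the evaluator SNS topic is roughly of the following shape (an illustrative sketch based on the description above; the exact field nesting is determined by the router implementation):

```
{
  "payload": "s3://test_bucket/prefix/NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.vc29",
  "payload_info": { ... the named groups captured by the matching regex ... },
  "on_success": { ... the on_success actions configured for this evaluator's action ... }
}
```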
Let's consider another example but this time the configured action is to submit a DAG run instead of publishing to an evaluator's SNS topic: + ``` initiator_config: name: minimal config example payload_type: url: - regexes: - - !!python/regexp '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' + - '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' evaluators: - name: eval_nisar_l0a_readiness actions: @@ -143,6 +151,7 @@ initiator_config: ``` and a trigger event payload for a new file that was triggered: + ``` { "payload": "s3://test_bucket/prefix/NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.ldf" @@ -168,10 +177,13 @@ In this case, the router sees that the action is `submit_dag_by_id` and thus mak ## Contents +* [Features](#features) +* [Contents](#contents) * [Quick Start](#quick-start) + * [Requirements](#requirements) * [Setting Up the End-to-End Demo](#setting-up-the-end-to-end-demo) - * [Deploying the Inititator](#deploying-the-initiator) - * [Deploying Example Evaluators](#deploying-example-evaluators-sns-topic-sqs-queue-lambda) + * [Deploying the Initiator](#deploying-the-initiator) + * [Deploying an Example Evaluator (SNS topic-\>SQS queue-\>Lambda)](#deploying-an-example-evaluator-sns-topic-sqs-queue-lambda) * [Deploying an S3 Event Notification Trigger](#deploying-an-s3-event-notification-trigger) * [Verify End-to-End Functionality (part 1)](#verify-end-to-end-functionality-part-1) * [Deploying an EventBridge Scheduler Trigger](#deploying-an-eventbridge-scheduler-trigger) @@ -183,8 +195,8 @@ In this case, the router sees that the action is `submit_dag_by_id` and thus mak * [Build Instructions](#build-instructions) * [Test Instructions](#test-instructions) * [Changelog](#changelog) -* [FAQ](#frequently-asked-questions-faq) -* [Contributing Guide](#contributing) +* [Frequently Asked Questions (FAQ)](#frequently-asked-questions-faq) +* [Contributing](#contributing) * [License](#license) * [References](#references) @@ -207,14 +219,19 @@ This guide provides a quick way to get started with our project. Please see our #### Deploying the Initiator 1. Clone repo: + ``` git clone https://github.com/unity-sds/unity-initiator.git ``` + 1. Change directory to the location of the inititator terraform: + ``` cd unity-initiator/terraform-unity/initiator/ ``` -1. Copy a sample router configuration YAML file to use for deployment and update the AWS region and AWS account ID to match your AWS environment. We will be using the NISAR TLM and AIRS RetStd test cases for this demo so we also rename the SNS topic ARNs for them accordingly: + +1. Copy a sample router configuration YAML file to use for deployment and update the AWS region and AWS account ID to match your AWS environment. We will be using the NISAR TLM test case for this demo so we also rename the SNS topic ARN for it accordingly: + ``` cp ../../tests/resources/test_router.yaml . export AWS_ACCOUNT_ID=$(aws sts get-caller-identity --output text | awk '{print $1}') @@ -223,122 +240,120 @@ This guide provides a quick way to get started with our project. 
Please see our sed -i "s/123456789012:eval_nisar_ingest/${AWS_ACCOUNT_ID}:uod-dev-eval_nisar_ingest-evaluator_topic/g" test_router.yaml sed -i "s/123456789012:eval_airs_ingest/${AWS_ACCOUNT_ID}:uod-dev-eval_airs_ingest-evaluator_topic/g" test_router.yaml ``` + 1. You will need an S3 bucket for terraform to stage the router Lambda zip file during deployment. Create one or reuse an existing one and set an environment variable for it: + ``` export CODE_BUCKET= ``` + 1. You will need an S3 bucket to store the router configuration YAML file. Create one or reuse an existing one (could be the same one in the previous step) and set an environment variable for it: + ``` export CONFIG_BUCKET= ``` -1. Set a deployment name: + +1. Set a project name: + ``` - export DEPLOYMENT_NAME=gmanipon-test + export PROJECT=gmanipon-test ``` + 1. Initialize terraform: + ``` terraform init ``` + 1. Run terraform apply: + ``` terraform apply \ - --var deployment_name=${DEPLOYMENT_NAME} \ + --var project=${PROJECT} \ --var code_bucket=${CODE_BUCKET} \ --var config_bucket=${CONFIG_BUCKET} \ --var router_config=test_router.yaml \ -auto-approve ``` + **Take note of the `initiator_topic_arn` that is output by terraform. It will be used when setting up any triggers.** -#### Deploying Example Evaluators (SNS topic->SQS queue->Lambda) -##### Evaluator Deployment for NISAR TLM (via staged data to the ISL) -1. Change directory to the location of the evaluators terraform: - ``` - cd ../evaluators - ``` -1. Make a copy of the `sns_sqs_lambda` directory for the NISAR TLM evaluator: +#### Deploying an Example Evaluator (SNS topic->SQS queue->Lambda) + +1. Change directory to the location of the sns_sqs_lambda evaluator terraform: + ``` cp -rp sns_sqs_lambda sns_sqs_lambda-nisar_tlm ``` + 1. Change directory into the NISAR TLM evaluator terraform: + ``` cd sns_sqs_lambda-nisar_tlm/ ``` + 1. Set the name of the evaluator to our NISAR example: + ``` export EVALUATOR_NAME=eval_nisar_ingest ``` -1. Note the implementation of the evaluator code. It currently doesn't do any real evaluation but simply returns that evaluation was successful: - ``` - cat data.tf - ``` -1. Initialize terraform: - ``` - terraform init - ``` -1. Run terraform apply: - ``` - terraform apply \ - --var evaluator_name=${EVALUATOR_NAME} \ - -auto-approve - ``` - **Take note of the `evaluator_topic_arn` that is output by terraform. It should match the respective topic ARN in the test_router.yaml file you used during the initiator deployment. If they match then the router Lambda is now able to submit payloads to this evaluator SNS topic.** -##### Evaluator Deployment for AIRS RetStd (via scheduled CMR query) -1. Change directory to the location of the evaluators terraform: - ``` - cd .. - ``` -1. Make a copy of the `sns_sqs_lambda` directory for the AIRS RetStd evaluator: - ``` - cp -rp sns_sqs_lambda sns_sqs_lambda-airs_retstd - ``` -1. Change directory into the AIRS RetStd evaluator terraform: - ``` - cd sns_sqs_lambda-airs_retstd/ - ``` -1. Set the name of the evaluator to our AIRS example: - ``` - export EVALUATOR_NAME=eval_airs_ingest - ``` 1. Note the implementation of the evaluator code. It currently doesn't do any real evaluation but simply returns that evaluation was successful: + ``` cat data.tf ``` + 1. Initialize terraform: + ``` terraform init ``` + 1. Run terraform apply: + ``` terraform apply \ --var evaluator_name=${EVALUATOR_NAME} \ -auto-approve ``` - **Take note of the `evaluator_topic_arn` that is output by terraform. 
It should match the respective topic ARN in the test_router.yaml file you used during the initiator deployment. If they match then the router Lambda is now able to submit payloads to this evaluator SNS topic.** + + **Take note of the `evaluator_topic_arn` that is output by terraform. It should match the topic ARN in the test_router.yaml file you used during the initiator deployment. If they match then the router Lambda is now able to submit payloads to this evaluator SNS topic.** #### Deploying an S3 Event Notification Trigger + 1. Change directory to the location of the s3_bucket_notification trigger terraform: + ``` cd ../../triggers/s3_bucket_notification/ ``` + 1. You will need an S3 bucket to configure event notification on. Create one or reuse an existing one (could be the same one in the previous steps) and set an environment variable for it: + ``` export ISL_BUCKET= ``` + 1. Specify an S3 prefix from which S3 event notifications will be emitted when objects are created: + ``` export ISL_BUCKET_PREFIX=incoming/ ``` + 1. Export the `initiator_topic_arn` that was output from the initiator terraform deployment: + ``` export INITIATOR_TOPIC_ARN= ``` + 1. Initialize terraform: + ``` terraform init ``` + 1. Run terraform apply: + ``` terraform apply \ --var isl_bucket=${ISL_BUCKET} \ @@ -346,11 +361,14 @@ This guide provides a quick way to get started with our project. Please see our --var isl_bucket_prefix=${ISL_BUCKET_PREFIX} \ --var initiator_topic_arn=${INITIATOR_TOPIC_ARN} \ -auto-approve ``` + 1. Verify that the S3 event notification was correctly hooked up to the initiator by looking at the initiator Lambda's CloudWatch logs for an entry similar to this: ![cloudwatch_logs_s3_testevent](https://github.com/unity-sds/unity-initiator/assets/387300/460a0d0b-ee01-480d-afab-ba70185341fc) #### Verify End-to-End Functionality (part 1) + 1. Create some fake NISAR TLM files and stage them up to the ISL bucket under the ISL prefix: + ``` for i in $(echo 24 25 29); do echo 'Hawaii, No Ka Oi!' > NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.vc${i} @@ -358,63 +376,87 @@ This guide provides a quick way to get started with our project. Please see our rm NISAR_S198_PA_PA11_M00_P00922_R00_C01_G00_2024_010_17_57_57_714280000.vc${i} done ``` + 1. Verify that the `eval_nisar_ingest` evaluator Lambda function was called successfully for each of those staged files by looking at its CloudWatch logs for entries similar to this: ![eval_log_1](https://github.com/unity-sds/unity-initiator/assets/387300/34a273a5-5992-46f8-982b-0a0ec37d1798) #### Deploying an EventBridge Scheduler Trigger + 1. Change directory to the location of the scheduled_task trigger terraform: + ``` cd ../scheduled_task/ ``` + 1. Note the implementation of the trigger lambda code. It currently hard-codes a payload URL; however, in a real implementation, code would be written to query for new files from some REST API, database, etc. Here we simulate that and simply return a NISAR TLM file: + ``` cat data.tf ``` + 1. Initialize terraform: + ``` terraform init ``` -1. Run terraform apply. Note the DEPLOYMENT_NAME and INITIATOR_TOPIC_ARN environment variables should have been set in the previous steps. If not set them again: + +1. Run terraform apply. Note the PROJECT and INITIATOR_TOPIC_ARN environment variables should have been set in the previous steps. If not, set them again: + ``` terraform apply \ - --var deployment_name=${DEPLOYMENT_NAME} \ + --var project=${PROJECT} \ --var initiator_topic_arn=${INITIATOR_TOPIC_ARN} \ -auto-approve ``` #### Verify End-to-End Functionality (part 2) + 1. The deployed EventBridge scheduler runs the trigger Lambda function with a schedule expression of `rate(1 minute)`. After a minute, verify that the `eval_nisar_ingest` evaluator Lambda function was called successfully for each of those scheduled invocations by looking at its CloudWatch logs for entries similar to this: ![eval_log_2](https://github.com/unity-sds/unity-initiator/assets/387300/cae82e10-a736-43b7-8957-790fc29b5fea) #### Deploying an EventBridge Scheduler Trigger for Periodic CMR Queries + 1. Change directory to the location of the cmr_query trigger terraform: + ``` cd ../cmr_query/ ``` + 1. Note the implementation of the trigger lambda code. It will query CMR for granules for a particular collection within a timeframe, query its dynamodb table to check if they already exist, and if not, submit them as payload URLs to the initiator SNS topic and save them into the dynamodb table: + ``` cat lambda_handler.py ``` + 1. Set the CMR provider ID for the AIRS RetStd collection: + ``` export PROVIDER_ID=GES_DISC ``` + 1. Set the CMR concept ID for the AIRS RetStd collection: + ``` export CONCEPT_ID=C1701805619-GES_DISC ``` + 1. Set the number of seconds to look back from the current epoch for granules in the collection. For example, we will set this value to 2 days (172800 seconds) so that when the CMR query lambda kicks off, it will query for all AIRS RetStd granules using a temporal search of `now - 172800 seconds` to `now`: + ``` export SECONDS_BACK=172800 ``` + 1. Initialize terraform: + ``` terraform init ``` -1. Run terraform apply. Note the DEPLOYMENT_NAME, CODE_BUCKET and INITIATOR_TOPIC_ARN environment variables should have been set in the previous steps. If not set them again: + +1. Run terraform apply. Note the PROJECT, CODE_BUCKET and INITIATOR_TOPIC_ARN environment variables should have been set in the previous steps. If not, set them again: + ``` terraform apply \ - --var deployment_name=${DEPLOYMENT_NAME} \ + --var project=${PROJECT} \ --var code_bucket=${CODE_BUCKET} \ --var initiator_topic_arn=${INITIATOR_TOPIC_ARN} \ --var provider_id=${PROVIDER_ID} \ @@ -424,36 +466,49 @@ This guide provides a quick way to get started with our project. Please see our ``` #### Verify End-to-End Functionality (part 3) + 1. The deployed EventBridge scheduler runs the trigger CMR query Lambda function with a schedule expression of `rate(1 minute)`. After a minute, verify that the `eval_airs_ingest` evaluator Lambda function was called successfully for each of those scheduled invocations by looking at its CloudWatch logs for entries similar to this: ![eval_log_3](https://github.com/user-attachments/assets/54b26349-91b2-4958-9082-47613da6c675) #### Tear Down + 1. Simply go back into each of the terraform directories for which `terraform apply` was run and run `terraform destroy`. ### Setup Instructions for Development 1. Clone repo: + ``` git clone https://github.com/unity-sds/unity-initiator.git ``` + 1. Install hatch: + ``` pip install hatch ``` + 1. Build virtualenv and install dependencies: + ``` cd unity-initiator hatch env create ``` + 1. Install dev tools: + ``` ./scripts/install_dev_tools.sh ``` + 1.
Test pre-commit run: + ``` pre-commit run --all-files ``` + You should see the following output: + ``` check for merge conflicts...............................................................Passed check for broken symlinks...........................................(no files to check)Skipped @@ -477,14 +532,19 @@ This guide provides a quick way to get started with our project. Please see our 1. Follow [Setup Instructions for Development](#setup-instructions-for-development) above. 1. Enter environment: + ``` hatch shell ``` + 1. Build: + ``` hatch build ``` + Wheel and tarballs will be built in the `dist/` directory: + ``` $ tree dist dist @@ -500,14 +560,19 @@ This guide provides a quick way to get started with our project. Please see our 1. Follow [Setup Instructions for Development](#setup-instructions-for-development) above. 1. Enter environment: + ``` hatch shell ``` + 1. Run tests: + ``` hatch run pytest ``` + For more information during test runs, set the log level accordingly. For example: + ``` hatch run pytest -s -v --log-cli-level=INFO --log-level=INFO ``` @@ -541,8 +606,9 @@ For guidance on our governance approach, including decision-making process and o `unity-initiator` is distributed under the terms of the [MIT](https://spdx.org/licenses/MIT.html) license. ## References + [1] Hua, H., Manipon, G. and Shah, S. (2022). Scaling Big Earth Science Data Systems Via Cloud Computing. In Big Data Analytics in Earth, Atmospheric, and Ocean Sciences (eds T. Huang, T.C. Vance and C. Lynnes). -https://doi.org/10.1002/9781119467557.ch3 + diff --git a/src/unity_initiator/actions/submit_dag_by_id.py b/src/unity_initiator/actions/submit_dag_by_id.py index 1fcc20a..d8019bc 100644 --- a/src/unity_initiator/actions/submit_dag_by_id.py +++ b/src/unity_initiator/actions/submit_dag_by_id.py @@ -34,7 +34,9 @@ def execute(self): }, "note": "", } - response = httpx.post(url, auth=auth, headers=headers, json=body) + response = httpx.post( + url, auth=auth, headers=headers, json=body, verify=False + ) # nosec if response.status_code in (200, 201): success = True resp = response.json() diff --git a/src/unity_initiator/actions/submit_ogc_process_execution.py b/src/unity_initiator/actions/submit_ogc_process_execution.py new file mode 100644 index 0000000..9b3fcee --- /dev/null +++ b/src/unity_initiator/actions/submit_ogc_process_execution.py @@ -0,0 +1,50 @@ +import httpx + +from ..utils.logger import logger +from .base import Action + +__all__ = ["SubmitOgcProcessExecution"] + + +class SubmitOgcProcessExecution(Action): + def __init__(self, payload, payload_info, params): + super().__init__(payload, payload_info, params) + logger.info("instantiated %s", __class__.__name__) + + def execute(self): + logger.debug("executing execute in %s", __class__.__name__) + url = f"{self._params['ogc_processes_base_api_endpoint']}/processes/{self._params['process_id']}/execution" + logger.info("url: %s", url) + headers = {"Content-Type": "application/json", "Accept": "application/json"} + # body = { + # "inputs": self._params["execution_inputs"], + # "outputs": self._params["execution_outputs"], + # "subscriber": self._params["execution_subscriber"], + # } + body = { + "inputs": { + "payload": self._payload, + "payload_info": self._payload_info, + "on_success": self._params["on_success"], + }, + "outputs": None, + "subscriber": None, + } + response = httpx.post(url, headers=headers, json=body, verify=False) # nosec + if response.status_code in (200, 201): + success = True + resp = response.json() + logger.info( + 
"Successfully triggered the execution of the OGC Process %s: %s", + self._params["process_id"], + resp, + ) + else: + success = False + resp = response.text + logger.info( + "Failed to trigger the execution of the OGC Process %s: %s", + self._params["process_id"], + resp, + ) + return {"success": success, "response": resp} diff --git a/src/unity_initiator/resources/routers_schema.yaml b/src/unity_initiator/resources/routers_schema.yaml index a53505d..fdbabe9 100644 --- a/src/unity_initiator/resources/routers_schema.yaml +++ b/src/unity_initiator/resources/routers_schema.yaml @@ -25,7 +25,7 @@ initiator_config: # Configuration for matching payload (e.g. url) against a set of compiled regular expressions # and mapping any matches to a set of evaluators. regex_config: - regexes: list(compiled_regex(), required=True, min=1) + regexes: list(str, required=True, min=1) evaluators: list(include("evaluator_config"), required=True, min=1) # Configuration of actions that submit to evaluators. @@ -36,7 +36,7 @@ evaluator_config: # Currently only 2 types of actions are supported: # 1. submit payload to an SNS topic # 2. submit payload to an airflow DAG -action_config: any(include("submit_dag_by_id_action"), include("submit_to_sns_topic_action")) +action_config: any(include("submit_dag_by_id_action"), include("submit_to_sns_topic_action"), include("submit_ogc_process_execution_action")) # Configuration for submitting a payload to an airflow DAG. submit_dag_by_id_action: @@ -58,6 +58,17 @@ submit_to_sns_topic_action: topic_arn: str(required=False) on_success: include("on_success_actions", required=False) +# Configuration for submitting a OGC process execution. +submit_ogc_process_execution_action: + name: str(equals="submit_ogc_process_execution") + params: + process_id: str() + ogc_processes_base_api_endpoint: str(required=False) + execution_inputs: map(required=False) + execution_outputs: map(required=False) + execution_subscriber: map(required=False) + on_success: include("on_success_actions", required=False) + # Configuration to pass onto the evaluator to use when evaluation is a success. 
on_success_actions: - actions: list(include("action_config"), required=True, min=1, max=1) \ No newline at end of file + actions: list(include("action_config"), required=True, min=1, max=1) diff --git a/src/unity_initiator/router.py b/src/unity_initiator/router.py index 60521b2..998fbb2 100644 --- a/src/unity_initiator/router.py +++ b/src/unity_initiator/router.py @@ -1,5 +1,6 @@ import asyncio import json +import re from .evaluator import Evaluator from .utils.conf_utils import YamlConf, YamlConfEncoder @@ -14,6 +15,15 @@ class Router: def __init__(self, config_file): self._config_file = config_file self._config = YamlConf(self._config_file) + self._compile_regexes() + + def _compile_regexes(self): + """Compile all regex strings in the configuration.""" + for url_cfg in ( + self._config.get("initiator_config").get("payload_type").get("url", []) + ): + if "regexes" in url_cfg: + url_cfg["regexes"] = [re.compile(regex) for regex in url_cfg["regexes"]] def get_evaluators_by_url(self, url): found_match = False diff --git a/terraform-unity/evaluators/sns_sqs_lambda/.terraform.lock.hcl b/terraform-unity/evaluators/sns-sqs-lambda/.terraform.lock.hcl similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/.terraform.lock.hcl rename to terraform-unity/evaluators/sns-sqs-lambda/.terraform.lock.hcl diff --git a/terraform-unity/evaluators/sns_sqs_lambda/README.md b/terraform-unity/evaluators/sns-sqs-lambda/README.md similarity index 99% rename from terraform-unity/evaluators/sns_sqs_lambda/README.md rename to terraform-unity/evaluators/sns-sqs-lambda/README.md index 02b9327..da9e8f8 100644 --- a/terraform-unity/evaluators/sns_sqs_lambda/README.md +++ b/terraform-unity/evaluators/sns-sqs-lambda/README.md @@ -5,7 +5,7 @@ | Name | Version | |------|---------| -| [terraform](#requirement\_terraform) | ~> 1.4.6 | +| [terraform](#requirement\_terraform) | ~> 1.8.2 | | [archive](#requirement\_archive) | >=2.4.2 | | [aws](#requirement\_aws) | >=5.50.0 | | [local](#requirement\_local) | >=2.5.1 | diff --git a/terraform-unity/evaluators/sns_sqs_lambda/data.tf b/terraform-unity/evaluators/sns-sqs-lambda/data.tf similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/data.tf rename to terraform-unity/evaluators/sns-sqs-lambda/data.tf diff --git a/terraform-unity/evaluators/sns_sqs_lambda/locals.tf b/terraform-unity/evaluators/sns-sqs-lambda/locals.tf similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/locals.tf rename to terraform-unity/evaluators/sns-sqs-lambda/locals.tf diff --git a/terraform-unity/evaluators/sns_sqs_lambda/main.tf b/terraform-unity/evaluators/sns-sqs-lambda/main.tf similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/main.tf rename to terraform-unity/evaluators/sns-sqs-lambda/main.tf diff --git a/terraform-unity/evaluators/sns_sqs_lambda/output.tf b/terraform-unity/evaluators/sns-sqs-lambda/output.tf similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/output.tf rename to terraform-unity/evaluators/sns-sqs-lambda/output.tf diff --git a/terraform-unity/evaluators/sns_sqs_lambda/variables.tf b/terraform-unity/evaluators/sns-sqs-lambda/variables.tf similarity index 100% rename from terraform-unity/evaluators/sns_sqs_lambda/variables.tf rename to terraform-unity/evaluators/sns-sqs-lambda/variables.tf diff --git a/terraform-unity/triggers/scheduled_task/versions.tf b/terraform-unity/evaluators/sns-sqs-lambda/versions.tf similarity index 91% rename from 
terraform-unity/triggers/scheduled_task/versions.tf rename to terraform-unity/evaluators/sns-sqs-lambda/versions.tf index 5e8229b..9f64e30 100644 --- a/terraform-unity/triggers/scheduled_task/versions.tf +++ b/terraform-unity/evaluators/sns-sqs-lambda/versions.tf @@ -1,5 +1,5 @@ terraform { - required_version = "~> 1.4.6" + required_version = "~> 1.8.2" required_providers { archive = { diff --git a/terraform-unity/initiator/README.md b/terraform-unity/initiator/README.md index 82e0dc9..c2d0bc4 100644 --- a/terraform-unity/initiator/README.md +++ b/terraform-unity/initiator/README.md @@ -5,7 +5,7 @@ | Name | Version | |------|---------| -| [terraform](#requirement\_terraform) | ~> 1.4.6 | +| [terraform](#requirement\_terraform) | ~> 1.8.2 | | [aws](#requirement\_aws) | >=5.50.0 | | [local](#requirement\_local) | >=2.5.1 | | [null](#requirement\_null) | >=3.2.2 | @@ -26,7 +26,7 @@ No modules. | Name | Type | |------|------| -| [aws_cloudwatch_log_group.initiator_lambda_log_group](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/cloudwatch_log_group) | resource | +| [aws_cloudwatch_log_group.initiator_lambda](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/cloudwatch_log_group) | resource | | [aws_iam_policy.initiator_lambda_policy](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_policy) | resource | | [aws_iam_role.initiator_lambda_iam_role](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role) | resource | | [aws_iam_role_policy_attachment.lambda_base_policy_attachment](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource | @@ -35,7 +35,6 @@ No modules. | [aws_lambda_event_source_mapping.initiator_queue_event_source_mapping](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_event_source_mapping) | resource | | [aws_lambda_function.initiator_lambda](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_function) | resource | | [aws_s3_object.lambda_package](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_object) | resource | -| [aws_s3_object.router_config](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_object) | resource | | [aws_sns_topic.initiator_topic](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/sns_topic) | resource | | [aws_sns_topic_subscription.initiator_subscription](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/sns_topic_subscription) | resource | | [aws_sqs_queue.initiator_dead_letter_queue](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/sqs_queue) | resource | @@ -51,10 +50,8 @@ No modules. 
| Name | Description | Type | Default | Required | |------|-------------|------|---------|:--------:| | [code\_bucket](#input\_code\_bucket) | The S3 bucket where lambda zip files will be stored and accessed | `string` | n/a | yes | -| [config\_bucket](#input\_config\_bucket) | The S3 bucket where router configuration files will be stored and accessed | `string` | n/a | yes | -| [deployment\_name](#input\_deployment\_name) | The deployment name | `string` | n/a | yes | | [project](#input\_project) | The unity project its installed into | `string` | `"uod"` | no | -| [router\_config](#input\_router\_config) | The local path to the router configuration file to use | `string` | n/a | yes | +| [router\_config](#input\_router\_config) | The S3 URL to the router configuration file | `string` | n/a | yes | | [venue](#input\_venue) | The unity venue its installed into | `string` | `"dev"` | no | ## Outputs diff --git a/terraform-unity/initiator/locals.tf b/terraform-unity/initiator/locals.tf index a8f66e2..1ee28d9 100644 --- a/terraform-unity/initiator/locals.tf +++ b/terraform-unity/initiator/locals.tf @@ -1,5 +1,5 @@ locals { - function_name = "${var.project}-${var.venue}-${var.deployment_name}-inititator" + function_name = "${var.project}-${var.venue}-inititator" tags = { Venue = "dev" ServiceArea = "cs" diff --git a/terraform-unity/initiator/main.tf b/terraform-unity/initiator/main.tf index bb6fefc..36eacd6 100644 --- a/terraform-unity/initiator/main.tf +++ b/terraform-unity/initiator/main.tf @@ -3,7 +3,7 @@ resource "null_resource" "build_lambda_package" { provisioner "local-exec" { command = < [terraform](#requirement\_terraform) | ~> 1.4.6 | +| [terraform](#requirement\_terraform) | ~> 1.8.2 | | [archive](#requirement\_archive) | >=2.4.2 | | [aws](#requirement\_aws) | >=5.50.0 | | [local](#requirement\_local) | >=2.5.1 | @@ -52,7 +52,6 @@ No modules. 
|------|-------------|------|---------|:--------:| | [code\_bucket](#input\_code\_bucket) | The S3 bucket where lambda zip files will be stored and accessed | `string` | n/a | yes | | [concept\_id](#input\_concept\_id) | The concept ID for the data collection: https://cmr.earthdata.nasa.gov/search/site/docs/search/api.html#granule-search-by-parameters | `string` | n/a | yes | -| [deployment\_name](#input\_deployment\_name) | The deployment name | `string` | n/a | yes | | [initiator\_topic\_arn](#input\_initiator\_topic\_arn) | The ARN of the initiator SNS topic to publish S3 events to | `string` | n/a | yes | | [project](#input\_project) | The unity project its installed into | `string` | `"uod"` | no | | [provider\_id](#input\_provider\_id) | The short name for the data provider: https://cmr.earthdata.nasa.gov/search/site/docs/search/api.html#granule-search-by-parameters | `string` | n/a | yes | diff --git a/terraform-unity/triggers/cmr_query/locals.tf b/terraform-unity/triggers/cmr_query/locals.tf index a7a381e..546c83d 100644 --- a/terraform-unity/triggers/cmr_query/locals.tf +++ b/terraform-unity/triggers/cmr_query/locals.tf @@ -1,5 +1,5 @@ locals { - function_name = "${var.project}-${var.venue}-${var.deployment_name}-cmr_query" + function_name = "${var.project}-${var.venue}-cmr_query" tags = { Venue = "dev" ServiceArea = "cs" diff --git a/terraform-unity/triggers/cmr_query/variables.tf b/terraform-unity/triggers/cmr_query/variables.tf index 2d138e9..1736fb1 100644 --- a/terraform-unity/triggers/cmr_query/variables.tf +++ b/terraform-unity/triggers/cmr_query/variables.tf @@ -1,8 +1,3 @@ -variable "deployment_name" { - description = "The deployment name" - type = string -} - variable "project" { description = "The unity project its installed into" type = string diff --git a/terraform-unity/triggers/cmr_query/versions.tf b/terraform-unity/triggers/cmr_query/versions.tf index 5e8229b..9f64e30 100644 --- a/terraform-unity/triggers/cmr_query/versions.tf +++ b/terraform-unity/triggers/cmr_query/versions.tf @@ -1,5 +1,5 @@ terraform { - required_version = "~> 1.4.6" + required_version = "~> 1.8.2" required_providers { archive = { diff --git a/terraform-unity/triggers/s3_bucket_notification/.terraform.lock.hcl b/terraform-unity/triggers/s3-bucket-notification/.terraform.lock.hcl similarity index 100% rename from terraform-unity/triggers/s3_bucket_notification/.terraform.lock.hcl rename to terraform-unity/triggers/s3-bucket-notification/.terraform.lock.hcl diff --git a/terraform-unity/triggers/s3_bucket_notification/README.md b/terraform-unity/triggers/s3-bucket-notification/README.md similarity index 98% rename from terraform-unity/triggers/s3_bucket_notification/README.md rename to terraform-unity/triggers/s3-bucket-notification/README.md index 414184b..14c7a4e 100644 --- a/terraform-unity/triggers/s3_bucket_notification/README.md +++ b/terraform-unity/triggers/s3-bucket-notification/README.md @@ -5,7 +5,7 @@ | Name | Version | |------|---------| -| [terraform](#requirement\_terraform) | ~> 1.4.6 | +| [terraform](#requirement\_terraform) | ~> 1.8.2 | | [aws](#requirement\_aws) | >=5.50.0 | | [local](#requirement\_local) | >=2.5.1 | | [null](#requirement\_null) | >=3.2.2 | diff --git a/terraform-unity/triggers/s3_bucket_notification/main.tf b/terraform-unity/triggers/s3-bucket-notification/main.tf similarity index 100% rename from terraform-unity/triggers/s3_bucket_notification/main.tf rename to terraform-unity/triggers/s3-bucket-notification/main.tf diff --git 
a/terraform-unity/triggers/s3_bucket_notification/variables.tf b/terraform-unity/triggers/s3-bucket-notification/variables.tf similarity index 100% rename from terraform-unity/triggers/s3_bucket_notification/variables.tf rename to terraform-unity/triggers/s3-bucket-notification/variables.tf diff --git a/terraform-unity/triggers/s3_bucket_notification/versions.tf b/terraform-unity/triggers/s3-bucket-notification/versions.tf similarity index 100% rename from terraform-unity/triggers/s3_bucket_notification/versions.tf rename to terraform-unity/triggers/s3-bucket-notification/versions.tf diff --git a/terraform-unity/triggers/scheduled_task/.terraform.lock.hcl b/terraform-unity/triggers/scheduled-task/.terraform.lock.hcl similarity index 100% rename from terraform-unity/triggers/scheduled_task/.terraform.lock.hcl rename to terraform-unity/triggers/scheduled-task/.terraform.lock.hcl diff --git a/terraform-unity/triggers/scheduled_task/README.md b/terraform-unity/triggers/scheduled-task/README.md similarity index 95% rename from terraform-unity/triggers/scheduled_task/README.md rename to terraform-unity/triggers/scheduled-task/README.md index f8ff0da..97d3743 100644 --- a/terraform-unity/triggers/scheduled_task/README.md +++ b/terraform-unity/triggers/scheduled-task/README.md @@ -5,7 +5,7 @@ | Name | Version | |------|---------| -| [terraform](#requirement\_terraform) | ~> 1.4.6 | +| [terraform](#requirement\_terraform) | ~> 1.8.2 | | [archive](#requirement\_archive) | >=2.4.2 | | [aws](#requirement\_aws) | >=5.50.0 | | [local](#requirement\_local) | >=2.5.1 | @@ -42,7 +42,6 @@ No modules. | Name | Description | Type | Default | Required | |------|-------------|------|---------|:--------:| -| [deployment\_name](#input\_deployment\_name) | The deployment name | `string` | n/a | yes | | [initiator\_topic\_arn](#input\_initiator\_topic\_arn) | The ARN of the initiator SNS topic to publish S3 events to | `string` | n/a | yes | | [project](#input\_project) | The unity project its installed into | `string` | `"uod"` | no | | [venue](#input\_venue) | The unity venue its installed into | `string` | `"dev"` | no | diff --git a/terraform-unity/triggers/scheduled_task/data.tf b/terraform-unity/triggers/scheduled-task/data.tf similarity index 100% rename from terraform-unity/triggers/scheduled_task/data.tf rename to terraform-unity/triggers/scheduled-task/data.tf diff --git a/terraform-unity/triggers/scheduled_task/locals.tf b/terraform-unity/triggers/scheduled-task/locals.tf similarity index 78% rename from terraform-unity/triggers/scheduled_task/locals.tf rename to terraform-unity/triggers/scheduled-task/locals.tf index 68396e8..d7997ca 100644 --- a/terraform-unity/triggers/scheduled_task/locals.tf +++ b/terraform-unity/triggers/scheduled-task/locals.tf @@ -1,5 +1,5 @@ locals { - function_name = "${var.project}-${var.venue}-${var.deployment_name}-scheduled_task" + function_name = "${var.project}-${var.venue}-scheduled_task" tags = { Venue = "dev" ServiceArea = "cs" diff --git a/terraform-unity/triggers/scheduled_task/main.tf b/terraform-unity/triggers/scheduled-task/main.tf similarity index 91% rename from terraform-unity/triggers/scheduled_task/main.tf rename to terraform-unity/triggers/scheduled-task/main.tf index 78e57cf..16eea3b 100644 --- a/terraform-unity/triggers/scheduled_task/main.tf +++ b/terraform-unity/triggers/scheduled-task/main.tf @@ -50,7 +50,7 @@ resource "aws_cloudwatch_log_group" "scheduled_task_lambda_log_group" { } resource "aws_iam_role" "scheduler" { - name = 
"${var.project}-${var.venue}-${var.deployment_name}-cron-scheduler-role" + name = "${var.project}-${var.venue}-cron-scheduler-role" assume_role_policy = jsonencode({ Version = "2012-10-17" Statement = [ @@ -68,7 +68,7 @@ resource "aws_iam_role" "scheduler" { } resource "aws_iam_policy" "scheduler" { - name = "${var.project}-${var.venue}-${var.deployment_name}-cron-scheduler-policy" + name = "${var.project}-${var.venue}-cron-scheduler-policy" policy = jsonencode({ Version = "2012-10-17" Statement = [ @@ -90,7 +90,7 @@ resource "aws_iam_role_policy_attachment" "scheduler" { } resource "aws_scheduler_schedule" "run_scheduled_task" { - name = "${var.project}-${var.venue}-${var.deployment_name}-run_scheduled_task" + name = "${var.project}-${var.venue}-run_scheduled_task" schedule_expression = "rate(1 minute)" flexible_time_window { mode = "OFF" diff --git a/terraform-unity/triggers/scheduled_task/main.tf.cloudwatch_event b/terraform-unity/triggers/scheduled-task/main.tf.cloudwatch_event similarity index 100% rename from terraform-unity/triggers/scheduled_task/main.tf.cloudwatch_event rename to terraform-unity/triggers/scheduled-task/main.tf.cloudwatch_event diff --git a/terraform-unity/triggers/scheduled_task/variables.tf b/terraform-unity/triggers/scheduled-task/variables.tf similarity index 80% rename from terraform-unity/triggers/scheduled_task/variables.tf rename to terraform-unity/triggers/scheduled-task/variables.tf index f71ffef..495a65c 100644 --- a/terraform-unity/triggers/scheduled_task/variables.tf +++ b/terraform-unity/triggers/scheduled-task/variables.tf @@ -1,8 +1,3 @@ -variable "deployment_name" { - description = "The deployment name" - type = string -} - variable "project" { description = "The unity project its installed into" type = string diff --git a/terraform-unity/evaluators/sns_sqs_lambda/versions.tf b/terraform-unity/triggers/scheduled-task/versions.tf similarity index 91% rename from terraform-unity/evaluators/sns_sqs_lambda/versions.tf rename to terraform-unity/triggers/scheduled-task/versions.tf index 5e8229b..9f64e30 100644 --- a/terraform-unity/evaluators/sns_sqs_lambda/versions.tf +++ b/terraform-unity/triggers/scheduled-task/versions.tf @@ -1,5 +1,5 @@ terraform { - required_version = "~> 1.4.6" + required_version = "~> 1.8.2" required_providers { archive = { diff --git a/tests/resources/test_bad_router_3.yaml b/tests/resources/test_bad_router_3.yaml index d706d8c..6db3719 100644 --- a/tests/resources/test_bad_router_3.yaml +++ b/tests/resources/test_bad_router_3.yaml @@ -3,5 +3,5 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' + - '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' evaluators: diff --git a/tests/resources/test_bad_router_4.yaml b/tests/resources/test_bad_router_4.yaml index 0b3cd5f..9fb8927 100644 --- a/tests/resources/test_bad_router_4.yaml +++ b/tests/resources/test_bad_router_4.yaml @@ -3,7 +3,7 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' + - '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' evaluators: - name: eval_sbg_l2_readiness actions: diff --git a/tests/resources/test_bad_router_5.yaml b/tests/resources/test_bad_router_5.yaml index e7fbf53..818de30 100644 --- a/tests/resources/test_bad_router_5.yaml +++ b/tests/resources/test_bad_router_5.yaml @@ -3,7 +3,7 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp 
'/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' + - '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' evaluators: - name: eval_sbg_l2_readiness actions: diff --git a/tests/resources/test_bad_router_6.yaml b/tests/resources/test_bad_router_6.yaml index 7181560..7024c5b 100644 --- a/tests/resources/test_bad_router_6.yaml +++ b/tests/resources/test_bad_router_6.yaml @@ -3,7 +3,7 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' + - '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P)_OBS\.bin)$' evaluators: - name: eval_sbg_l2_readiness actions: diff --git a/tests/resources/test_bad_router_7.yaml b/tests/resources/test_bad_router_7.yaml index 017a895..292f940 100644 --- a/tests/resources/test_bad_router_7.yaml +++ b/tests/resources/test_bad_router_7.yaml @@ -3,7 +3,7 @@ initiator_config: payload_type: url: - regexes: - - !!python/regexp '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' + - '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' evaluators: - name: eval_nisar_l0a_readiness actions: diff --git a/tests/resources/test_router.yaml b/tests/resources/test_router.yaml index 8498ffc..f7212d1 100644 --- a/tests/resources/test_router.yaml +++ b/tests/resources/test_router.yaml @@ -13,7 +13,7 @@ initiator_config: # SBG example: L1B data staged to S3 bucket and payload is S3 url - regexes: - - !!python/regexp '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P\d{3})_OBS\.bin)$' + - '/(?PSISTER_EMIT_L1B_RDN_(?P\d{8}T\d{6})_(?P\d{3})_OBS\.bin)$' evaluators: # If the regex matches, the router submits a JSON payload to the eval_sbg_l2_readiness SNS topic that contains @@ -52,9 +52,9 @@ initiator_config: # M2020 example: xyz left finder; example of matching any one of a set of regexes - regexes: - - !!python/regexp 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.L.{17}_.{3}RAS_N.{26}\.VIC-link' - - !!python/regexp 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.R.{17}_.{3}RAS_N.{26}\.VIC-link' - - !!python/regexp 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.L.{17}_.{3}DSP_N.{26}\.VIC-link' + - 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.L.{17}_.{3}RAS_N.{26}\.VIC-link' + - 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.R.{17}_.{3}RAS_N.{26}\.VIC-link' + - 'ids-pipeline/pipes/nonlin_xyz_left/inputque/.L.{17}_.{3}DSP_N.{26}\.VIC-link' evaluators: # If any of the regexes match, the router submits a JSON payload to the eval_m2020_xyz_left_finder SNS topic that contains @@ -79,7 +79,7 @@ initiator_config: # NISAR example: GDS stages satellite telemetry to S3 bucket and payload is S3 url - regexes: - - !!python/regexp '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{6})\d{3}\.vc(?P\w{2}))$' + - '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{6})\d{3}\.vc(?P\w{2}))$' evaluators: # If the regex matches, the router submits a JSON payload to the eval_nisar_ingest SNS topic that contains @@ -104,7 +104,7 @@ initiator_config: # NISAR example: GDS stages LDF (list of delivered files) to S3 bucket and payload is S3 url - regexes: - - !!python/regexp 
'/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' + - '/(?P(?PNISAR)_S(?P\d{3})_(?P\w{2,3})_(?P\w{3,4})_M(?P\d{2})_P(?P\d{5})_R(?P\d{2})_C(?P\d{2})_G(?P\d{2})_(?P\d{4}_\d{3}_\d{2}_\d{2}_\d{2}_\d{5})(?P\d{1,4})\.ldf)$' evaluators: # If the regex matches, the router submits a JSON payload to the eval_nisar_l0a_readiness DAG via Airflow REST API @@ -132,7 +132,7 @@ initiator_config: # AIRS RetStd example: scheduled task to periodically check for new AIRS granules published to CMR - regexes: - - !!python/regexp '/(?P(?PAIRS)\.(?P\d{4})\.(?P\d{2})\.(?P\d{2})\.(?P\d{3})\.(?PL.+?)\.(?P.+?)\.(?Pv\d+)\.(?P\d+)\.(?P\d+)\.(?P\d+)\.(?P.+?).hdf)$' + - '/(?P(?PAIRS)\.(?P\d{4})\.(?P\d{2})\.(?P\d{2})\.(?P\d{3})\.(?PL.+?)\.(?P.+?)\.(?Pv\d+)\.(?P\d+)\.(?P\d+)\.(?P\d+)\.(?P.+?).hdf)$' evaluators: # If the regex matches, the router submits a JSON payload to the eval_airs_ingest SNS topic that contains @@ -154,3 +154,19 @@ initiator_config: airflow_base_api_endpoint: xxx airflow_username: airflow_password: + + + - regexes: + - '(?<=/)(?Phello_world\.txt)$' + evaluators: + - name: eval_hello_world_readiness + actions: + - name: submit_ogc_process_execution + params: + process_id: eval_hello_world_readiness + ogc_processes_base_api_endpoint: ${ogc_processes_base_api_endpoint} + on_success: + actions: + - name: submit_ogc_process_execution + params: + process_id: hello_world
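Since the router configuration regexes are now plain strings (the `!!python/regexp` tags are gone) and are compiled by the router at load time, a configuration file can be sanity-checked before deployment. Below is a minimal sketch, assuming the schema in `routers_schema.yaml` is consumed with the yamale library (which its `include()`/`any()` syntax suggests); the project may wire validation up differently:

```
# Sketch: validate a router configuration YAML against the initiator schema.
# Assumes the yamale library; paths are relative to the repository root.
import re

import yamale

schema = yamale.make_schema("src/unity_initiator/resources/routers_schema.yaml")
data = yamale.make_data("tests/resources/test_router.yaml")
yamale.validate(schema, data)  # raises an error if the config does not match the schema

# Separately check that every regex string compiles, mirroring Router._compile_regexes().
for doc, _ in data:
    for url_cfg in doc["initiator_config"]["payload_type"].get("url", []):
        for regex in url_cfg.get("regexes", []):
            re.compile(regex)  # raises re.error on a malformed pattern
```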