Merge Release/2024 q2 into master #41

Merged · 26 commits merged into master on Jun 14, 2024

Commits (26)
0ea88b8
Merge pull request #38 from usdot-jpo-ode/master
dan-du-car Feb 28, 2024
5b80bae
Update devcontainer to use Java 21
mwodahl Mar 27, 2024
ae90855
Merge pull request #19 from CDOT-CV/Update/devcontainer
dmccoystephenson Apr 2, 2024
1f9a7a0
Removed travis build status from README
dmccoystephenson May 7, 2024
d825b7d
Removed quality gate status from README
dmccoystephenson May 7, 2024
c4af8cf
Changed 'Situation Data Warehouse' to 'Situational Data Exchange' in …
dmccoystephenson May 7, 2024
63d53e7
Adjusted headers in README to increase clarity
dmccoystephenson May 7, 2024
c0608e4
Reworded 'Overview' section of README & added link to SDX REST API do…
dmccoystephenson May 7, 2024
f8a0d74
Corrected capitalization in 'Release Notes' section of README
dmccoystephenson May 7, 2024
509cfb2
Revised 'Installation and Operation' section of README
dmccoystephenson May 7, 2024
c970ddd
Revised 'Configuration Reference' section of README
dmccoystephenson May 8, 2024
00ca2ab
Moved 'Confluent Cloud Integration' section to bottom of README
dmccoystephenson May 8, 2024
440ea3d
Revised 'Object Data Consumption' section of README
dmccoystephenson May 8, 2024
77cfdac
Corrected KAFKA_TYPE description
dmccoystephenson May 8, 2024
7f53b99
Fixed local kafka installation instructions referencing CC docker-com…
dmccoystephenson May 15, 2024
cd4c731
Removed SDW_GROUP_ID from sample.env & README
dmccoystephenson May 15, 2024
87a7573
Merge pull request #20 from CDOT-CV/docs/reviewing-and-revising-docum…
dmccoystephenson May 20, 2024
5a0cb88
Updated `Release_notes.md` for 1.7.0 release
dmccoystephenson May 24, 2024
8d1086e
Changed version to 1.7.0-SNAPSHOT
dmccoystephenson May 28, 2024
52a1118
Merge pull request #22 from CDOT-CV/version/change-version-to-1.7.0-S…
payneBrandon May 31, 2024
48236b9
Merge pull request #21 from CDOT-CV/docs/update-release-notes-2024-q2
payneBrandon May 31, 2024
6dd4020
Uncommented SDW_DESTINATION_URL in `sample.env` file
dmccoystephenson Jun 11, 2024
d2cd9d3
Merge pull request #23 from CDOT-CV/pr/addressing-usdot-comments-6-11…
dmccoystephenson Jun 11, 2024
b58f5aa
Merge pull request #39 from CDOT-CV/develop
dan-du-car Jun 11, 2024
e0cefd2
Update dockerhub.yml
SaikrishnaBairamoni Jun 12, 2024
95679d9
Merge pull request #40 from usdot-jpo-ode/develop
dan-du-car Jun 12, 2024
44 changes: 23 additions & 21 deletions .devcontainer/Dockerfile
@@ -1,28 +1,30 @@
# See here for image contents: https://github.com/microsoft/vscode-dev-containers/tree/v0.217.4/containers/java/.devcontainer/base.Dockerfile
# Install openJDK version 21 (includes maven, gradle, and node)
FROM cimg/openjdk:21.0.2-node

# [Choice] Java version (use -bullseye variants on local arm64/Apple Silicon): 11, 17, 11-bullseye, 17-bullseye, 11-buster, 17-buster
ARG VARIANT="17"
FROM mcr.microsoft.com/vscode/devcontainers/java:0-${VARIANT}
# set user to root to allow apt-get to run
USER root

# [Option] Install Maven
ARG INSTALL_MAVEN="true"
ARG MAVEN_VERSION="3.6.3"
# [Option] Install Gradle
ARG INSTALL_GRADLE="false"
ARG GRADLE_VERSION=""
RUN if [ "${INSTALL_MAVEN}" = "true" ]; then su vscode -c "umask 0002 && . /usr/local/sdkman/bin/sdkman-init.sh && sdk install maven \"${MAVEN_VERSION}\""; fi \
&& if [ "${INSTALL_GRADLE}" = "true" ]; then su vscode -c "umask 0002 && . /usr/local/sdkman/bin/sdkman-init.sh && sdk install gradle \"${GRADLE_VERSION}\""; fi
ARG USERNAME=vscode
ARG USER_UID=1000
ARG USER_GID=$USER_UID

# [Choice] Node.js version: none, lts/*, 16, 14, 12, 10
ARG NODE_VERSION="none"
RUN if [ "${NODE_VERSION}" != "none" ]; then su vscode -c "umask 0002 && . /usr/local/share/nvm/nvm.sh && nvm install ${NODE_VERSION} 2>&1"; fi

# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
# && apt-get -y install --no-install-recommends <your-package-list-here>
# Create non-root user vscode with sudo support
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update \
#
# Create a non-root user to use if preferred - see https://aka.ms/vscode-remote/containers/non-root-user.
&& groupadd --gid $USER_GID $USERNAME \
&& useradd -s /bin/bash --uid $USER_UID --gid $USER_GID -m $USERNAME \
&& apt-get install -y sudo \
&& echo $USERNAME ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/$USERNAME\
&& chmod 0440 /etc/sudoers.d/$USERNAME

# [Optional] Uncomment this line to install global node packages.
# RUN su vscode -c "source /usr/local/share/nvm/nvm.sh && npm install -g <your-package-here>" 2>&1
# RUN npm install -g <your-package-list-here>

# install kafkacat for testing purposes
RUN apt-get update && apt-get install -y kafkacat

# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
# && apt-get -y install --no-install-recommends <your-package-list-here>
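If you want to sanity-check the rebuilt dev container image outside of VS Code, a minimal standalone build works; the `sdw-devcontainer` tag below is just an illustrative name:

```
docker build -t sdw-devcontainer .devcontainer   # illustrative tag name
docker run --rm sdw-devcontainer java -version   # should report OpenJDK 21
docker run --rm sdw-devcontainer kafkacat -V     # confirms kafkacat was installed
```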
15 changes: 5 additions & 10 deletions .devcontainer/devcontainer.json
@@ -4,16 +4,6 @@
"name": "Java",
"build": {
"dockerfile": "Dockerfile",
"args": {
// Update the VARIANT arg to pick a Java version: 11, 17
// Append -bullseye or -buster to pin to an OS version.
// Use the -bullseye variants on local arm64/Apple Silicon.
"VARIANT": "11",
// Options
"INSTALL_MAVEN": "true",
"INSTALL_GRADLE": "false",
"NODE_VERSION": "none"
}
},

// Set *default* container specific settings.json values on container create.
@@ -26,6 +16,11 @@
"vscjava.vscode-java-pack"
],


"containerEnv": {
"SHELL": "/bin/bash"
},

// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [],

7 changes: 6 additions & 1 deletion .github/workflows/dockerhub.yml
@@ -20,8 +20,13 @@ jobs:
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}

- name: Replace Docker tag
id: set_tag
run: echo "TAG=$(echo ${GITHUB_REF##*/} | sed 's/\//-/g')" >> $GITHUB_ENV

- name: Build
uses: docker/build-push-action@v5
with:
push: true
tags: usdotjpoode/jpo-sdw-depositor:${{ github.ref_name }}
tags: usdotjpoode/jpo-sdw-depositor:${{ env.TAG }}
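The new `set_tag` step derives a Docker-safe tag from the Git ref. A quick local sketch of what the expansion produces (branch names are hypothetical); note that `${GITHUB_REF##*/}` keeps only the segment after the last slash, so a branch like `release/2024-q2` yields `2024-q2` rather than `release-2024-q2`:

```
# Hypothetical refs run through the same expansion and sed as the workflow step
GITHUB_REF="refs/heads/master"
echo "TAG=$(echo ${GITHUB_REF##*/} | sed 's/\//-/g')"   # TAG=master

GITHUB_REF="refs/heads/release/2024-q2"
echo "TAG=$(echo ${GITHUB_REF##*/} | sed 's/\//-/g')"   # TAG=2024-q2
```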
4 changes: 2 additions & 2 deletions Dockerfile
@@ -14,8 +14,8 @@ RUN mvn clean package -DskipTests
FROM eclipse-temurin:21-jre-alpine

WORKDIR /home
COPY --from=builder /home/target/jpo-sdw-depositor-1.6.0-SNAPSHOT.jar /home
COPY --from=builder /home/target/jpo-sdw-depositor-1.7.0-SNAPSHOT.jar /home

ENTRYPOINT ["java", \
"-jar", \
"/home/jpo-sdw-depositor-1.6.0-SNAPSHOT.jar"]
"/home/jpo-sdw-depositor-1.7.0-SNAPSHOT.jar"]
113 changes: 67 additions & 46 deletions README.md
@@ -1,95 +1,116 @@
# jpo-sdw-depositor [![Build Status](https://travis-ci.org/usdot-jpo-ode/jpo-sdw-depositor.svg?branch=dev)](https://travis-ci.org/usdot-jpo-ode/jpo-sdw-depositor) [![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=usdot.jpo.ode%3Ajpo-sdw-depositor&metric=alert_status)](https://sonarcloud.io/dashboard?id=usdot.jpo.ode%3Ajpo-sdw-depositor)
# jpo-sdw-depositor

Subscribes to a Kafka topic and deposits messages to the Situation Data Warehouse (SDW).
Subscribes to a Kafka topic and deposits messages to the [Situational Data Exchange (SDX)](https://sdx.trihydro.com/).

# Overview

This is a submodule of the [jpo-ode](https://github.com/usdot-jpo-ode/jpo-ode) repository. It subscribes to a Kafka topic and listens for incoming messages. Upon message arrival, this application deposits it over REST to the SDX.
## Overview

This is a submodule of the [jpo-ode](https://github.com/usdot-jpo-ode/jpo-ode) repository. It subscribes to a Kafka topic and listens for incoming messages. Upon message arrival, this application deposits the message to the SDX via [REST API](https://sdx-service.trihydro.com/index.html).

## Release Notes
The current version and release history of the Jpo-sdw-depositor: [Jpo-sdw-depositor Release Notes](<docs/Release_notes.md>)
The current version and release history of the jpo-sdw-depositor project: [jpo-sdw-depositor Release Notes](<docs/Release_notes.md>)

# Installation and Operation
## Installation and Operation

### Requirements

- Docker
- [Kafka](https://kafka.apache.org/)
- [Docker](https://www.docker.com/)

### Option 1: As ODE submodule
The jpo-sdw-depositor is intended to be run as a submodule of the [jpo-ode](https://github.com/usdot-jpo-ode/jpo-ode) project. The ODE project repository includes a docker-compose file that will run the depositor in conjunction with the ODE by default. The same environment variables mentioned in the [Configuration Reference](#configuration-reference) below will need to be set in the `.env` file in the root of the ODE project.

### Option 1: Standalone
### Option 2: Standalone (Depositor Only) with Remote Kafka

Use this option when you want to run the depositor by itself. This will listen to any Kafka topic you specify and deposit messages to the Situation Data Exchange.
Use this option when you want to run the depositor by itself and you already have a Kafka cluster running remotely. This option will run the depositor in a Docker container and connect to a remote Kafka cluster to listen for messages. The depositor will then deposit these messages to the SDX.

1. Configure your desired properties. See **Configuration Reference** at the bottom of this README.
2. Rename your `sample.env` file to `.env` if you haven't already done so
3. Execute the `run.sh` script OR execute these commands:
1. Rename your `sample.env` file to `.env`. This file contains the environment variables that the application will use to connect to Kafka and the SDX.
1. Configure your environment variables in the `.env` file. See the [Configuration Reference](#configuration-reference) below.
1. Execute the `run.sh` script OR execute these commands:

```
docker build -t jpo-sdw-depositor .
docker run --rm --env-file .env jpo-sdw-depositor:latest
```

### Option 3: With Local Kafka
Use this option when you want to run the depositor and you want to run a local Kafka cluster alongside it. This option will run the depositor and a Kafka cluster in Docker containers. The depositor will listen for messages on the local Kafka cluster and deposit them to the SDX.

### Option 2: As ODE submodule
1. Rename your `sample.env` file to `.env`. This file contains the environment variables that the application will use to connect to Kafka and the SDX.
1. Configure your environment variables in the `.env` file. See the [Configuration Reference](#configuration-reference) below.
1. Run the following command:

** IN PROGRESS! Further instructions pending ODE compatibility. **
```
docker compose up --build
```

Use this option when you want to run this module in conjuction with the [jpo-ode](https://github.com/usdot-jpo-ode/jpo-ode). The only action you must take here is to set the configuration properties in the env file. See the bottom of this README for a reference.
### Option 4: With Confluent Cloud Kafka
Use this option when you want to run the depositor and you want to connect to a Kafka cluster hosted by Confluent Cloud. This option will run the depositor in a Docker container and connect to a Kafka cluster hosted by Confluent Cloud to listen for messages. The depositor will then deposit these messages to the SDX.

1. Rename your `sample.env` file to `.env`. This file contains the environment variables that the application will use to connect to Kafka and the SDX.
1. Configure your environment variables in the `.env` file. See the [Configuration Reference](#configuration-reference) below.
1. Run the following command:

```
docker compose -f docker-compose-confluent-cloud.yml up --build
```

See the [Confluent Cloud Integration](#confluent-cloud-integration) section for more information.

# Configuration Reference
## Configuration Reference

**SOME OF THESE PROPERTIES ARE SENSITIVE. DO NOT PUBLISH THEM TO VERSION CONTROL**

You may configure these values in `jpo-sdw-depositor/src/main/resources/application.properties` or by editing them in the `sample.env` file.
It is recommended to use environment variables to configure the application, rather than hardcoding values in the `application.properties` file. This allows for easier configuration management and better security.

Alternatively, you can configure the application by editing the [application.properties](src/main/resources/application.properties) file.

**IMPORTANT** When using the env file method, you must You must rename or duplicate the `sample.env` file to `.env`
**IMPORTANT** When using the env file method, you must rename or duplicate the `sample.env` file to `.env` and fill in the values for the environment variables. The `.env` file is used to pass environment variables to the Docker container.


| Value in `application.properties` | Value as env var (in sample.env) | Description | Example Value |
|-----------------------------------|----------------------------------|-------------------------------------------------------|-----------------------------|
| sdw.kafkaBrokers | DOCKER_HOST_IP | Host IP ([instructions](https://github.com/usdot-jpo-ode/jpo-ode/wiki/Docker-management#obtaining-docker_host_ip)) | 10.1.2.3 |
| sdw.groupId | SDW_GROUP_ID | The Kafka group id to be used for message consumption | usdot.jpo.sdw |
| sdw.kafkaPort | SDW_KAFKA_PORT | Port of the Kafka instance | 9092 |
| sdw.subscriptionTopic | SDW_SUBSCRIPTION_TOPIC | Kafka topic to listen to | topic.J2735TimBroadcastJson |
| sdw.kafkaBrokers | DOCKER_HOST_IP | Host IP ([instructions](https://github.com/usdot-jpo-ode/jpo-ode/wiki/Docker-management#obtaining-docker_host_ip)) | 10.1.2.3 |
| sdw.subscriptionTopics | SDW_SUBSCRIPTION_TOPIC | Kafka topic to listen to | topic.J2735TimBroadcastJson |
| sdw.destinationUrl | SDW_DESTINATION_URL | Full path of the SDX server address | 127.0.0.1 |
| sdw.apikey | SDW_API_KEY | SDX API Key (generated by [SDX](https://sdx.trihydro.com)) | (n/a) |
| sdw.emailList | SDW_EMAIL_LIST | Comma-delimited email list to send error emails to | [email protected],[email protected] |
| sdw.emailFrom | SDW_EMAIL_FROM | Support email to send from | [email protected] |
| N/A | KAFKA_TYPE | Type of Kafka connection to be used. Must be set to CONFLUENT, otherwise the application will default to a non-Confluent connection | CONFLUENT |
| N/A | CONFLUENT_KEY | Confluent Cloud API Key | (n/a) |
| N/A | CONFLUENT_SECRET | Confluent Cloud API Secret | (n/a) |
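As an illustration, a minimal `.env` for a plain (non-Confluent) setup might look like the sketch below; every value is a placeholder drawn from the examples above:

```
# Illustrative .env (placeholder values only; do not commit real credentials)
DOCKER_HOST_IP=10.1.2.3
SDW_SUBSCRIPTION_TOPIC=topic.J2735TimBroadcastJson
SDW_DESTINATION_URL=https://sdx-service.trihydro.com/api/deposit-multi
SDW_API_KEY=<your api key>
SDW_EMAIL_LIST=[email protected],[email protected]
SDW_EMAIL_FROM=[email protected]
```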

# Confluent Cloud Integration
## Unit Testing
The unit tests can be run by executing the following command from the root directory of the project:
```
mvn test
```

It should be noted that Maven & Java are required to run the unit tests. If you do not have Maven or Java installed, you can reopen the project in the provided dev container and run the tests from there.

## Object Data Consumption
The KafkaConsumerRestDepositor will accept any string as input to be passed into the SDX. If provided a JSON object, the tokens of "encodedMsg" and "estimatedRemovalDate" will be passed through directly to the SDX in the form of the following:
> {depositRequests:[{"encodeType": STRING ,"encodedMsg": STRING, "estimatedRemovalDate": STRING}]}

If provided a string of non-JSON form, the value of "encodedMsg" will inherit the passed value and the information will be passed to the SDX in the form of the following:
> {depositRequests:[{"encodeType": STRING ,"encodedMsg": STRING}]}
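As a concrete sketch (the broker address, topic name, and payload are all hypothetical), the kafkacat utility installed in the dev container can publish a test message for the depositor to pick up:

```
# Publish a hypothetical JSON message to the subscribed topic
echo '{"encodedMsg":"C0FFEE","estimatedRemovalDate":"2024-07-01T00:00:00Z"}' \
  | kafkacat -P -b localhost:9092 -t topic.J2735TimBroadcastJson
# The depositor would then POST a body of the following form to the SDX
# (the "encodeType" value is not specified here, so it is shown as a placeholder):
#   {depositRequests:[{"encodeType": <type>, "encodedMsg": "C0FFEE", "estimatedRemovalDate": "2024-07-01T00:00:00Z"}]}
```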

## Confluent Cloud Integration
Rather than using a local kafka instance, this project can utilize an instance of kafka hosted by Confluent Cloud via SASL.

## Environment variables
### Purpose & Usage
### Environment variables
#### Purpose & Usage
- The DOCKER_HOST_IP environment variable is used to communicate with the bootstrap server that the instance of Kafka is running on.
- The KAFKA_TYPE environment variable specifies what type of kafka connection will be attempted and is used to check if Confluent should be utilized.
- The CONFLUENT_KEY and CONFLUENT_SECRET environment variables are used to authenticate with the bootstrap server.

### Values
#### Values
- DOCKER_HOST_IP must be set to the bootstrap server address (excluding the port)
- KAFKA_TYPE must be set to "CONFLUENT"
- KAFKA_TYPE must be set to "CONFLUENT", otherwise the application will default to a non-Confluent connection
- CONFLUENT_KEY must be set to the API key being utilized for CC
- CONFLUENT_SECRET must be set to the API secret being utilized for CC
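Putting those values together, the Confluent-specific portion of a `.env` might look like this sketch (the bootstrap address and credentials are placeholders):

```
# Illustrative Confluent Cloud settings (placeholder values only)
DOCKER_HOST_IP=pkc-xxxxx.us-east-2.aws.confluent.cloud
KAFKA_TYPE=CONFLUENT
CONFLUENT_KEY=<your CC API key>
CONFLUENT_SECRET=<your CC API secret>
```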

## CC Docker Compose File
### CC Docker Compose File
There is a provided docker-compose file (docker-compose-confluent-cloud.yml) that passes the above environment variables into the container that gets created. Further, this file doesn't spin up a local kafka instance since it is not required.

## Note
This has only been tested with Confluent Cloud but technically all SASL authenticated Kafka brokers can be reached using this method.

# Unit Testing
The unit tests can be run by executing the following command from the root directory of the project:
```
mvn test
```

It should be noted that Maven & Java are required to run the unit tests. If you do not have Maven or Java installed, you can reopen the project in the provided dev container and run the tests from there.

# Object data consumption
The KafkaConsumerRestDepositor will accept any string as input to be passed into the SDW. If provided a JSON object, the tokens of "encodedMsg" and "estimatedRemovalDate" will be passed through directly to the SDW in the form of the following:
{depositRequests:[{"encodeType": STRING ,"encodedMsg": STRING, "estimatedRemovalDate": STRING}]}

If provided a string of non-json form, the value of "encodedMsg" will inherit the passed value and information will be passed to the SDW in the form of the following:
{depositRequests:[{"encodeType": STRING ,"encodedMsg": STRING}]}
### Note
This has only been tested with Confluent Cloud but technically all SASL authenticated Kafka brokers can be reached using this method.
12 changes: 11 additions & 1 deletion docs/Release_notes.md
@@ -1,6 +1,16 @@
Jpo-sdw-depositor Release Notes
jpo-sdw-depositor Release Notes
----------------------------

Version 1.7.0, released June 2024
----------------------------------------
### **Summary**
The changes for the jpo-sdw-depositor 1.7.0 release include a Java update for the dev container, as well as revised documentation for accuracy and improved clarity/readability.

Enhancements in this release
- CDOT PR 19: Updated dev container to use Java 21
- CDOT PR 20: Revised documentation for accuracy & improved clarity/readability


Version 1.6.0, released February 2024
----------------------------------------

2 changes: 1 addition & 1 deletion pom.xml
@@ -13,7 +13,7 @@
</parent>
<groupId>usdot.jpo.ode</groupId>
<artifactId>jpo-sdw-depositor</artifactId>
<version>1.6.0-SNAPSHOT</version>
<version>1.7.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>jpo-sdw-depositor</name>

6 changes: 3 additions & 3 deletions sample.env
@@ -1,13 +1,13 @@
DOCKER_HOST_IP=
#SDW_GROUP_ID=usdot.jpo.sdw
#SDW_KAFKA_PORT=9092
#SDW_DESTINATION_URL=https://webapp-integration.cvmvp.com/whtools/rest/v2/
SDW_DESTINATION_URL=https://sdx-service.trihydro.com/api/deposit-multi
SDW_SUBSCRIPTION_TOPIC=<your topic to subscribe to for deposit>
SDW_API_KEY=<your api key>
SDW_EMAIL_LIST=
SDW_EMAIL_FROM=
SPRING_MAIL_HOST=
SPRING_MAIL_PORT=

# Type of Kafka connection to be used. Must be set to CONFLUENT, otherwise the application will default to a non-Confluent connection
KAFKA_TYPE=
CONFLUENT_KEY=
CONFLUENT_SECRET=
1 change: 0 additions & 1 deletion src/main/resources/application.properties
@@ -12,7 +12,6 @@ version=${project.version}
#Input Properties
#================
#sdw.kafkaBrokers=localhost
#sdw.kafkaPort=9092
#sdw.subscriptionTopics = topic.example1 topic.example2

#Output Properties