test script updates #204

Merged: 5 commits, Jul 4, 2024
7 changes: 6 additions & 1 deletion README.md
@@ -24,12 +24,17 @@ Tests

In the `test/` directory, there is a test suite that tries to make sure that no previously supported log line will break because of changing common patterns and such. It also returns results a lot faster than doing `sudo service logstash restart` :-).

- The test suite needs the patterns provided by Logstash; you can easily pull these from GitHub by running `git submodule update --init`. To run the test suite, you need a recent version of `ruby` (`2.6` or newer should work) and the `jls-grok` and `minitest` gems. Then simply execute `ruby test/test.rb`. NOTE: The whole test process can now be executed inside a Docker container, simply by running the `runtests.sh` script.
+ The test suite needs the patterns provided by Logstash; you can easily pull these from GitHub by running `git submodule update --init`. To run the test suite, you need a recent version of `ruby` (`2.6` or newer should work) and the `jls-grok` and `minitest` gems. Then simply execute `ruby test/test.rb`. NOTE: The whole test process can now be executed inside a Docker container, simply by running the `test_grok_patterns.sh` script.

Adding new test cases can easily be done by creating new yaml files in the test directory. Each file specifies a grok pattern to validate, a sample log line, and a list of expected results.
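The exact YAML schema is not shown in this diff, so the sketch below is only illustrative: the field names (`pattern`, `input`, `expected`) and the pattern name are assumptions, and the sample values are hypothetical. Check an existing `test/*.yaml` file for the schema that `test/test.rb` actually expects before adding a case.

```shell
#!/bin/sh
# Sketch: scaffold a new test case file. The YAML keys used here
# (pattern, input, expected) and the pattern name POSTFIX_SMTP are
# illustrative assumptions; consult an existing test/*.yaml file for
# the real schema before committing a new case.
casefile=$(mktemp tmp.testcase.XXXXX)
cat > "$casefile" << 'EOF'
pattern: '%{POSTFIX_SMTP}'
input: 'to=<user@example.com>, status=sent (250 2.0.0 Ok)'
expected:
  status: sent
EOF
echo "wrote $casefile"
```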

Also, the example Logstash config file adds some informative tags that aid in finding grok failures and unparsed lines. If you're not interested in those, you can remove all occurrences of `add_tag` and `tag_on_failure` from the config file.
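As a rough illustration of what those directives look like in a grok filter (an illustrative fragment, not the project's actual `50-filter-postfix.conf`; the pattern name and failure tag are assumptions, though `_grok_postfix_success` matches the tag checked by `test_pipeline.sh`):

```
filter {
  grok {
    patterns_dir   => ["/etc/logstash/patterns.d"]
    match          => { "message" => "%{POSTFIX_SMTP}" }
    add_tag        => ["_grok_postfix_success"]
    tag_on_failure => ["_grok_postfix_failure"]
  }
}
```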

Additional test scripts are available for local tests (using docker containers):
- `test_grok_patterns.sh`: runs the test suite for the grok patterns in `postfix.grok`
- `test_logstash_config.sh`: validates the logstash config in `50-filter-postfix.conf`
- `test_pipeline.sh`: validates that the logstash config can be used in a simple logstash pipeline, and ensures that this results in parsed messages

Contributing
------------

12 changes: 9 additions & 3 deletions test_config_syntax.sh
@@ -1,9 +1,15 @@
#!/bin/sh

#
# This script is used to test the config syntax of the 50-filter-postfix.conf file.
#
# The configuration file is validated using the logstash --config.test_and_exit command in a docker container.
#

set -eux

docker run --rm -it \
-  --volume $(pwd)/postfix.grok:/etc/logstash/patterns.d/postfix.grok \
-  --volume $(pwd)/50-filter-postfix.conf:/usr/share/logstash/pipeline/50-filter-postfix.conf \
-  logstash:8.12.0 \
+  --volume "$(pwd)"/postfix.grok:/etc/logstash/patterns.d/postfix.grok \
+  --volume "$(pwd)"/50-filter-postfix.conf:/usr/share/logstash/pipeline/50-filter-postfix.conf \
+  logstash:8.14.1 \
logstash --config.test_and_exit -f /usr/share/logstash/pipeline/50-filter-postfix.conf
8 changes: 7 additions & 1 deletion test_grok_patterns.sh
@@ -1,5 +1,11 @@
#!/bin/sh

#
# This script is used to test the grok patterns in the postfix.grok file.
#
# The patterns are tested by running the test suite (in test/test.rb and test/*.yaml)
# against the patterns in the postfix.grok file in a docker container.
#
set -eux

DOCKERIMAGE="postfix-grok-patterns-runtests"
@@ -12,4 +18,4 @@ FROM ruby:slim
RUN gem install jls-grok minitest
EOF

-docker run --volume $(pwd):"${VOLUMEPATH}" --workdir ${VOLUMEPATH} ${DOCKERIMAGE} sh -c "ruby test/test.rb"
+docker run --volume "$(pwd)":"${VOLUMEPATH}" --workdir ${VOLUMEPATH} ${DOCKERIMAGE} sh -c "ruby test/test.rb"
76 changes: 76 additions & 0 deletions test_pipeline.sh
@@ -0,0 +1,76 @@
#!/bin/sh

#
# This script is used to test the logstash pipeline configuration.
#
# It sets up a logstash pipeline with the postfix configuration,
# sends a test logline through the pipeline and checks the results.
#

set -eux

INPUT=$(mktemp tmp.logstash.in.XXXXX)
OUTPUT=$(mktemp tmp.logstash.out.XXXXX)
PIPELINE=$(mktemp tmp.logstash.pipeline.XXXXX)

echo Preparing input data
echo "postfix/smtp[123]: 7EE668039: to=<[email protected]>, relay=127.0.0.1[127.0.0.1]:2525, delay=3.6, delays=0.2/0.02/0.04/3.3, dsn=2.0.0, status=sent (250 2.0.0 Ok: queued as 153053D)" > "$INPUT"

echo Preparing pipeline config
cat > "$PIPELINE" << EOF
input {
file {
path => "/tmp/logstash.in"
start_position => beginning
}
}
filter {
dissect {
mapping => {
"message" => "%{program}[%{pid}]: %{message}"
}
}
}
EOF

cat 50-filter-postfix.conf >> "$PIPELINE"

cat >> "$PIPELINE" << EOF
output {
file {
path => "/tmp/logstash.out"
}
}
EOF

echo Starting logstash docker container
CONTAINER_ID=$(docker run --rm --detach \
--volume ./"${INPUT}":/tmp/logstash.in \
--volume ./"${OUTPUT}":/tmp/logstash.out \
--volume ./postfix.grok:/etc/logstash/patterns.d/postfix.grok \
--volume ./"${PIPELINE}":/usr/share/logstash/pipeline/pipeline.conf \
logstash:8.12.0 \
logstash -f /usr/share/logstash/pipeline/pipeline.conf)

printf "Waiting for output from logstash "
until test -s "$OUTPUT"; do
printf "."
sleep 2
done
echo

docker stop --time 1 "$CONTAINER_ID" > /dev/null

if test "$(jq .tags[0] "$OUTPUT")" = '"_grok_postfix_success"'; then
echo Grok processing successful!
jq . "$OUTPUT"
else
echo "Grok processing failed :<"
jq . "$OUTPUT"
exit 1
fi

echo Cleaning up
rm -f "$INPUT" "$OUTPUT" "$PIPELINE"

echo Done
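The `dissect` mapping above (`"%{program}[%{pid}]: %{message}"`) can be mimicked with plain shell parameter expansion to see which fields the test line yields. A sketch, using a shortened version of the sample line from `test_pipeline.sh`:

```shell
#!/bin/sh
# Emulate the dissect mapping "%{program}[%{pid}]: %{message}" on a
# shortened version of the sample log line used by test_pipeline.sh.
line='postfix/smtp[123]: 7EE668039: status=sent (250 2.0.0 Ok: queued as 153053D)'
prefix=${line%%]:*}      # everything before the first "]:" -> postfix/smtp[123
program=${prefix%%\[*}   # strip from the "[" onward  -> postfix/smtp
pid=${prefix##*\[}       # keep what follows the "["  -> 123
message=${line#*]: }     # everything after the first "]: "
echo "program=$program pid=$pid message=$message"
```

This mirrors what the pipeline's first filter stage does: the container logs lines prefixed with `program[pid]:`, and dissect peels that prefix off before the postfix grok patterns run on the remaining `message`.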