Add unit and regression tests for conformance tests (#531)
Resolves #476.

Co-authored-by: Matthias Büchse <[email protected]>
Signed-off-by: Martin Morgenstern <[email protected]>
martinmo and mbuechse authored Mar 22, 2024
1 parent 73e9242 commit 2cf8efd
Showing 14 changed files with 558 additions and 68 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/lint-python.yml
@@ -11,9 +11,9 @@ jobs:
   lint-python-syntax:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
         with:
-          python-version: '3.x'
+          python-version: '3.10'
       - run: pip3 install flake8
       - run: flake8
26 changes: 26 additions & 0 deletions .github/workflows/test-python.yml
@@ -0,0 +1,26 @@
+---
+name: Run Python unit and regression tests
+
+"on":
+  push:
+    paths:
+      - '**.py'
+      - .github/workflows/test-python.yml
+
+jobs:
+  run-pytest-tests:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
+        with:
+          python-version: '3.10'
+          cache: 'pip'
+          cache-dependency-path: |
+            Tests/requirements.txt
+            Tests/test-requirements.txt
+      - run: pip3 install --upgrade pip setuptools
+      - run: pip3 install -r requirements.txt -r test-requirements.txt
+        working-directory: ./Tests
+      - run: pytest --cov
+        working-directory: ./Tests
3 changes: 3 additions & 0 deletions Tests/.coveragerc
@@ -0,0 +1,3 @@
+[run]
+omit = *_test.py
+branch = True
2 changes: 2 additions & 0 deletions Tests/.gitignore
@@ -0,0 +1,2 @@
+htmlcov/
+.coverage
60 changes: 51 additions & 9 deletions Tests/README.md
@@ -70,20 +70,62 @@ docker run -it --env OS_CLOUD=CLOUDNAME -v ~/.config/openstack:/root/.config/ope
 docker run -it --env OS_CLOUD=CLOUDNAME -v ~/.config/openstack:/root/.config/openstack:ro --entrypoint /bin/bash scs-compliance-check
 ```
 
-## Maintaining the Python dependencies
+## Information for developers
 
-We manage Python dependencies in two files: `requirements.in` and `requirements.txt`.
-The former one is meant to be edited by humans, whereas the latter one is
-generated by `pip-compile` and contains an exact listing of *all* dependencies,
-including transitive ones.
+### Unit and regression tests
+
+Some of the conformance test scripts are themselves tested with unit tests.
+
+To run them, first ensure that you have installed the unit test dependencies
+in addition to the main dependencies (inside your [virtualenv as described
+above](#local-execution-linux-bsd-)):
+
+```shell
+pip install -r test-requirements.txt
+```
+
+Now you can run the unit tests with `pytest`:
+
+```shell
+# Option A: let pytest discover and run all unit tests (**/*_test.py)
+pytest
+
+# Option B: run only a subset of the tests
+pytest kaas/k8s-version-policy/k8s_version_policy_test.py
+
+# Option C: produce an HTML code coverage report and open it
+pytest --cov --cov-report=html
+xdg-open htmlcov/index.html
+```
+
+You are encouraged to cover new conformance tests with unit tests!
+We run the tests on a regular basis in our GitHub workflows.
+
+### Maintaining the Python dependencies
+
+We list our main Python dependencies in `requirements.in`. Additionally, we list
+[unit test](#unit-and-regression-tests) dependencies in `test-requirements.in`.
+The `*.in` files are fed to `pip-compile` to produce corresponding `*.txt` files
+that contain an exact, version-pinned listing of *all* dependencies, including
+transitive ones.
 
 `pip-compile` can be installed via `pip install pip-tools`.
 It needs to be run in two cases:
 
-1. You modified `requirements.in` (e.g., added a new dependency): run
-   `pip-compile requirements.in`.
-2. You want to bump the pinned dependencies: run `pip-compile --upgrade requirements.in`.
+1. You modified an `*.in` file: run `pip-compile <INFILE>`. For example:
+
+   ```shell
+   pip-compile test-requirements.in
+   ```
+
+2. You want to bump the pinned dependencies: add the `--upgrade` flag to the
+   `pip-compile` invocation. For example:
+
+   ```shell
+   pip-compile --upgrade requirements.in
+   ```
 
 Note: The Python version used for running `pip-compile` should be consistent. The currently
 used version is documented in the header of the `requirements.txt`. It should match the
-version used in the Docker image (see [Dockerfile](Dockerfile)).
+version used in the Docker image (see [Dockerfile](Dockerfile)) and in our GitHub
+workflows (`lint-python.yml` and `test-python.yml` in `.github/workflows`).
Tests/iaas/flavor-naming/check_yaml.py
@@ -8,9 +8,9 @@
 Also check that the flavor properties encoded in the YAML match those encoded in the name.
 """
-import os
-import os.path
+import sys
+from pathlib import Path
 
 import yaml
 
@@ -71,36 +71,40 @@ def check_spec(self, flavor_spec):
         )
 
 
-def main(argv):
-    if len(argv) != 2:
-        raise RuntimeError("must specify exactly one argument, PATH to the yaml directory")
-    yaml_path = argv[1]
-    yaml_files = sorted([
-        filename
-        for filename in os.listdir(yaml_path)
-        if filename.startswith("scs-0103-") and filename.endswith(".yaml")
-    ])
+def check(yaml_dir_path):
+    if not isinstance(yaml_dir_path, Path):
+        raise ValueError("yaml_dir_path must be a pathlib.Path")
+    yaml_paths = sorted(yaml_dir_path.glob("scs-0103-*.yaml"))
     main_checker = Checker()
-    if not yaml_files:
+    if not yaml_paths:
         main_checker.emit("no yaml files found!")
-    for filename in yaml_files:
-        with open(os.path.join(yaml_path, filename), "rb") as fileobj:
+    for yaml_path in yaml_paths:
+        with open(yaml_path, mode="rb") as fileobj:
             flavor_spec_data = yaml.safe_load(fileobj)
         if 'flavor_groups' not in flavor_spec_data:
-            main_checker.emit(f"file '{filename}': missing field 'flavor_groups'")
+            main_checker.emit(f"file '{yaml_path.name}': missing field 'flavor_groups'")
             continue
         checker = Checker()
         for group in flavor_spec_data['flavor_groups']:
            for flavor_spec in group['list']:
                checker.check_spec(flavor_spec)
         if checker.errors:
-            main_checker.emit(f"file '{filename}': found {checker.errors} errors")
+            main_checker.emit(f"file '{yaml_path.name}': found {checker.errors} errors")
     return main_checker.errors
 
 
+def validate_args(argv):
+    if len(sys.argv) != 2:
+        raise RuntimeError("must specify exactly one argument, PATH to the yaml directory")
+    yaml_path = Path(argv[1])
+    if not yaml_path.is_dir():
+        raise RuntimeError(f"not a directory: {yaml_path}")
+    return yaml_path
+
+
 if __name__ == "__main__":
     try:
-        sys.exit(main(sys.argv))
+        sys.exit(check(validate_args(sys.argv)))
     except Exception as e:
         print(f"CRITICAL: {e!s}", file=sys.stderr)
         sys.exit(1)
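
The refactor above replaces `os.listdir` plus manual prefix/suffix filtering with a single `pathlib` glob. A minimal standalone sketch of that pattern (the `find_specs` helper and the file names are made up for illustration, not part of the commit):

```python
import tempfile
from pathlib import Path


def find_specs(yaml_dir: Path):
    # Same pattern as the refactored check(): insist on a Path, then glob.
    if not isinstance(yaml_dir, Path):
        raise ValueError("yaml_dir must be a pathlib.Path")
    return sorted(yaml_dir.glob("scs-0103-*.yaml"))


with tempfile.TemporaryDirectory() as tmp:
    base = Path(tmp)
    (base / "scs-0103-v1-flavors.yaml").write_text("flavor_groups: []\n")
    (base / "unrelated.yaml").write_text("{}\n")
    names = [p.name for p in find_specs(base)]

print(names)  # only the scs-0103-* file matches
```

The `isinstance` guard mirrors the new `check()` entry point: callers pass a `Path`, so the glob pattern also encodes the old `startswith`/`endswith` pair in one expression.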
40 changes: 40 additions & 0 deletions Tests/iaas/flavor-naming/check_yaml_test.py
@@ -0,0 +1,40 @@
+"""
+Pytest based unit and regression tests for check_yaml.
+(c) Martin Morgenstern <[email protected]>, 3/2024
+SPDX-License-Identifier: CC-BY-SA-4.0
+"""
+
+import pytest
+from check_yaml import check
+from pathlib import Path
+
+
+HERE = Path(__file__).parent
+TEST_ROOT = HERE.parent.parent
+
+CLEAN_YAML_DIR = Path(TEST_ROOT, "iaas")
+BUGGY_YAML_DIR = Path(TEST_ROOT, "testing")
+
+EXPECTED_ERRORS = """
+ERROR: flavor 'SCS-1V-4': field 'cpu-type' contradicting name-v2 'SCS-1V-4'; found 'crowded-core', expected 'shared-core'
+ERROR: flavor 'SCS-2V-8': field 'name-v1' contradicting name-v2 'SCS-2V-8'; found 'SCS-2V-8', expected 'SCS-2V:8'
+ERROR: flavor 'SCS-4V-16': field 'ram' contradicting name-v2 'SCS-4V-16'; found 12, expected 16.0
+ERROR: flavor 'SCS-8V-32': field 'disk' contradicting name-v2 'SCS-8V-32'; found 128, expected undefined
+ERROR: flavor 'SCS-1V-2': field 'cpus' contradicting name-v2 'SCS-1V-2'; found 2, expected 1
+ERROR: flavor 'SCS-2V-4-20s': field 'disk0-type' contradicting name-v2 'SCS-2V-4-20s'; found 'network', expected 'ssd'
+ERROR: flavor 'SCS-4V-16-100s': field 'disk' contradicting name-v2 'SCS-4V-16-100s'; found 10, expected 100
+ERROR: file 'scs-0103-v1-flavors-wrong.yaml': found 7 errors
+""".strip()
+
+TEST_PARAMS = (
+    (CLEAN_YAML_DIR, 0, ""),
+    (BUGGY_YAML_DIR, 1, EXPECTED_ERRORS),
+)
+
+
+@pytest.mark.parametrize("directory, num_errors, expected_output", TEST_PARAMS)
+def test_check_yaml(capsys, directory, num_errors, expected_output):
+    assert check(directory) == num_errors
+    captured = capsys.readouterr()
+    assert captured.err.strip() == expected_output
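
The test above relies on pytest's `capsys` fixture to compare the stderr emitted by `check` against a golden string. The same capture can be sketched in plain Python with `contextlib.redirect_stderr`; the `check_stub` function below is a made-up stand-in for the real `check`, used only to show the capture-and-compare shape:

```python
import contextlib
import io
import sys


def check_stub(directory):
    # Hypothetical stand-in for check(): report one buggy file on stderr.
    if directory == "buggy":
        print("ERROR: file 'x.yaml': found 7 errors", file=sys.stderr)
        return 1
    return 0


buf = io.StringIO()
with contextlib.redirect_stderr(buf):
    rc = check_stub("buggy")

# Compare return code and captured stderr, as the pytest test does.
assert rc == 1
assert buf.getvalue().strip() == "ERROR: file 'x.yaml': found 7 errors"
print("stderr capture matched")
```

Stripping the captured text before comparing (as the real test does with `captured.err.strip()`) keeps the golden string free of leading/trailing newline noise.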
82 changes: 45 additions & 37 deletions Tests/kaas/k8s-version-policy/k8s_version_policy.py
@@ -24,7 +24,7 @@
 (c) Hannes Baum <[email protected]>, 6/2023
 (c) Martin Morgenstern <[email protected]>, 2/2024
 (c) Matthias Büchse <[email protected]>, 3/2024
-License: CC-BY-SA 4.0
+SPDX-License-Identifier: CC-BY-SA-4.0
 """
 
 from collections import Counter
@@ -46,7 +46,7 @@
 
 MINOR_VERSION_CADENCE = timedelta(days=120)
 PATCH_VERSION_CADENCE = timedelta(weeks=1)
-CVE_VERSION_CADENCE = timedelta(days=3)
+CVE_VERSION_CADENCE = timedelta(days=2)
 CVE_SEVERITY = 8  # CRITICAL
 
 HERE = Path(__file__).parent
@@ -167,8 +167,11 @@ def __str__(self):
         return f"{self.major}.{self.minor}.{self.patch}"
 
 
+K8sVersion.MINIMUM = K8sVersion(0, 0)
+
+
 def parse_version(version_str: str) -> K8sVersion:
-    cleansed = version_str.removeprefix("v").strip()
+    cleansed = version_str.strip().removeprefix("v")
     try:
         major, minor, patch = cleansed.split(".")
         return K8sVersion(int(major), int(minor), int(patch))
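
The one-line change to `parse_version` swaps the order of `strip` and `removeprefix`: stripping first ensures that surrounding whitespace (for example a trailing newline in an API response) cannot shield the `v` prefix from removal. A quick illustration:

```python
version_str = " v1.28.3\n"

# Old order: removeprefix() sees a leading space, so the "v" survives.
old = version_str.removeprefix("v").strip()

# New order: strip first, then the prefix is reliably removed.
new = version_str.strip().removeprefix("v")

print(old)  # "v1.28.3"
print(new)  # "1.28.3"
```

With the old order, the leftover `v` would later make `int(major)` raise `ValueError` for an otherwise valid version string.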
@@ -239,38 +242,33 @@ def parse_github_release_data(release_data: dict) -> K8sRelease:
 
 @dataclass(frozen=True, eq=True)
 class VersionRange:
-    """Version range with a lower and upper bound."""
+    """
+    Version range with a lower and upper bound.
+    Supports checking if a K8sVersion is in the range using the `in`
+    operator.
+    If `inclusive` is True, `upper_version` is inside the range (i.e.,
+    it is a closed interval), otherwise `upper_version` is outside.
+    If `upper_version` is not set, the range just represents a single
+    version, namely `lower_version`.
+    """
 
-    # First version with the CVE; this value will be set if either an affected
-    # version is directly set in a CVE dataset or if the CVE dataset is in a
-    # non-standard format. If the variable is set, `lower_version` and
-    # `upper_version` create a range of affected versions.
     lower_version: K8sVersion
-
-    # Last version with the CVE
-    upper_version: K8sVersion
-
-    # True if upper_version is included in the range of affected versions
+    upper_version: K8sVersion = None
     inclusive: bool = False
 
+    def __post_init__(self):
+        if self.lower_version is None:
+            raise ValueError("lower_version must not be None")
+        if self.upper_version and self.upper_version < self.lower_version:
+            raise ValueError("lower_version must be lower than upper_version")
+
     def __contains__(self, version: K8sVersion) -> bool:
-        # See the following link for more information about the format
-        # https://www.cve.org/AllResources/CveServices#cve-json-5
-
-        # Check if an `upper version` exists
-        if self.upper_version:
-            # Check if a `lower version` exists and compare the version against it
-            if self.lower_version:
-                gt = self.lower_version <= version
-            else:
-                gt = True
-            # Compare the version either with `less than` or `less than or equal` against the `upper version`
-            if self.inclusive:
-                return gt and self.upper_version >= version
-            return gt and self.upper_version > version
-        else:
-            # If no upper version exists, we only need to check if the version is equal to the `lower version`
-            return self.lower_version == version
+        if self.upper_version is None:
+            return self.lower_version == version
+        if self.inclusive:
+            return self.lower_version <= version <= self.upper_version
+        return self.lower_version <= version < self.upper_version
 
 
 @dataclass
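
The rewritten `__contains__` reduces the old nested branches to chained comparisons: a half-open interval by default, a closed interval when `inclusive` is set, and a single-version match when `upper_version` is absent. A self-contained sketch of the same semantics, using plain `(major, minor, patch)` tuples in place of `K8sVersion` (the `Range` class is illustrative, not the real `VersionRange`):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Version = Tuple[int, int, int]


@dataclass(frozen=True)
class Range:
    lower: Version
    upper: Optional[Version] = None
    inclusive: bool = False

    def __contains__(self, version: Version) -> bool:
        if self.upper is None:
            return self.lower == version                 # single-version range
        if self.inclusive:
            return self.lower <= version <= self.upper   # closed interval
        return self.lower <= version < self.upper        # half-open interval


r = Range((1, 27, 0), (1, 27, 5))
print((1, 27, 4) in r, (1, 27, 5) in r)  # half-open: upper bound excluded
```

Tuples compare lexicographically, so the chained comparisons behave exactly like version ordering; the `__post_init__` validation in the diff additionally rejects inverted bounds at construction time.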
@@ -318,6 +316,9 @@ def parse_cve_version_information(cve_version_info: dict) -> VersionRange:
             vi_lower_version = parse_version(vdata[0])
             vi_upper_version = parse_version(vdata[1])
 
+    if vi_lower_version is None:
+        vi_lower_version = K8sVersion.MINIMUM
+
     return VersionRange(vi_lower_version, vi_upper_version, inclusive)
@@ -391,9 +392,16 @@ async def get_k8s_cluster_info(kubeconfig, context=None) -> ClusterInfo:
     return ClusterInfo(version, cluster_config.current_context['name'])
 
 
-def check_k8s_version_recency(my_version: K8sVersion, releases_data: list[dict], cve_version_list=()) -> bool:
-    """Check a given K8s cluster version against the list of released versions in order to find out if the version
-    is an accepted recent version according to the standard."""
+def check_k8s_version_recency(
+    my_version: K8sVersion,
+    releases_data: list[dict],
+    cve_affected_ranges: set[VersionRange],
+) -> bool:
+    """
+    Check a given K8s cluster version against the list of released versions
+    in order to find out if the version is an accepted recent version according
+    to the standard.
+    """
 
     # iterate over all releases in the list, but only look at those whose branch matches
     # we might break early assuming that the list is sorted somehow, but it is usually
@@ -411,11 +419,11 @@ def check_k8s_version_recency(my_version: K8sVersion, releases_data: list[dict],
             if release.age > PATCH_VERSION_CADENCE:
                 # whoops, the cluster should have been updated to this (or a higher version) already!
                 return False
-            if my_version in cve_version_list and release.age > CVE_VERSION_CADENCE:
-                # -- three FIXMEs:
-                # (a) can the `in` ever become true, when we have a version vs a set of ranges
-                # (b) if the release still has the CVE, then there is no use if we updated to it?
-                # (c) the standard says "time period MUST be even shorter ... it is RECOMMENDED that ...",
+            ranges = [_range for _range in cve_affected_ranges if my_version in _range]
+            if ranges and release.age > CVE_VERSION_CADENCE:
+                # -- two FIXMEs:
+                # (a) if the release still has the CVE, then there is no use if we updated to it?
+                # (b) the standard says "time period MUST be even shorter ... it is RECOMMENDED that ...",
                 #     so what is it now, a requirement or a recommendation?
                 # shouldn't we check for CVEs of my_version and then check whether the new one still has them?
                 # -- so, this has to be reworked in a major way, but for the time being, just emit an INFO
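
The fixed recency check first collects the CVE ranges that actually contain `my_version`; the old `my_version in cve_version_list` test compared a single version against a collection of *ranges* and could never match (the first FIXME the old comment raised). A standalone sketch of the corrected logic, with Python `range` objects over patch levels as stand-ins for `VersionRange` and a hypothetical `overdue_for_cve_fix` helper:

```python
from datetime import timedelta

CVE_VERSION_CADENCE = timedelta(days=2)  # value introduced in this commit


def overdue_for_cve_fix(my_patch, affected_ranges, release_age):
    # Keep only the ranges that actually contain the running version...
    ranges = [r for r in affected_ranges if my_patch in r]
    # ...and flag only if affected *and* the fixing release is older than the cadence.
    return bool(ranges) and release_age > CVE_VERSION_CADENCE


affected = [range(5, 9)]  # stand-in: patch levels 5..8 are affected
print(overdue_for_cve_fix(6, affected, timedelta(days=3)))  # True
print(overdue_for_cve_fix(2, affected, timedelta(days=3)))  # False
```

Because `in` now runs against each range's `__contains__`, the membership test is meaningful; the remaining FIXMEs in the diff note that the policy question (requirement vs. recommendation) is still open.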
