docs(testing-strategy): add unit testing test plan
ameteiko committed Jul 15, 2024
1 parent 81cf6b9 commit 4f19f42
Showing 1 changed file with 37 additions and 5 deletions: docs/contributor/test-plan-unittesting.md
@@ -24,9 +24,13 @@ The primary audience for this document is `Team Leads` and `Software engineers`

## References

- https://github.com/stretchr/testify
- https://github.com/vektra/mockery
- https://github.com/kubernetes-sigs/controller-runtime/tree/main/pkg/client/fake

## Context of testing

Unit testing is part of the build, and PRs have a hard dependency on all the unit tests passing.

When there are different versions of a code, or different feature flags, that can impact the logic of a given unit, each context has to have its own dedicated unit test or the same unit test needs to be run with all the identified contexts.

@@ -40,9 +40,34 @@ All the units of the code, in cloud-manager repository, and all its features are

### Assumptions and constraints

- All the unit tests must pass or be skipped as part of the build.

- Code coverage must remain above a predefined threshold for the PR check to succeed.

- For any given source file, the code coverage percentage should not drop as part of a PR. (Good to have)

- To make code testable, dependency injection as a set of design patterns must be used at all times. A hard dependency on a specific object makes unit testing it in isolation extremely difficult or even impossible.

- Developers are recommended to follow a TDD approach.

- For assertions, use `testify`.

- For mocking, developers have two options:

- Write their own fake/mock object
- Use testify/mock and/or mockery if needed.

- For mocking an HTTP server, e.g. a call to GCP, `net/http/httptest` is to be used.

- To test a unit that calls k8s APIs in its logic in isolation, pass a fake k8s client instead.

### Code style and naming convention

- A test should follow this signature: `func Test<MethodName>(t *testing.T)`. `<MethodName>` should be replaced by the source code method to be unit tested.
- When unit testing a method needs more than one logical test, because it has conditional execution paths depending on the input, one of the following approaches must be taken:
  - A `testify` `suite.Suite` to be created with as many methods as needed to test all execution paths.
  - A series of subtests to be executed with this signature: `t.Run("name of the test", func(t *testing.T){...})`.
- Naming of the methods/subtests for both options above should be in upper camel case and should convey the scenario/path being tested.

## Test strategy

@@ -61,7 +90,9 @@ Level 1 tests, i.e. unit tests, must always pass and as they run in isolation, t

### Test deliverables

A code coverage report that shows the coverage percentage of each separate source file, preferably in a format that provides a visual representation of the covered code.

Also, an aggregate coverage percentage over the entire source code.

### Entry and exit criteria to each phase

@@ -91,6 +122,7 @@ To prevent any false positives, unit tests must get executed in an env with enou

## Appendix A. Quality Evidence

Quality Evidence provides a list of reports created during the test plan execution and all relevant quality evidence artifacts.
- For any given PR (to be replaced in the following URL), the unit test results can be found at https://github.com/kyma-project/cloud-manager/pull/<PR>/checks.
- For each build under the checks, the test results can be found in the "Build and test" step.

## Appendix B. Supplementary References
