This project provides a flexible and extensible test suite runner for validating implementations of the specification for securing W3C Verifiable Credentials using JSON Object Signing and Encryption (JOSE) and CBOR Object Signing and Encryption (COSE).
It's designed to work with different types of implementations (SDK or server) as long as they conform to a common CLI interface via Docker.
The suite makes use of Digital Bazaar's `mocha-w3c-interop-reporter`.
- Project Structure
- Key Components
- Adding Implementations
- Running Tests
- Extending the Test Suite
- Docker Integration
- Troubleshooting
## Project Structure

```
.
├── implementations/
│   ├── compose.yml
│   ├── implementations.json
│   └── [implementation folders]
├── tests/
│   ├── input/
│   └── output/
├── reports/
│   ├── index.html
│   └── suite.log
├── test-mapping.js
├── test-runner.js
├── test-util.js
└── README.md
```
## Key Components

### `test-mapping.js`

This file defines the structure of the test suite. It exports two main objects:
- `TestResult`: An enum of possible test outcomes (`success`, `failure`, `indeterminate`, `error`).
- `TestMapping`: A mapping of test names to their configurations. Each test configuration includes:
  - `number`: A unique identifier for the test
  - `input_file`: The name of the input file to be used, representing:
    - For issuance, an unsigned Verifiable Credential or Presentation serialized as JSON (a `.json` file)
    - For verification, a signed Verifiable Credential or Presentation, encoded as a JWT string (JOSE), Base64 string (COSE), or SD-JWT string (Selective Disclosure JWT) (a `.txt` file)
  - `key_file`: The name of the key file to be used, representing a Verification Method (a `.json` file)
  - `fn`: The function being tested, either `issue` or `verify`
  - `disclosure_paths`: An array of paths to be disclosed in a Selective Disclosure JWT (e.g., a JSON array like `["issuer", "validFrom", "credentialSubject.id"]`)
  - `feature`: The feature being tested, one of `credential_jose`, `credential_cose`, `credential_sdjwt`, `presentation_jose`, `presentation_cose`, or `presentation_sdjwt`
  - `expected_result`: The expected outcome of the test, written to a file of the format below, where `result` is one of `success`, `failure`, `indeterminate`, or `error`, and `data` is a string containing a signed and encoded credential or presentation:

    ```json
    { "result": "success", "data": "..." }
    ```
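Putting these fields together, a single `TestMapping` entry might look like the sketch below. The entry name, file names, and values are illustrative, not taken from the actual suite:

```javascript
// Hypothetical test-mapping.js entry; names and file paths are illustrative.
const TestResult = {
  success: "success",
  failure: "failure",
  indeterminate: "indeterminate",
  error: "error",
};

const TestMapping = {
  "issue_credential_sdjwt": {
    number: 1,
    input_file: "credential.json", // unsigned VC, serialized as JSON
    key_file: "key.json",          // Verification Method
    fn: "issue",
    disclosure_paths: ["issuer", "validFrom", "credentialSubject.id"],
    feature: "credential_sdjwt",
    expected_result: { result: TestResult.success, data: "..." },
  },
};

module.exports = { TestResult, TestMapping };
```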
### `test-runner.js`

This is the main test runner script. It does the following:
- Loads the implementations and their supported features
- Iterates through each implementation and test
- Skips tests for features not supported by an implementation
- Runs the tests and compares the results to the expected outcomes
- Generates a report of the test results
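The steps above can be sketched as the following core loop. The identifiers (`runSuite`, `runTest`, the report shape) are placeholders for illustration, not the actual names used in `test-runner.js`:

```javascript
// Illustrative sketch of the runner's core loop; identifiers are assumptions.
function runSuite(implementations, testMapping, runTest) {
  const report = [];
  for (const [implName, impl] of Object.entries(implementations)) {
    for (const [testName, test] of Object.entries(testMapping)) {
      // Skip tests for features the implementation does not support.
      if (!impl.features[test.feature]) {
        report.push({ implName, testName, status: "skipped" });
        continue;
      }
      // runTest wraps the Docker invocation and returns the parsed output.
      const actual = runTest(implName, test);
      const passed = actual.result === test.expected_result.result;
      report.push({ implName, testName, status: passed ? "passed" : "failed" });
    }
  }
  return report;
}
```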
### `test-util.js`

This file contains utility functions used by the test runner:

- `generateTestResults`: Executes the Docker command to run a test for a specific implementation
- `checkTestResults`: Reads and interprets the results of a test execution
## Adding Implementations

To add a new implementation:

- Create a new folder in the `implementations/` directory with your implementation name.
- Add your implementation files, including a Dockerfile that sets up your environment.
- Update `implementations/implementations.json` to include your new implementation and its supported features:

  ```json
  {
    "your-implementation-name": {
      "features": {
        "feature1": true,
        "feature2": false,
        "feature3": true
      }
    }
  }
  ```

  Note: If your implementation does not support a feature, set the value to `false`. This will cause the test runner to skip tests for that feature.

- Update `implementations/compose.yml` to include your new service:

  ```yaml
  services:
    your-implementation-name:
      build: ./your-implementation-name
      volumes:
        - ../tests/input:/tests/input
        - ../tests/output:/tests/output
  ```
## Running Tests

To run the test suite:
- Ensure Docker and Docker Compose are installed on your system.
- Navigate to the project root directory.
- Run the test runner script (the exact command may vary based on your setup, e.g., `node test-runner.js`). There is also an npm script that can be used to run the test suite: `npm run test`.
- The test runner will execute each test for each implementation and generate a report in the `reports/` directory.
## Extending the Test Suite

To add new tests:
- Add any necessary input files to the `tests/input/` directory.
- Update `test-mapping.js` to include the new test configurations.
- If testing a new feature, ensure implementations are updated to declare support (or lack thereof) for the new feature.
## Docker Integration

Each implementation should provide a Docker container that exposes a CLI with the following interface:
```
validate --input <input_file> --config '<config_json>' --output <output_file>
```
- `<input_file>`: Path to the input file within the container
- `<config_json>`: JSON string containing the test configuration
- `<output_file>`: Path where the output should be written within the container
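The test runner drives this interface through Docker Compose. As a hedged sketch, a `docker compose run` command for a single test might be assembled as follows; the service name, container paths, output file naming, and the fields included in the config JSON are all assumptions for illustration, not the suite's actual values:

```javascript
// Illustrative only: build a `docker compose run` command string for one test.
// Service name, paths, and config fields are assumptions.
function buildDockerCommand(service, test) {
  const config = JSON.stringify({
    fn: test.fn,
    feature: test.feature,
    disclosure_paths: test.disclosure_paths,
  });
  return [
    "docker", "compose", "run", "--rm", service,
    "validate",
    "--input", `/tests/input/${test.input_file}`,
    "--config", `'${config}'`,
    "--output", `/tests/output/${service}-${test.number}.json`,
  ].join(" ");
}
```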
The Docker containers are run using Docker Compose, with volumes mounted to provide access to the input and output directories.
This setup is designed to be flexible and can be modified to suit the specific requirements of each implementation or of a given test suite.
## Troubleshooting

If you encounter issues:
- Check the console output for error messages.
- Verify that all necessary files exist in the expected locations.
- Ensure that Docker containers have the necessary permissions to read input and write output.
- Check that implementations correctly handle the provided CLI arguments.
For more detailed debugging:
- Add `console.log` statements in the test runner or utility functions.
- Inspect the Docker container logs for implementation-specific issues.
For any questions or issues not covered in this README, please open an issue in the project repository.