
feat: Validation job #1

Open
wants to merge 17 commits into
base: master
117 changes: 117 additions & 0 deletions .github/workflows/validation-refresh.yml
@@ -0,0 +1,117 @@
name: Refresh Conformance Validation Results
# This workflow is triggered on pushes and pull requests to the master branch
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
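  # A schedule trigger could also be added here to refresh results periodically, e.g. (illustrative cron only):
  # schedule:
  #   - cron: '0 5 * * *'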

jobs:
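  # Builds the full list of dataset site URLs from the OpenActive data catalogues and exposes it
  # as a JSON job output, which the validation job below uses as its matrix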
data-catalog:
runs-on: ubuntu-latest
outputs:
matrix: ${{ steps.matrix.outputs.result }}
steps:
- name: Install Node.js 20.x
uses: actions/setup-node@v3
with:
node-version: '20.x'
- name: Install dependencies
run: npm install @openactive/dataset-utils
- name: Get all Dataset Site URLs
id: matrix
uses: actions/github-script@v7
with:
script: |
const { getAllDatasetSiteUrls } = require('@openactive/dataset-utils');
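            // Crawl each of the catalogue collections listed below and gather every dataset site URL they reference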

const catalogUrls = [
'https://openactive.io/data-catalogs/data-catalog-collection.jsonld',
'https://openactive.io/data-catalogs/data-catalog-collection-preview.jsonld',
'https://openactive.io/data-catalogs/data-catalog-collection-test.jsonld'
];

async function fetchDatasetUrls(url) {
try {
const { urls, errors } = await getAllDatasetSiteUrls(url);

console.log(`Retrieved ${urls.length} dataset URLs for ${url}`);
if (errors.length > 0) {
console.error(`${errors.length} errors encountered during retrieval:`);
errors.forEach(error => {
console.error(`- [${error.status}] ${error.url}: ${error.message}`);
});
}

return urls;
} catch (error) {
console.error("Failed to fetch dataset URLs:", error);
throw error;
}
}
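            // actions/github-script JSON-encodes the array returned below into this step's `result` output,
            // which the job exposes as its `matrix` output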

return (await Promise.all(catalogUrls.map(url => fetchDatasetUrls(url)))).flat();
- name: List all Dataset Site URLs
run: |
echo "${{ steps.matrix.outputs.result }}"

validation:
needs:
- data-catalog
runs-on: ubuntu-latest

# Defines a matrix strategy that includes all feeds in the OpenActive catalogues
strategy:
fail-fast: false
matrix:
feed: ${{fromJSON(needs.data-catalog.outputs.matrix)}}
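    # Each dataset site URL becomes its own validation job; with fail-fast disabled, one failing feed
    # does not cancel validation of the others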
steps:
      # checks out Test Suite and places it in a directory named "tests"
- name: Checkout OpenActive Test Suite
uses: actions/checkout@v2
with:
repository: openactive/openactive-test-suite
path: tests

- name: Setup Node.js 18.17.1
uses: actions/setup-node@v1
with:
node-version: 18.17.1
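      # Note: this job uses setup-node@v1 with Node 18.17.1, whereas the data-catalog job uses
      # setup-node@v3 with Node 20.x; presumably 18.17.1 matches the version Test Suite expects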

# runs `npm install` to install the JavaScript dependencies for Test Suite, which is located in the "tests"
# directory
- name: Install OpenActive Test Suite
run: npm install
env:
# Puppeteer is not required as we are only running validation, so this will speed up the installation
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD: true
working-directory: tests

      # runs OpenActive Test Suite feed validation against the specified feed
- name: Run OpenActive Test Suite feed validation for ${{ matrix.feed }}
run: npm run validate-feeds "${{ matrix.feed }}"
env:
FORCE_COLOR: 1
NODE_CONFIG: |
{"broker": {"outputPath": "../../output/"}}
working-directory: tests

# uploads the test output as an artifact, which can be used for reference or debugging later
      - name: Upload test output for ${{ matrix.feed }} as artifact
uses: actions/upload-artifact@v2
if: ${{ success() || failure() }}
with:
name: ${{ matrix.feed }}
path: ./tests/output/
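        # Note: artifact names cannot contain characters such as '/' or ':', so a raw dataset site URL
        # used as the name above may be rejected and might need sanitising first; upload-artifact@v2 is
        # also deprecated and may need updating to a newer major version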

      # deploys the validation results to the conformance monitor's Azure Blob Storage static website
      # ($web container). For background on Conformance Certificates, see
      # `packages/openactive-integration-tests/test/certification/README.md` in Test Suite
- name: Deploy validation results to Azure Blob Storage
uses: bacongobbler/[email protected]
with:
source_dir: ./tests/conformance/
container_name: '$web'
connection_string: ${{ secrets.CONFORMANCE_MONITOR_BLOB_STORAGE_CONNECTION_STRING }}
sync: false
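        # With sync disabled the action only uploads the source files; existing blobs in the container
        # are left in place rather than being removed to mirror the source directory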
