Performance tests #597
base: master
Conversation
Looks awesome. Left some minor suggestions.
```json
{
  "feature/cool-feature": 20,
  "bugfix/urgent-fix": 15
}
```
suggestion: A sentence or two explaining what the threshold number means would be useful
I removed this file and merged everything with the testing docs that I asked Cal to write
```json
{
  "renatav/performance-tests": 15
}
```
question: could we lower this threshold? Isn't waiting until it's 15 times slower before erroring out too lenient?
That's 15%. The default is 10%; this is just an example of how to override the default value.
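To make the semantics concrete: a run fails when it is more than the configured percentage slower than the stored baseline. A minimal sketch of that comparison (illustrative only, not the actual implementation; the function and variable names are made up):

```python
def exceeds_threshold(baseline_s: float, current_s: float, threshold_pct: float = 10.0) -> bool:
    """Return True if the current run is more than threshold_pct percent slower than the baseline."""
    slowdown_pct = (current_s - baseline_s) / baseline_s * 100
    return slowdown_pct > threshold_pct

# With the 15% override above: a run 12% slower passes, a run 30% slower fails.
print(exceeds_threshold(1.0, 1.12, 15.0))  # False
print(exceeds_threshold(1.0, 1.30, 15.0))  # True
```

So the `15` in the config above means "error out at more than 15% slower than master", not 15 times slower.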
`.github/workflows/ci.yml` (outdated)

```yaml
env:
  ARTIFACT_NAME: benchmark-results${{ matrix.python-version }}
  BRANCH_NAME: 'master' # needs to be changed to master before merging
```
```diff
- BRANCH_NAME: 'master' # needs to be changed to master before merging
+ BRANCH_NAME: 'master'
```
`.github/workflows/ci.yml` (outdated)

```yaml
script: |
  const artifacts = await github.rest.actions.listArtifactsForRepo({
    owner: context.repo.owner,
    repo: context.repo.repo,
  });
  const filteredArtifacts = artifacts.data.artifacts.filter(artifact => artifact.name === process.env.ARTIFACT_NAME);
  let latestArtifact = null;

  for (const artifact of filteredArtifacts) {
    const run = await github.rest.actions.getWorkflowRun({
      owner: context.repo.owner,
      repo: context.repo.repo,
      run_id: artifact.workflow_run.id,
    });

    if (run.data.head_branch === process.env.BRANCH_NAME) {
      if (!latestArtifact || new Date(artifact.created_at) > new Date(latestArtifact.created_at)) {
        latestArtifact = artifact;
      }
    }
  }

  if (latestArtifact) {
    console.log(`Found latest artifact: ${latestArtifact.id}`);
    core.setOutput('artifact_id', latestArtifact.id.toString());
  } else {
    console.log('No matching artifacts found.');
    core.setOutput('artifact_id', '');
  }
```
question: could we check this JavaScript (and other) code into the `.github` directory (e.g. `.github/scripts`) and have GitHub Actions load the file? The benefit is that we get syntax highlighting and it would be easier to debug.
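For illustration, `actions/github-script` supports loading a checked-in file via `require`, roughly like this (the script filename is hypothetical; the module would need to export the logic above as a function):

```yaml
- uses: actions/github-script@v7
  with:
    script: |
      const script = require('./.github/scripts/find-latest-artifact.js')
      await script({github, context, core})
```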
…load code to a separate js file
`.github/workflows/ci.yml` (outdated)

```yaml
- name: Run pre-commit and test with pytest
# - name: Run pre-commit and test with pytest
#   run: |
#     pre-commit run --all-files
#     pytest taf/tests
```
question: is running `pytest taf/tests/` and pre-commit supposed to be commented out?
```yaml
# Use a step to extract the branch name from github.ref
- name: Extract Branch Name
  id: extract_branch
  run: echo "::set-output name=branch::$(echo ${GITHUB_REF#refs/heads/})"
```
suggestion: `set-output` is deprecated. Could we please use the new approach (writing to the `$GITHUB_OUTPUT` file)? GitHub published a deprecation notice for the old workflow commands.
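For reference, the replacement appends `key=value` lines to the file that `$GITHUB_OUTPUT` points at instead of echoing a workflow command. A rough sketch of the equivalent step body (setting `GITHUB_REF` and `GITHUB_OUTPUT` manually here only to simulate the Actions runner environment):

```shell
# The runner normally provides these; set them up only for illustration.
export GITHUB_REF="refs/heads/master"
export GITHUB_OUTPUT="$(mktemp)"

# New-style output: append key=value to $GITHUB_OUTPUT
# instead of `echo "::set-output name=branch::..."`.
echo "branch=${GITHUB_REF#refs/heads/}" >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"  # branch=master
```

Later steps would still read it the same way, e.g. `${{ steps.extract_branch.outputs.branch }}`.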
## Artifact Handling

- **Uploading Artifacts**: After benchmarks are run, the resulting `0001_output.json` file is uploaded as an artifact to GitHub Actions, allowing it to be used in future benchmark comparisons.
suggestion: adding link to the already uploaded artifacts (from some older GitHub Actions build as an example) would be very useful here
Description
Integrate benchmark tests into CI:
There are probably many ways in which we can improve this whole logic, but I think this is a good starting point and we'll have some basic performance tests. I think we can close the issue and open new ones if we think of any improvements.
Closes #571