What would you like to be added?

The Prow page for a test includes an "Artifacts" link, which contains a file with the test log:
https://gcsweb.k8s.io/gcs/kubernetes-ci-logs/logs/ci-etcd-robustness-main-amd64/1858942059486908416/artifacts/

What if we could parse the file and track the progress over time? No more guessing about what QPS is good.
What if we could do analysis across different dimensions? We could find test scenarios that might need improvement.
What if we could visualize it? Like https://perf-dash.k8s.io/#/?jobname=gce-5000Nodes&metriccategoryname=APIServer&metricname=LoadResponsiveness_Prometheus&Resource=pods&Scope=cluster&Subresource=&Verb=LIST

Script to parse it:

import json

with open("test.out") as file:
    for line in file:
        # Only the log line reporting traffic before failure injection carries the QPS metric.
        if 'Reporting traffic before failure injection' not in line:
            continue
        log_data = json.loads(line)
        # The metrics are JSON-encoded in the last tab-separated field of the log output.
        metrics_data = json.loads(log_data["Output"].split('\t')[-1])
        print(log_data["Test"], float(metrics_data["qps"]))
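A minimal sketch of the "track over time" part could reuse the same parsing logic across many downloaded run logs and emit one CSV row per test per run. The runs/<run-id>.json layout and the qps.csv output name below are hypothetical, chosen for illustration; they are not anything CI produces today.

import csv
import glob
import json
import os

# Hypothetical layout: one downloaded test log per CI run, named runs/<run-id>.json.
with open("qps.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["run", "test", "qps"])
    for path in sorted(glob.glob("runs/*.json")):
        run_id = os.path.splitext(os.path.basename(path))[0]
        with open(path) as file:
            for line in file:
                # Same filter and parsing as the script above.
                if 'Reporting traffic before failure injection' not in line:
                    continue
                log_data = json.loads(line)
                metrics_data = json.loads(log_data["Output"].split('\t')[-1])
                writer.writerow([run_id, log_data["Test"], float(metrics_data["qps"])])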
Why is this needed?
Track the impact of changes on test QPS over time.
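Until something perf-dash-like exists, the visualization could start as a simple matplotlib plot of that CSV. Another sketch, assuming the hypothetical qps.csv from the snippet above with runs sorted chronologically:

import csv
from collections import defaultdict

import matplotlib.pyplot as plt

# Group QPS values by test name, keeping runs in file (i.e. chronological) order.
series = defaultdict(list)
with open("qps.csv") as file:
    for row in csv.DictReader(file):
        series[row["test"]].append((row["run"], float(row["qps"])))

# One line per test, with runs on the x-axis.
for test, points in series.items():
    runs, qps = zip(*points)
    plt.plot(runs, qps, marker="o", label=test)

plt.xlabel("CI run")
plt.ylabel("QPS before failure injection")
plt.legend(fontsize="small")
plt.xticks(rotation=45, ha="right")
plt.tight_layout()
plt.savefig("qps-over-time.png")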