fix(robot-server): fetching of data files used in runs of a protocol (#15908)

Closes AUTH-638

# Overview

Fixes a bug where CSV files used in protocol runs were not being saved to the CSV RTP table for runs and, as a result, were not included in the response from the `/protocols/{protocolId}/dataFiles` endpoint.

## Test Plan and Hands on Testing

Testing with app & ODD:

- [x] Upload a protocol that uses a CSV parameter
- [x] Send the protocol to a robot and upload the CSV file to use
- [x] Run the protocol
- [x] Run the protocol again with a different CSV file. Do this 5 times (6 runs total, with 6 different CSV files). Re-running the protocol 6 times forces the robot to delete its oldest analysis (the maximum number of analyses per protocol is 5), which removes the first CSV file from the *analysis* CSV table but not from the runs table
- [x] Check that when you run the protocol again on the ODD, it shows all 6 CSV files previously uploaded

Testing with Postman / direct HTTP requests:

- [x] Upload a few data files
- [x] Upload a protocol that uses a CSV parameter, specifying a data file (data_file_1) for the CSV param
- [x] Start a new analysis for the same protocol, specifying a second data file (data_file_2) for the CSV param
- [x] Create a run for the protocol, specifying data_file_1 for its CSV param
- [x] Create another run for the protocol, specifying a third data file (data_file_3) for its CSV param
- [x] Check that the response to `GET /protocols/{protocolId}/dataFiles` contains the 3 data files used with the runs & analyses, listed in the order the files were uploaded to the server (via `POST /dataFiles`)

## Changelog

- Wired up CSV RTP table insertion during run creation
- Updated the run deletion code to remove the CSV RTP entry from the `run_csv_rtp_table` before deleting the run
- Updated the `../{protocolId}/dataFiles` response so that it lists the files in the order they were uploaded
- Added tests

## Risk assessment

Low. Fixes a bug.
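The changelog items above can be illustrated with a small, self-contained sketch. This is *not* the actual robot-server code; the schema and column names below are hypothetical stand-ins for the run/analysis CSV RTP tables. It shows the shape of the fix: record the CSV file on run creation as well as on analysis, answer `dataFiles` queries from the union of both tables ordered by upload time, and clear a run's CSV RTP entry before deleting the run.

```python
# Illustrative sketch only: table and column names are invented for this
# example and do not match the real robot-server schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE data_files (id TEXT PRIMARY KEY, name TEXT, created_at TEXT);
    CREATE TABLE analysis_csv_rtp_table (analysis_id TEXT, file_id TEXT);
    CREATE TABLE run_csv_rtp_table (run_id TEXT, file_id TEXT);
    """
)

# Three files, inserted in upload order.
conn.executemany(
    "INSERT INTO data_files VALUES (?, ?, ?)",
    [
        ("f1", "test.csv", "2024-01-01T00:00:01Z"),
        ("f2", "sample_record.csv", "2024-01-01T00:00:02Z"),
        ("f3", "sample_plates.csv", "2024-01-01T00:00:03Z"),
    ],
)

# Before the fix, only analyses recorded their CSV file; the fix also
# inserts a row into the run CSV RTP table when a run is created.
conn.execute("INSERT INTO analysis_csv_rtp_table VALUES ('a1', 'f3')")
conn.execute("INSERT INTO run_csv_rtp_table VALUES ('r1', 'f1')")
conn.execute("INSERT INTO run_csv_rtp_table VALUES ('r2', 'f2')")

# Files used with the protocol = union of run and analysis usage,
# returned in upload (created_at) order.
rows = conn.execute(
    """
    SELECT d.id, d.name FROM data_files d
    WHERE d.id IN (SELECT file_id FROM run_csv_rtp_table
                   UNION SELECT file_id FROM analysis_csv_rtp_table)
    ORDER BY d.created_at
    """
).fetchall()

# On run deletion, remove the run's CSV RTP entry first, then the run.
conn.execute("DELETE FROM run_csv_rtp_table WHERE run_id = 'r1'")
```

With this shape, deleting an old analysis (and its `analysis_csv_rtp_table` rows) no longer hides a file that a run also used, which is the bug described above.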
Showing 9 changed files with 322 additions and 47 deletions.
...er/tests/integration/http_api/protocols/test_get_csv_files_used_with_protocol.tavern.yaml (250 additions, 0 deletions)
```yaml
test_name: Test the /protocols/{protocolID}/dataFiles endpoint

marks:
  - usefixtures:
      - ot2_server_base_url

stages:
  # The order of these data file uploads is important for this test,
  # since the list of data files returned for the specified protocol is in upload order.
  # The order in which the files are uploaded in this test is the same as the order in which
  # these files are uploaded in the overall integration tests suite.
  # Until we add data file cleanup after each test, maintaining this order within the suite
  # will be important.

  # sample_record -> test
  # sample_plates -> sample_record
  # test -> sample_plates
  - name: Upload data file 1
    request:
      url: '{ot2_server_base_url}/dataFiles'
      method: POST
      files:
        file: 'tests/integration/data_files/test.csv'
    response:
      save:
        json:
          data_file_1_id: data.id
          data_file_1_name: data.name
      status_code:
        - 201
        - 200

  - name: Upload data file 2
    request:
      url: '{ot2_server_base_url}/dataFiles'
      method: POST
      files:
        file: 'tests/integration/data_files/sample_record.csv'
    response:
      save:
        json:
          data_file_2_id: data.id
          data_file_2_name: data.name
      status_code:
        - 201
        - 200

  - name: Upload data file 3
    request:
      url: '{ot2_server_base_url}/dataFiles'
      method: POST
      files:
        file: 'tests/integration/data_files/sample_plates.csv'
    response:
      save:
        json:
          data_file_3_id: data.id
          data_file_3_name: data.name
      status_code:
        - 201
        - 200

  - name: Upload protocol with CSV file ID
    request:
      url: '{ot2_server_base_url}/protocols'
      method: POST
      data:
        runTimeParameterFiles: '{{"liq_handling_csv_file": "{data_file_1_id}"}}'
      files:
        files: 'tests/integration/protocols/basic_transfer_with_run_time_parameters.py'
    response:
      save:
        json:
          protocol_id: data.id
          analysis_id: data.analysisSummaries[0].id
          run_time_parameters_data1: data.analysisSummaries[0].runTimeParameters
      strict:
        json:off
      status_code: 201
      json:
        data:
          analysisSummaries:
            - id: !anystr
              status: pending
              runTimeParameters:
                - displayName: Liquid handling CSV file
                  variableName: liq_handling_csv_file
                  description: A CSV file that contains wells to use for pipetting
                  type: csv_file
                  file:
                    id: '{data_file_1_id}'
                    name: 'test.csv'

  - name: Wait until analysis is completed
    max_retries: 5
    delay_after: 1
    request:
      url: '{ot2_server_base_url}/protocols/{protocol_id}'
    response:
      status_code: 200
      json:
        data:
          analyses: []
          analysisSummaries:
            - id: '{analysis_id}'
              status: completed
          id: !anything
          protocolType: !anything
          files: !anything
          createdAt: !anything
          robotType: !anything
          protocolKind: !anything
          metadata: !anything
        links: !anything

  - name: Start a new analysis with a different CSV file
    request:
      url: '{ot2_server_base_url}/protocols/{protocol_id}/analyses'
      method: POST
      json:
        data:
          forceReAnalyze: true
          runTimeParameterFiles:
            liq_handling_csv_file: '{data_file_3_id}'
    response:
      strict:
        - json:off
      status_code: 201
      json:
        data:
          - id: '{analysis_id}'
            status: completed
          - id: !anystr
            status: pending
            runTimeParameters:
              - displayName: Liquid handling CSV file
                variableName: liq_handling_csv_file
                description: A CSV file that contains wells to use for pipetting
                type: csv_file
                file:
                  id: '{data_file_3_id}'
                  name: 'sample_plates.csv'

  - name: Wait until analysis is completed
    max_retries: 5
    delay_after: 1
    request:
      url: '{ot2_server_base_url}/protocols/{protocol_id}'
    response:
      status_code: 200
      json:
        data:
          analyses: []
          analysisSummaries:
            - id: '{analysis_id}'
              status: completed
            - id: !anystr
              status: completed
          id: !anything
          protocolType: !anything
          files: !anything
          createdAt: !anything
          robotType: !anything
          protocolKind: !anything
          metadata: !anything
        links: !anything

  - name: Create a run from the protocol and a CSV file
    request:
      url: '{ot2_server_base_url}/runs'
      method: POST
      json:
        data:
          protocolId: '{protocol_id}'
          runTimeParameterFiles:
            liq_handling_csv_file: '{data_file_1_id}'
    response:
      status_code: 201
      save:
        json:
          run_id1: data.id
          run_time_parameters_data2: data.runTimeParameters
      strict:
        json:off
      json:
        data:
          id: !anystr
          ok: True
          createdAt: !re_fullmatch "\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}\\.\\d+(Z|([+-]\\d{2}:\\d{2}))"
          status: idle
          runTimeParameters:
            - displayName: Liquid handling CSV file
              variableName: liq_handling_csv_file
              description: A CSV file that contains wells to use for pipetting
              type: csv_file
              file:
                id: '{data_file_1_id}'
                name: 'test.csv'

  - name: Create another run from the protocol and a different CSV file
    request:
      url: '{ot2_server_base_url}/runs'
      method: POST
      json:
        data:
          protocolId: '{protocol_id}'
          runTimeParameterFiles:
            liq_handling_csv_file: '{data_file_2_id}'
    response:
      status_code: 201
      save:
        json:
          run_id2: data.id
          run_time_parameters_data3: data.runTimeParameters
      strict:
        json:off
      json:
        data:
          id: !anystr
          ok: True
          createdAt: !re_fullmatch "\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}\\.\\d+(Z|([+-]\\d{2}:\\d{2}))"
          status: idle
          runTimeParameters:
            - displayName: Liquid handling CSV file
              variableName: liq_handling_csv_file
              description: A CSV file that contains wells to use for pipetting
              type: csv_file
              file:
                id: '{data_file_2_id}'
                name: 'sample_record.csv'

  - name: Fetch data files used with the protocol so far
    request:
      url: '{ot2_server_base_url}/protocols/{protocol_id}/dataFiles'
    response:
      status_code: 200
      json:
        meta:
          cursor: 0
          totalLength: 3
        data:
          - id: '{data_file_1_id}'
            name: "test.csv"
            createdAt: !re_fullmatch "\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}\\.\\d+(Z|([+-]\\d{2}:\\d{2}))"
          - id: '{data_file_2_id}'
            name: "sample_record.csv"
            createdAt: !re_fullmatch "\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}\\.\\d+(Z|([+-]\\d{2}:\\d{2}))"
          - id: '{data_file_3_id}'
            name: "sample_plates.csv"
            createdAt: !re_fullmatch "\\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2}\\.\\d+(Z|([+-]\\d{2}:\\d{2}))"
```
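The `!re_fullmatch` timestamp pattern repeated throughout this test is worth a quick sanity check. In YAML double-quoted strings, `\\d` unescapes to `\d`, so the regex applied at match time is the one below; it accepts both a UTC `Z` suffix and a numeric offset, and it requires fractional seconds. A small standalone check of that behavior:

```python
import re

# The createdAt pattern from the Tavern test above, after YAML unescaping.
TIMESTAMP = r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+(Z|([+-]\d{2}:\d{2}))"

# Accepts a UTC "Z" suffix and a numeric UTC offset.
assert re.fullmatch(TIMESTAMP, "2024-07-30T12:34:56.789Z")
assert re.fullmatch(TIMESTAMP, "2024-07-30T12:34:56.789+05:30")

# Fractional seconds are mandatory: a bare seconds value does not match.
assert re.fullmatch(TIMESTAMP, "2024-07-30T12:34:56Z") is None
```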