Summary

We recently encountered a problem with an attempted download of a very large recording file. We need a way to prevent this from happening again. Possible approaches include: (1) preventing experiments from producing very large recordings in the first place, and/or (2) checking the file size in the experimenter download views and handling the case where the file is too large (e.g. similar to how we handle the option to download all video files).
Implementation options and Scoping
In the experiment runner
The EFP does have a `maxRecordingTime` parameter, but the default is very high (100000000 seconds). So the first TO DO item is:

- Reduce the default `maxRecordingTime` parameter to a more sensible value.

This value can be changed by researchers, so we could also consider enforcing an upper limit for this parameter. If so, we should make this clear in the documentation.
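As a rough sketch of what that cap could look like (shown in Python for consistency with the lookit-api sketches below; the actual change would live in EFP's JavaScript, and the default/cap values and names here are hypothetical placeholders, not decided values):

```python
# Hypothetical values: the real default and ceiling are still TBD.
DEFAULT_MAX_RECORDING_TIME = 60 * 60   # e.g. 1 hour, instead of 100000000 seconds
HARD_RECORDING_CAP = 2 * 60 * 60       # e.g. 2 hours, a ceiling researchers can't exceed

def effective_max_recording_time(requested=None):
    """Clamp a researcher-supplied maxRecordingTime to a sane ceiling."""
    if requested is None:
        return DEFAULT_MAX_RECORDING_TIME
    # Values above the cap are clamped rather than rejected; either way,
    # the documentation should state the behavior explicitly.
    return min(requested, HARD_RECORDING_CAP)
```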
We should also consider options for logging that the upper recording limit has been reached. Currently this event does produce a warning in the browser's console, but that isn't sufficient, since it doesn't actually alert the participant or researcher. Perhaps the most straightforward option is to add a flag to the JSON response data (e.g. `exceededUpperRecordingLimit: true`). Here's the relevant bit of EFP code for handling recordings that have exceeded the time limit.
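A minimal sketch of that flag, again in Python with hypothetical names (in EFP itself this would happen in the JavaScript that stops the recorder and builds the response data):

```python
MAX_RECORDING_TIME = 60 * 60  # whatever effective limit is in force (hypothetical value)

def build_response_metadata(recording_duration_s):
    """Attach a flag to the JSON response data when a recording was cut off,
    so the event is visible to researchers, not just in the browser console."""
    data = {"recordingDuration": recording_duration_s}
    if recording_duration_s >= MAX_RECORDING_TIME:
        data["exceededUpperRecordingLimit"] = True
    return data
```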
Whatever we decide to do in EFP, we should make a note of it for lookit-jspsych, as we will want to do the same thing there.
In the lookit-api
Here is the relevant bit of code for downloading single video files. It doesn't look like we do any file size check here, so that is another possible TO DO item.

For reference, here's how we handle the 'download all videos' option, which uses the `build_zipfile_of_videos` task. So maybe we can re-use this method for single-file downloads that exceed some size limit.
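A rough sketch of the size check itself, assuming the recordings live in S3; the `SIZE_LIMIT_BYTES` threshold and the function name are made up for illustration:

```python
import boto3

SIZE_LIMIT_BYTES = 500 * 1024 * 1024  # hypothetical 500 MB threshold; actual value TBD

def video_too_large_to_stream(bucket, key):
    """Use an S3 HEAD request to check an object's size without downloading it."""
    s3 = boto3.client("s3")
    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
    return size > SIZE_LIMIT_BYTES
```

The download view could run this check before streaming the file, and when it returns `True`, route the request through the same asynchronous path as `build_zipfile_of_videos` instead of letting the download fail partway through.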
Seems like preventing large recordings is the easiest solution, and we can start with that (let's discuss Monday what a reasonable limit would be and where we can add this in the documentation). That said, I think I'm partial to the ¿por qué no los dos? (why not both?) approach: also put in a check for handling larger file downloads (based on our multiple-downloads method), in case there are any others already lurking, and to future-proof.