I'm using fabric to send a stream of binary data (which can be GBs in size). The invoke code reads the data 1 byte at a time. This seems to cap the upload speed at about 10 KB/s, which would take days to upload the necessary data.
Perhaps there could be a way to customise the chunk size for reading?
I easily achieved over 2 MB/s after changing that number (at least a 200x improvement, probably more if I had a faster internet connection):
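To illustrate why the read size matters so much: reading one byte at a time means one full `read()` round trip per byte, so the per-call overhead dominates. A minimal sketch (the `CountingReader` wrapper is purely illustrative, not part of invoke) counting how many calls each strategy makes on a 1 MiB payload:

```python
import io

class CountingReader(io.BytesIO):
    """BytesIO wrapper that counts read() calls (illustrative only)."""
    def __init__(self, data):
        super().__init__(data)
        self.calls = 0

    def read(self, size=-1):
        self.calls += 1
        return super().read(size)

data = b"x" * 2**20  # 1 MiB payload

one_byte = CountingReader(data)
while one_byte.read(1):          # one call per byte, as invoke does today
    pass
print(one_byte.calls)            # 1048577 (2**20 reads + one final empty read)

chunked = CountingReader(data)
while chunked.read(64 * 1024):   # 64 KiB per call
    pass
print(chunked.calls)             # 17 (16 full chunks + one final empty read)
```

Roughly five orders of magnitude fewer calls for the same payload, which is consistent with the speedup observed above.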
```diff
--- a/invoke/runners.py
+++ b/invoke/runners.py
@@ -790,13 +790,13 @@ class Runner(object):
             # read instead of once per session, which could be costly (?).
             bytes_ = None
             if ready_for_reading(input_):
-                bytes_ = input_.read(bytes_to_read(input_))
+                bytes_ = input_.read(2**20)
             # Decode if it appears to be binary-type. (From real terminal
             # streams, usually yes; from file-like objects, often no.)
```
Sorry, mixing up issues. #915 solves #818. This is a separate issue (encountered in the same application I developed). I haven't proposed a solution to this yet, though the obvious thing is to add a chunk_size parameter somewhere.
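A `chunk_size` parameter could behave like the hypothetical helper below: callers pick a block size, and the loop reads at most that much per call. This is a sketch of the idea, not invoke's actual API; `copy_stream` and its signature are made up for illustration.

```python
import io

def copy_stream(src, dst, chunk_size=2**20):
    """Copy src to dst in blocks of at most chunk_size bytes.

    Returns the total number of bytes copied. chunk_size=1 reproduces
    the current byte-at-a-time behaviour; a large value (e.g. 1 MiB)
    amortises the per-read overhead.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # empty bytes => EOF
            break
        dst.write(chunk)
        total += len(chunk)
    return total

src = io.BytesIO(b"\x00" * (3 * 2**20 + 123))
dst = io.BytesIO()
total = copy_stream(src, dst)
print(total)  # 3145851 (3 MiB + 123 bytes)
```

Exposing the value as a class attribute on `Runner` (overridable via subclass or config) would be one low-churn way to wire this in.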