`decrypt_file_iter`, a generator yielding chunks that *would* be passed to `on_data`? Concise alternative to using `on_data` for streaming #246
I'm sure it might be aesthetically pleasing from a "design purity" point of view, but does it give you anything that you can't do with …?
Okay, I've just figured out what it would give me that I can't do with …. E.g. right now I'm trying to …. Would you be open to merging it if I add this, to simplify this use case?
It depends on how the proposed changes look. After all, I would have to provide ongoing support indefinitely for an uncommon use case. In terms of your use case, this could be addressed with the current setup by the ….
I've updated the documentation to talk about threading constraints when processing data.
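One common way to respect that threading constraint is to keep the `on_data` callback cheap and hand each chunk off to a separate worker thread for processing. The sketch below is a generic illustration of that pattern, not python-gnupg API: `on_data` here is a plain function standing in for the callback the library would invoke, and the `.upper()` call is a placeholder for real processing.

```python
import queue
import threading

work = queue.Queue()
DONE = object()  # sentinel marking end of stream
results = []

def on_data(chunk):
    # Runs on the producing thread: keep it cheap, just hand the chunk off.
    work.put(chunk)

def consumer():
    # Heavy processing happens here, off the producer's thread.
    while True:
        chunk = work.get()
        if chunk is DONE:
            break
        results.append(chunk.upper())  # placeholder for real processing

t = threading.Thread(target=consumer)
t.start()
for piece in (b"ab", b"cd"):  # simulate the library invoking the callback
    on_data(piece)
work.put(DONE)
t.join()
print(results)  # [b'AB', b'CD']
```

Because the callback only enqueues, the producer is never blocked by slow processing; a bounded `queue.Queue(maxsize=...)` can be used instead if back-pressure is wanted.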
**Is your feature request related to a problem? Please describe.**
It'd be nice to be able to iterate over streamed chunks in `gpg.decrypt_file` instead of having to set an `on_data` callback.

**Describe the solution you'd like**
Use one of the functions from https://stackoverflow.com/questions/9968592/turn-functions-with-a-callback-into-python-generators to wrap `decrypt_file`, `yield`-ing each chunk of the file as it streams. Then once the iterator terminates, perhaps a separate method could get the result from the `GPG` object? Not sure of the best way to handle this. Or maybe just raise an exception if there's any failure and ignore the `result` object otherwise?

**Describe alternatives you've considered**
Just using `on_data` and doing this myself. 🙂
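The callback-to-generator technique from the linked StackOverflow question can be sketched roughly as follows: run the callback-based call in a background thread, push chunks into a queue, and yield them from the calling thread. This is a minimal sketch under assumptions, not the library's implementation; `callback_to_iter` and `fake_decrypt` are hypothetical names, and `fake_decrypt` stands in for a callback-based API (python-gnupg sets the `on_data` hook on the `GPG` object rather than passing it as a keyword argument).

```python
import queue
import threading

_SENTINEL = object()  # marks end of stream

def callback_to_iter(func, *args, **kwargs):
    """Run func(*args, on_data=cb, **kwargs) in a background thread and
    yield each chunk the callback receives; re-raise any exception."""
    q = queue.Queue(maxsize=16)  # bounded: applies back-pressure
    error = []

    def on_data(chunk):
        q.put(chunk)

    def run():
        try:
            func(*args, on_data=on_data, **kwargs)
        except BaseException as exc:
            error.append(exc)
        finally:
            q.put(_SENTINEL)

    # Daemon thread so an abandoned iterator can't keep the process alive.
    threading.Thread(target=run, daemon=True).start()
    while True:
        item = q.get()
        if item is _SENTINEL:
            break
        yield item
    if error:
        raise error[0]

# Hypothetical stand-in for a callback-based API such as decrypt_file:
def fake_decrypt(data, on_data):
    for i in range(0, len(data), 4):
        on_data(data[i:i + 4])

chunks = list(callback_to_iter(fake_decrypt, b"hello world!"))
print(chunks)  # [b'hell', b'o wo', b'rld!']
```

As the issue suggests, a wrapper like this could also collect the `result` object after the thread finishes, or simply raise on failure, which is what re-raising the stored exception approximates here.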