db_volume_write() corrupted files #38
Comments
Hey @brenktt - thanks for the repro! Let me look into it; it does seem like …
I can see that bytes are getting injected. I've adjusted the code to use different methods in the affected functions; the change is currently in the volume_fixes branch. I'll make a PR once tests pass.
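To illustrate the kind of change involved, here is a sketch of one plausible mechanism, not necessarily the actual package code: if an upload body is built as multipart form data, the multipart boundary lines and part headers get sent around the file contents, and a server expecting a raw body will store those extra bytes; streaming the raw file body keeps the bytes intact. The endpoint and file name below are placeholders.

```r
# Sketch only: one plausible way bytes get injected into an upload, and the
# kind of change that avoids it. The endpoint and file name are placeholders.
library(httr2)

upload_url <- "https://example.cloud.databricks.com/api/2.0/fs/files/Volumes/catalog/schema/volume/file.parquet"

# Multipart body: boundary markers and part headers surround the file
# contents, so a server expecting a raw body stores those extra bytes.
req_multipart <- request(upload_url) |>
  req_method("PUT") |>
  req_body_multipart(contents = curl::form_file("file.parquet"))

# Raw file body: the bytes on disk are exactly the bytes that get stored.
req_raw <- request(upload_url) |>
  req_method("PUT") |>
  req_body_file("file.parquet", type = "application/octet-stream")
```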
@brenktt if you can test from the branch, it would be great to get confirmation that it is behaving as expected before I merge!
@zacdav-db I have tested the fixed code and it seems to be working as expected (although the upload/download speed seems slower than before), thanks a lot! Is there a plan to publish the package on CRAN anytime soon?
Thanks @brenktt - I've merged the changes into the main branch. Yes, there is a plan, but some details are being worked out with other projects at Databricks.
Hi, first of all, this package fills a need that has long gone unmet for R users connecting to Databricks from outside the platform.
I have found that when I upload a parquet or CSV file using `db_volume_write()` and then create a table from the file or read it back, the file is corrupted. Example code below, and the error I receive:
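A minimal sketch of the round trip described, assuming brickster also exposes a matching `db_volume_read()`; the argument names and the volume path below are assumptions for illustration rather than the original example.

```r
# Sketch of the round trip described above; argument names, the volume path,
# and the use of db_volume_read() are assumptions rather than the original code.
library(brickster)
library(arrow)

local_file  <- tempfile(fileext = ".parquet")
volume_path <- "/Volumes/main/default/my_volume/mtcars.parquet"  # placeholder path

arrow::write_parquet(mtcars, local_file)

# upload the local parquet file to the volume
db_volume_write(path = volume_path, file = local_file)

# download it again and compare the raw bytes with the original
roundtrip_file <- tempfile(fileext = ".parquet")
db_volume_read(path = volume_path, destination = roundtrip_file)

identical(
  readBin(local_file, "raw", n = file.size(local_file)),
  readBin(roundtrip_file, "raw", n = file.size(roundtrip_file))
)
# FALSE here (or arrow::read_parquet() failing on the downloaded file) would
# indicate the corruption described in this issue.
```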
Could you have a look into this?