
db_volume_write() corrupted files #38

Closed
brenktt opened this issue Mar 21, 2024 · 5 comments
Labels: bug (Something isn't working)

brenktt commented Mar 21, 2024

Hi, first of all, thanks for this package — it fills a gap that R users connecting to Databricks from outside the platform have long felt.

I have found that when I upload a parquet or CSV file using db_volume_write() and then create a table from it or read it back, the file is corrupted. Example code below:

arrow::write_parquet(mtcars, "mtcars.parquet")

brickster::db_volume_write(
  path = "Volumes/.../mtcars.parquet",
  file = "mtcars.parquet"
)

brickster::db_volume_read(
  path = "Volumes/.../mtcars.parquet",
  destination = "mtcars_databricks.parquet"
)

arrow::read_parquet("mtcars_databricks.parquet") 

And the error I receive:

Error: Invalid: Parquet magic bytes not found in footer. Either the file is corrupted or this is not a parquet file.
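For anyone hitting the same error, the "magic bytes" can be checked directly: a valid parquet file both begins and ends with the 4-byte marker `PAR1`. Below is a small helper (hypothetical, not part of brickster or arrow) to confirm whether a downloaded file was altered in transit:

```r
# Hypothetical helper (not part of brickster or arrow): a valid parquet
# file starts and ends with the 4-byte magic marker "PAR1".
check_parquet_magic <- function(path) {
  con <- file(path, open = "rb")
  on.exit(close(con))
  magic <- charToRaw("PAR1")
  head <- readBin(con, what = "raw", n = 4L)          # first 4 bytes
  seek(con, where = file.size(path) - 4L, origin = "start")
  tail <- readBin(con, what = "raw", n = 4L)          # last 4 bytes
  c(head_ok = identical(head, magic), tail_ok = identical(tail, magic))
}

check_parquet_magic("mtcars_databricks.parquet")
```

If either check returns FALSE, the file's bytes were altered somewhere between upload and download.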

Could you have a look into this?

@zacdav-db zacdav-db self-assigned this Mar 21, 2024
zacdav-db (Contributor) commented:

Hey @brenktt - thanks for the repro!

Let me look into it; from my testing so far, db_volume_write() does seem to be the culprit.

# write arrow --> read arrow:                                        success
# write arrow --> db_volume_write --> ui download --> read arrow:    error
# write arrow --> db_volume_write --> db_volume_read --> read arrow: error
# write arrow --> ui upload --> db_volume_read --> read arrow:       success

zacdav-db (Contributor) commented:

I can see that extra bytes are getting injected into the uploaded file:

[Two screenshots comparing the file's bytes before and after upload]

I've adjusted the code to use different {httr2} methods, and it should now work.

It's currently in the branch volume_fixes.

I'll make a PR once tests pass.
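For context, here is a sketch of the kind of change involved (the URL, endpoint, and auth below are illustrative assumptions, not brickster's actual code): a multipart form upload wraps the file contents in boundary bytes, whereas streaming the file as the raw request body sends it byte-for-byte.

```r
library(httr2)

# Illustrative sketch only; the workspace URL, endpoint, and auth
# are placeholders, not brickster's actual implementation.

# Problematic pattern: a multipart upload adds form boundary bytes
# around the file contents, corrupting binary formats like parquet:
# req |> req_body_multipart(file = curl::form_file("mtcars.parquet"))

# Safer pattern: send the file itself as the raw request body:
req <- request("https://<workspace-url>/api/2.0/fs/files/Volumes/.../mtcars.parquet") |>
  req_method("PUT") |>
  req_headers(Authorization = paste("Bearer", Sys.getenv("DATABRICKS_TOKEN"))) |>
  req_body_file("mtcars.parquet", type = "application/octet-stream")

# resp <- req_perform(req)
```

With req_body_file(), httr2 streams the file contents unchanged, so no boundary or header bytes end up inside the uploaded object.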

@zacdav-db zacdav-db added the bug Something isn't working label Mar 22, 2024
zacdav-db (Contributor) commented:

@brenktt if you can test from the branch, it would be great to get confirmation that it's behaving as expected before I merge!


brenktt commented Mar 25, 2024

@zacdav-db I have tested the fixed code and it works as expected (although upload/download speed seems slower than before). Thanks a lot!

Is there a plan to have the package published on CRAN anytime soon?

zacdav-db (Contributor) commented:

Thanks @brenktt - I've merged the changes into main branch.

Yes, there is a plan, but some details are still being worked out with other projects at Databricks.
I hope to have brickster on CRAN within the next 4 weeks 🤞.
