upload requests failing #63
Comments
@brenktt presumably this is not specific to volumes. Do other functions work? I believe any of the following should be valid:
Remove the trailing `/`.
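In case it helps, a small sketch of a host value in the form described above. The workspace URL here is a placeholder, and `DATABRICKS_HOST` is the environment variable brickster reads for authentication; adjust both to your setup:

```r
# Placeholder workspace host: bare hostname, no scheme, no trailing slash
Sys.setenv(DATABRICKS_HOST = "adb-1234567890123456.7.azuredatabricks.net")

# Stripping a full URL down to that form
url  <- "https://adb-1234567890123456.7.azuredatabricks.net/"
host <- sub("/+$", "", sub("^https?://", "", url))
host
```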
Hi, I managed to get the connection working using the first option. However, I have run into a second issue when trying to upload files to a volume. The file starts uploading, but the upload immediately freezes and the expected upload time keeps increasing in the console. In this example I'm trying to upload a very small parquet file (15 KB). I have tried reading from the volume and this works as expected, so the issue seems to be specific to the upload.
@brenktt can you try this please:

```r
# adjust before running
vpath <- "/Volumes/<catalog>/<schema>/<volume>"

# save to tempdir
dir <- tempdir()
fpath <- file.path(dir, "cars.csv")
write.csv(cars, fpath)

# upload to volume
vol_dest <- file.path(vpath, "cars.csv")
brickster::db_volume_write(path = vol_dest, file = fpath, overwrite = TRUE)

# read from volume
local_dest <- file.path(dir, "vol_cars.csv")
path <- brickster::db_volume_read(path = vol_dest, destination = local_dest)
read.csv(path) # or `read.csv(local_dest)`
```
I'm unable to reproduce the issue thus far, even with larger data.
I have tried your solution and it works, and to my surprise my code now works as well. There must have been some network issue at the time, or perhaps I messed up some of the function inputs. Thank you so much for your help, and sorry for wasting your time. One more question from my side: I think I asked some time ago, but is there any plan to have the package available on CRAN? It was a lifesaver for me and it would be great if I did not have to install through GitHub.
No worries, glad it's working now! The CRAN process has been kicked off; I did the first review a few weeks ago. I have put some time aside to go through the feedback, and all things going well it's on CRAN soon 🤞.
Hopefully it works out for the best! Could you please leave the issue open for a little longer so I can test on larger datasets as well?
It seems I was too quick with conclusions, as I had tested on a file that is smaller than 16 KB. For some reason the upload freezes at exactly 16 KB for all files (whether parquet or CSV).
Is there an example file you can make that's reproducible?
I just pasted a couple of mtcars data frames together so it exceeds 16 KB.
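For reference, a minimal way to build a CSV over the 16 KB mark by stacking copies of a built-in data frame (the file name and copy count here are arbitrary):

```r
# Stack 20 copies of mtcars (32 rows each) into one 640-row frame
big <- do.call(rbind, replicate(20, mtcars, simplify = FALSE))
fpath <- file.path(tempdir(), "big_cars.csv")
write.csv(big, fpath)
file.size(fpath) # comfortably above 16 * 1024 bytes
```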
Hmm, that works fine for me. I also tested data that was 150 MB, which worked as well, e.g. adjusting my example to write 100k rows:

```r
write.csv(dplyr::sample_n(cars, 100000, TRUE), fpath)
```
This is where our sessions differ. Do you have any idea what could be causing this, or is there anything else I could provide for investigation?
@brenktt can you paste an output of your session info? Ensure it includes package versions.
Here is the output:
I have also spoken to our IT department, and it seems only I have this issue. I will get back to you if this gets solved somewhere on the IT side. It is likely not actually an issue with the package.
Keep me posted. I'll close the issue in a week or two if I don't hear otherwise. Can always re-open. |
I will probably have an answer sometime at the start of September, so please keep the issue open until then. |
@zacdav-db So it turns out the issue is with `httr2`.
Thanks @brenktt, I can now repro the issue. I'm having a dig through what's changed in `httr2`.
I've tested the repro with the commit before the change and then the commit with the change, and it's clear that it is the culprit.

```r
remotes::install_github(repo = "r-lib/httr2", ref = "ff16551") # before change, works
remotes::install_github(repo = "r-lib/httr2", ref = "bdb13fe") # after change, fails
```
For my part I'm happy this is now working, but of course it would be best to have the package working with the newest versions, as a lot of time was spent finding the issue.
@brenktt of course. I'm investigating and will likely raise an issue with `httr2`. I want this to work with all versions without issue too!
Raised an issue with `httr2`.
I'll be waiting for a resolution before continuing with the CRAN process; resolving this is important before release.
@brenktt The issue is now fixed in the development version of `httr2`.
Thank you for the prompt investigation!
You can now install the fixed version of `httr2`.
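For anyone landing here, a sketch of grabbing the development build that carries the fix (this assumes the fix is on `httr2`'s default branch and that the `remotes` package is installed):

```r
# Install httr2 from its development branch, then confirm the installed version
remotes::install_github("r-lib/httr2")
packageVersion("httr2")
```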
Hi, I'm having issues communicating with the volume system using the `db_volume_write()` function in the latest release (v0.2.4), with code that was working a couple of months ago. I receive the following error:

```
Error in `httr2::req_perform()`:
! Failed to perform HTTP request.
Caused by error in `curl::curl_fetch_memory()`:
! Could not resolve host: https; Unknown error
```

It seems to me that the issue is with the `host` parameter. So far I have provided it in a format like `https://adb-<many_digits>.<single_digit>.azuredatabricks.net/`. According to the current documentation, the host should be in a format like `xxxxxxx.cloud.databricks.com`. Could you help me acquire the host address in the correct format?