Microsoft.Azure.Databricks.Client library on NuGet doesn't expose '/fs/files' API #235
Comments
At the moment, just to run a test on my local machine, I downloaded the source code from GitHub and added the required new method to the IDbfsApi interface plus the DbfsApiClient class implementation. Using the endpoint specified in the documentation, I'm able to upload files into the correct volume on Databricks.
Can you use DbfsApiClient.Upload to upload the file? Azure Databricks supports the dbfs format for volume paths.
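For context, a minimal sketch of trying that suggestion, assuming the DatabricksClient.CreateClient(baseUrl, token) factory, a Dbfs.Upload(path, overwrite, stream) overload, and a dbfs:/Volumes/... path format; none of these have been re-verified against version 2.6.0, and the workspace URL, token, and paths are placeholders:

```csharp
using System.IO;
using Microsoft.Azure.Databricks.Client;

// Placeholder workspace URL and personal access token.
using var client = DatabricksClient.CreateClient(
    "https://adb-1234567890123456.7.azuredatabricks.net", "<personal-access-token>");

// Local JSON file to push into the volume.
await using var json = File.OpenRead("data.json");

// Assumed dbfs-style volume path; whether the DBFS API accepts volume paths at all
// is exactly what this issue is about.
await client.Dbfs.Upload("dbfs:/Volumes/my_catalog/my_schema/my_volume/data.json", true, json);
```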
Thanks for your reply @memoryz. I tried the DbfsApiClient.Upload method, but it doesn't work; I get an error when I specify the remote filename as a volume path. This Upload method internally uses the Create method, which builds and calls the endpoint $"{ApiVersion}/dbfs/create". Just to make a test, I implemented a Create2 method that uses the PUT method on a different endpoint, following the documentation here: https://docs.databricks.com/api/workspace/files/upload.
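The Create2 body isn't shown above, so purely as an illustration of the approach being described, here is a hedged sketch of what such a method could look like inside DbfsApiClient. The name Create2 comes from the comment, but the HttpClient member and the ApiVersion constant are assumptions about the library internals; the PUT verb, the /api/2.0/fs/files{file_path} route, and the overwrite query parameter come from the linked documentation.

```csharp
// Sketch only: assumes the surrounding DbfsApiClient exposes an authenticated HttpClient
// and the same ApiVersion string used elsewhere (e.g. in $"{ApiVersion}/dbfs/create").
// Requires: using System.IO; using System.Net.Http; using System.Net.Http.Headers;
//           using System.Threading; using System.Threading.Tasks;
public async Task Create2(string filePath, Stream contents, bool overwrite = true,
    CancellationToken cancellationToken = default)
{
    // The Files API takes the destination path in the URL itself,
    // e.g. /Volumes/<catalog>/<schema>/<volume>/data.json, and the raw bytes as the body.
    var requestUri = $"{ApiVersion}/fs/files{filePath}?overwrite={(overwrite ? "true" : "false")}";

    using var body = new StreamContent(contents);
    body.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

    using var response = await HttpClient.PutAsync(requestUri, body, cancellationToken);
    response.EnsureSuccessStatusCode();
}
```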
I see. Maybe the DBFS API doesn't support volumes, given that the volumes feature was released much later than DBFS. I'll see if I can set up an environment with catalog enabled and give it a try.
I'm trying to push a new branch to this repo with my temporary solution for this issue, but it seems that I don't have the grants/permissions.
Can you fork the repo and send a PR from your fork?
Hello everybody!
I'm using this library from NuGet:
https://www.nuget.org/packages/Microsoft.Azure.Databricks.Client/
because I need to connect to and ingest data into a Databricks service hosted on Azure.
In my particular case, I need to upload a JSON file into a volume.
According to this documentation:
https://docs.databricks.com/api/workspace/files/upload
I need to use the endpoint with PUT method:
/api/2.0/fs/files{file_path}
It seems that this endpoint is not exposed in the latest version of Microsoft.Azure.Databricks.Client (currently 2.6.0).
Am I wrong?
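Until the library exposes this endpoint, one caller-side workaround is to call the documented Files API directly with HttpClient. A minimal sketch, assuming a personal access token for authentication; the workspace URL, token, and paths are placeholders, and error handling is reduced to EnsureSuccessStatusCode:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

var workspaceUrl = "https://adb-1234567890123456.7.azuredatabricks.net"; // placeholder
var volumePath   = "/Volumes/my_catalog/my_schema/my_volume/data.json";  // placeholder

using var http = new HttpClient { BaseAddress = new Uri(workspaceUrl) };
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", "<personal-access-token>");

await using var file = File.OpenRead("data.json");
using var body = new StreamContent(file);
body.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

// PUT /api/2.0/fs/files{file_path}?overwrite=true, per the Files API documentation linked above.
using var response = await http.PutAsync($"/api/2.0/fs/files{volumePath}?overwrite=true", body);
response.EnsureSuccessStatusCode();
```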