databricks-sdk-py
[ISSUE] Using dbutils.fs to interact with Volumes fails with "PermissionDenied: No operations allowed on this path."
Description
When trying to use a command such as:
w.dbutils.fs.ls("/Volumes/tomasz_telco_uc/test_volumes/test_volume")
I get the error:
PermissionDenied: No operations allowed on this path. Config: host=https://e2-demo-field-eng.cloud.databricks.com, token=***, auth_type=pat
To verify that the location was accessible by the user, the command
w.files.download("/Volumes/tomasz_telco_uc/test_volumes/test_volume/image.png")
works to download a file from the same location.
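For reference, a minimal repro sketch of the two calls above (the paths are the ones from this report; WorkspaceClient() assumes credentials are configured via environment variables or a profile, and reading the download response via .contents is an assumption about the Files API return type):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
volume_dir = "/Volumes/tomasz_telco_uc/test_volumes/test_volume"

# Fails with PermissionDenied: the SDK's dbutils.fs sends this to the DBFS API
# (/api/2.0/dbfs/list), which rejects /Volumes paths (see the debug logs below).
w.dbutils.fs.ls(volume_dir)

# Works: the Files API accepts Unity Catalog Volume paths.
resp = w.files.download(volume_dir + "/image.png")
data = resp.contents.read()  # assumption: the response exposes a readable .contents stream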
Expected behavior
It should be possible to use dbutils.fs commands such as ls to interact with Volumes, as long as the user has access to the Volume.
Is it a regression?
This was only attempted on the newest version, 0.20.0. The feature was not available in previous versions.
Debug Logs
Debug logging was enabled by adding logging.basicConfig(level=logging.DEBUG) to the program; the resulting logs are included below.
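As a minimal sketch, the request in the logs can be reproduced by turning on debug logging before issuing the call:

import logging

from databricks.sdk import WorkspaceClient

logging.basicConfig(level=logging.DEBUG)  # surfaces the SDK's request/response debug logs

w = WorkspaceClient()
w.dbutils.fs.ls("/Volumes/tomasz_telco_uc/test_volumes/test_volume")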
DEBUG:urllib3.connectionpool:https://e2-demo-field-eng.cloud.databricks.com:443 "GET /api/2.0/dbfs/list?path=%2FVolumes%2Ftomasz_telco_uc%2Ftest_volumes%2Ftest_volume%2F HTTP/1.1" 403 None
DEBUG:databricks.sdk:GET /api/2.0/dbfs/list?path=/Volumes/tomasz_telco_uc/test_volumes/test_volume/
< 403 Forbidden
< {
< "error_code": "PERMISSION_DENIED",
< "message": "No operations allowed on this path"
< }
Other Information
- Version: 0.20.0
@tomaszb-db thanks for raising! Currently, Volumes and UC paths are not supported in dbutils through the Python SDK. We are working on unblocking this use case and will notify you as soon as it is done. Thanks!
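Until that support lands, one possible interim approach is to call the Files API directly, which does accept Volume paths per this report. This is only a sketch: list_directory_contents and the entry field names are assumptions about newer SDK releases, while download is the call already shown to work above.

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
volume_dir = "/Volumes/tomasz_telco_uc/test_volumes/test_volume"

# Assumed: newer SDK releases expose directory listing on the Files API.
for entry in w.files.list_directory_contents(volume_dir):
    print(entry.path)

# Download a single file, as already shown to work earlier in this issue.
resp = w.files.download(volume_dir + "/image.png")
with open("image.png", "wb") as f:
    f.write(resp.contents.read())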
Has this issue been resolved? I am encountering the same/similar issue when running dbutils on a volume.
Error: Status code: -1 error code: null error message: Cannot resolve hostname:
dbutils was working about a week ago but is now failing. Per the Microsoft documentation, it should work:
https://learn.microsoft.com/en-us/azure/databricks/discover/files
Not sure about Azure, but at least on AWS that command is working for me.