
Sample base images for Databricks Container Services

70 issues

Is there an ETA for when this image will be released as a Docker container?

Hi, I see that on Docker Hub there's only support for the amd64 architecture. Seeing that arm64 is becoming increasingly popular due to the M1 Macs, it would be a great improvement if...

Is Auto Loader support possible on the standard runtime container? This would be helpful for unit testing streams that utilize this feature. Currently, `readStream.format("cloudFiles")` throws `java.lang.ClassNotFoundException: Failed to find data source:...

Hi, just opening an issue for the lack of Ganglia support, despite it being [written in the docs](https://docs.databricks.com/clusters/custom-containers.html#step-1-build-your-base): > Of course, the minimal requirements listed above do not include Python, R,...

Is NVTabular supported for any of the containers listed here? This is a wild shot, but it seems that the NVTabular team was referencing a container built by the RAPIDS...

Databricks notebooks attached to clusters using the provided Databricks runtime version 10.4-LTS are able to import and use `seaborn`. When building a custom Docker image from `databricksruntime/standard:10.4-LTS` (which builds...
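A common workaround for missing libraries like this is to extend the base image and install the package explicitly. A minimal sketch, assuming the standard image keeps its Python environment under `/databricks/python3` and that the pinned version below (an example, not necessarily the one documented for DBR 10.4) is what you want:

```dockerfile
# Hypothetical Dockerfile: add seaborn on top of the standard base image,
# since the base image may not ship every library the managed runtime does.
FROM databricksruntime/standard:10.4-LTS

# Install into the runtime's Python environment (path is an assumption
# based on how these images lay out their virtualenv).
RUN /databricks/python3/bin/pip install seaborn==0.11.2
```

Pinning the version keeps the custom image reproducible across rebuilds.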

Databricks runtime containers based on CentOS 8.

We are using a custom JAR in our job clusters that we add as part of the init script. Is it possible to add the same JAR while creating a...
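One way to avoid the init script is to bake the JAR into the custom image itself. A minimal sketch, assuming the runtime picks up extra JARs placed in `/databricks/jars` (the directory where the runtime's own JARs live); the file name below is a placeholder:

```dockerfile
# Hypothetical Dockerfile: copy a job's JAR into the image at build time
# instead of downloading it in a cluster init script.
FROM databricksruntime/standard:10.4-LTS

# Placing the JAR alongside the runtime's JARs puts it on the classpath
# (assumption -- verify against your runtime version's docs).
COPY my-custom-library.jar /databricks/jars/
```

This trades init-script flexibility for faster, more reproducible cluster startup.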

Hi, the Python dependencies' versions listed in the [python dockerfile](https://github.com/databricks/containers/blob/b459ed89526ebb8c118bfb591f357f5cda9a50ec/ubuntu/python/Dockerfile#L17) do not match the versions listed in the release's [documentation page](https://docs.databricks.com/release-notes/runtime/10.4.html). To be exact:
```
ipython==7.19.0 \
numpy==1.19.2 \
```
is not...
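A mismatch like this can be caught programmatically inside the running container. A small sketch (the helper name and the expected-version dict are hypothetical, not part of the repo) that diffs installed package versions against the versions a Dockerfile or docs page claims:

```python
# Hypothetical helper: compare installed package versions against an
# expected set of pins (e.g. the versions a release-notes page lists).
from importlib import metadata


def version_mismatches(expected: dict[str, str]) -> dict[str, tuple[str, str]]:
    """Return {name: (expected, installed)} for packages whose installed
    version differs from the expected pin; missing packages map to 'absent'."""
    diffs = {}
    for name, want in expected.items():
        try:
            have = metadata.version(name)
        except metadata.PackageNotFoundError:
            have = "absent"
        if have != want:
            diffs[name] = (want, have)
    return diffs


# Example usage with the pins quoted above (values are from the issue,
# whether they match your container depends on the image you built):
# version_mismatches({"ipython": "7.19.0", "numpy": "1.19.2"})
```

Running this in both the custom container and a managed cluster makes the drift between the two environments explicit.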

In the Standard image, it lists Spark Submit jobs, but after `cd`ing into the `/bin` directory, I don't see any `spark-submit` script...