Evan Ye
@sebbegg I'll answer your question in https://github.com/databricks/containers/issues/22
@dipesh747 the runtime is proprietary and we do not distribute it for local testing. Please discuss with your Databricks account manager / support team whether there are other options here.
@mengxr @Loquats does the above make sense? any potential issues?
Seems like the people who might know are busy. Can you try it out and report back? I think the PATH might be dynamically modified during startup.
What are you trying to package? For jars, you can drop them into /databricks/jars inside the container, and the runtime should pick them up. For Python libraries, see the included...
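For illustration, a minimal Dockerfile sketch of the jar route, assuming the databricksruntime/standard base image and a hypothetical my-lib.jar built locally:

```dockerfile
FROM databricksruntime/standard:latest

# Anything copied under /databricks/jars lands on the runtime classpath
# when the cluster starts. my-lib.jar is a hypothetical local artifact.
COPY my-lib.jar /databricks/jars/

# Python libraries come from the image itself, so bake them in here.
# The pip path assumes the standard image's layout and may vary by base image.
RUN /databricks/python3/bin/pip install requests
```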
The best place to bring this up would be the Databricks Ideas Portal, which you should have access to as a customer.
Only runtime jar/Scala libraries are injected when creating a cluster. Python libraries are derived from your base image itself, and these base images are not kept in sync with the...
Try adding your jars to `/databricks/jars`. This folder is on the classpath and will be included when we start Spark.
@rd-rohit I think executors also use `/databricks/jars` - can you try without the additional options?
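To make that concrete, a hedged Dockerfile sketch, assuming the extra options in question were Spark classpath settings like spark.driver.extraClassPath / spark.executor.extraClassPath, and that the jars live in a hypothetical local libs/ directory:

```dockerfile
FROM databricksruntime/standard:latest

# Jars copied here are visible to both the driver and the executors,
# so extraClassPath-style overrides should no longer be necessary.
COPY libs/*.jar /databricks/jars/
```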
@rd-rohit Good to know, thanks! The classpath dependencies here can be complicated.