Running notebook won't import other notebooks
Thanks for this repo!
I'm having an issue where I can't run imports in a notebook I'm trying to execute. I may be misunderstanding the readme / code but I'm not seeing how to accomplish this.
For example, say I have a GitHub repo that has a notebook at
src/notebooks/test-notebook.scala
and has a util file at
src/utils/foo.scala
so the structure looks like this:
-- src
   |_ notebooks
   |  |_ test-notebook.scala
   |_ utils
      |_ foo.scala
In test-notebook.scala there is a cell that does a %run ../utils/foo.scala to import that util.
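For concreteness, the checked-in source of test-notebook.scala looks roughly like this (a sketch assuming the Databricks "source" export format for Scala notebooks; everything after the %run cell is just a placeholder):

```scala
// Databricks notebook source
// MAGIC %run ../utils/foo.scala

// COMMAND ----------

// Anything defined in foo.scala is now in scope for the rest of the notebook.
println("test-notebook body runs after the %run cell")
```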
This works fine if I execute it in a notebook, but when I run it through this action with local-notebook-path set to src/notebooks/test-notebook.scala, the foo.scala file is never found or imported.
Hope that makes sense!
Is there something I need to configure to be able to properly import the foo.scala file into test-notebook.scala??
Thanks again!
Hey @fedreg-bn, are you following the recommended way of running the notebook within a temporary checkout of the entire repo? https://github.com/databricks/run-notebook#recommended-run-notebook-within-a-temporary-checkout-of-the-current-repo
That way you will have access to the other files in the repo from your Scala Databricks notebook.
Hi @mohamad-arabi thanks for the reply!!
I did try that approach but the job was not able to import other notebooks it depended on.
It's possible that I missed something, but I tried several combinations of local-notebook-path and was unsuccessful. Using the example I posted above, test-notebook.scala was never able to find foo.scala no matter how I structured the configuration for the action.
In the meantime I just wrote a tiny script that calls /api/2.1/jobs/runs/submit directly and then checks the status. This is working fine for us, but if the case I outlined above is meant to be supported, I'll circle back in the future and give this action another try.
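Roughly, the script is just a submit-and-poll against the Jobs API 2.1. A minimal sketch of that flow (shown in Scala for illustration; the host/token env vars, cluster id, and workspace notebook path are placeholders, and the JSON handling is deliberately crude) looks like this:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object SubmitAndWait {
  private val host   = sys.env("DATABRICKS_HOST")   // e.g. https://<workspace>.cloud.databricks.com
  private val token  = sys.env("DATABRICKS_TOKEN")
  private val client = HttpClient.newHttpClient()

  // Minimal HTTP helper: POST when a body is given, GET otherwise.
  private def request(path: String, body: Option[String]): String = {
    val builder = HttpRequest.newBuilder(URI.create(host + path))
      .header("Authorization", s"Bearer $token")
      .header("Content-Type", "application/json")
    val req = body match {
      case Some(b) => builder.POST(HttpRequest.BodyPublishers.ofString(b)).build()
      case None    => builder.GET().build()
    }
    client.send(req, HttpResponse.BodyHandlers.ofString()).body()
  }

  def main(args: Array[String]): Unit = {
    // Submit a one-time run of the notebook that already exists in the workspace.
    val submitBody =
      """{
        |  "run_name": "ci-test-notebook",
        |  "tasks": [{
        |    "task_key": "test",
        |    "existing_cluster_id": "1234-567890-abcde123",
        |    "notebook_task": { "notebook_path": "/Repos/ci/my-repo/src/notebooks/test-notebook" }
        |  }]
        |}""".stripMargin
    val submitResp = request("/api/2.1/jobs/runs/submit", Some(submitBody))
    val runId = "\"run_id\"\\s*:\\s*(\\d+)".r
      .findFirstMatchIn(submitResp)
      .map(_.group(1))
      .getOrElse(sys.error(s"unexpected submit response: $submitResp"))

    // Poll runs/get until the run reaches a terminal lifecycle state.
    val terminal = Set("TERMINATED", "SKIPPED", "INTERNAL_ERROR")
    var state = ""
    while (!terminal.contains(state)) {
      Thread.sleep(30000)
      val status = request(s"/api/2.1/jobs/runs/get?run_id=$runId", None)
      state = "\"life_cycle_state\"\\s*:\\s*\"(\\w+)\"".r
        .findFirstMatchIn(status).map(_.group(1)).getOrElse("")
      println(s"run $runId life_cycle_state=$state")
    }
  }
}
```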
Thanks again