run-notebook
Hi there, can we pass a .json file to the `new-cluster-json:` parameter?
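One possible workaround, sketched below as workflow steps: the input appears to expect a JSON string rather than a file path, so the file's contents can be read in a preceding step and passed through. The file path `config/new-cluster.json`, the step id `cluster`, and the output name `spec` are illustrative only, and `jq` is assumed to be available on the runner.
```
# Sketch of a possible workaround (not an official feature): read the JSON file in a
# prior step and pass its contents to the input as a string. The file path, step id,
# and output name below are illustrative; jq is assumed to be on the runner and is
# used to compact the JSON to a single line.
- name: Read cluster spec from file
  id: cluster
  run: echo "spec=$(jq -c . config/new-cluster.json)" >> "$GITHUB_OUTPUT"
- name: Run notebook
  uses: databricks/run-notebook@main
  with:
    local-notebook-path: notebooks/test.py
    new-cluster-json: ${{ steps.cluster.outputs.spec }}
```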
Hello, for Azure, I like the recommended usage you describe in the README, leveraging an Azure Service Principal. However, the step to generate the AAD token for the SP is...
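For reference, one way to generate an AAD token for a service principal inside a workflow is via the Azure CLI; a minimal sketch follows, assuming the CLI is available on the runner and that the `ARM_CLIENT_ID`, `ARM_CLIENT_SECRET`, and `ARM_TENANT_ID` secrets exist. The resource ID used is the well-known application ID of the Azure Databricks resource.
```
# Minimal sketch (assumptions: Azure CLI on the runner; ARM_CLIENT_ID, ARM_CLIENT_SECRET,
# and ARM_TENANT_ID secrets are defined). 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the
# well-known application ID of the Azure Databricks resource.
- name: Generate AAD token for the service principal
  id: aad
  run: |
    az login --service-principal \
      -u "${{ secrets.ARM_CLIENT_ID }}" \
      -p "${{ secrets.ARM_CLIENT_SECRET }}" \
      --tenant "${{ secrets.ARM_TENANT_ID }}"
    token=$(az account get-access-token \
      --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d \
      --query accessToken -o tsv)
    echo "::add-mask::$token"          # mask the token in workflow logs
    echo "token=$token" >> "$GITHUB_OUTPUT"
```
The token can then be passed to a later step (for example via `${{ steps.aad.outputs.token }}`) wherever the action expects a Databricks access token.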
Thanks for this repo! I'm having an issue where I can't run imports in a notebook I'm trying to execute. I may be misunderstanding the readme / code but I'm...
Consider the following `action.yml`:
```
- name: Run notebook
  uses: databricks/run-notebook@main
  with:
    local-notebook-path: notebooks/test.py
    ...
    git-provider: 'gitHubEnterprise'
    git-commit: ${{ github.sha }}
    ...
```
The jobs spec in Databricks is: ```...
Hi, I have built a custom operator for Airflow 1.9 where I am using the requests library and trying to trigger a run through a POST to the 2.1 API...
Please dockerize this action so that it can be used with Argo Workflows and other CI/CD engines.
You are unable to run a notebook using serverless compute. The expected behaviour would be: if neither "new-cluster-json" nor "existing-cluster-id" is specified, the notebook is run...
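To illustrate the request, the desired usage might look like the sketch below. This is hypothetical, since serverless compute is not currently supported by the action; both compute inputs are simply omitted.
```
# Hypothetical usage if the feature were added: omitting both "new-cluster-json" and
# "existing-cluster-id" would submit the run on serverless compute. Not supported today.
- name: Run notebook on serverless compute
  uses: databricks/run-notebook@main
  with:
    local-notebook-path: notebooks/test.py
    # no new-cluster-json and no existing-cluster-id -> serverless (requested behaviour)
```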
This action awaits the completion of the job. If the job is long-running or perpetual, the Action times out when the access token it uses to communicate with...