mlflow-export-import
Exporting experiment on Databricks
Hi! I'm trying to export an experiment to a folder on Databricks. I got it working locally by running the commands listed in the README, but when I try the same thing in my Databricks workspace I run into trouble. After building the wheel, pushing it to DBFS, and installing it on the cluster, I run the following:
%sh
python -u -m mlflow_export_import.experiment.export_experiment \
--experiment /Shared/my_experiment \
--output-dir /dbfs/mnt/out \
--notebook-formats SOURCE
I then get the following error:
Traceback (most recent call last):
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/mlflow_export_import/experiment/export_experiment.py", line 82, in <module>
    main()
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/mlflow_export_import/experiment/export_experiment.py", line 78, in main
    exporter = ExperimentExporter(None, export_metadata_tags, utils.string_to_list(notebook_formats))
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/mlflow_export_import/experiment/export_experiment.py", line 22, in __init__
    self.fs = filesystem or _filesystem.get_filesystem()
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/mlflow_export_import/common/filesystem.py", line 50, in get_filesystem
    return DatabricksFileSystem() if use_databricks else LocalFileSystem()
  File "/databricks/conda/envs/databricks-ml/lib/python3.8/site-packages/mlflow_export_import/common/filesystem.py", line 13, in __init__
    self.dbutils = IPython.get_ipython().user_ns["dbutils"]
AttributeError: 'NoneType' object has no attribute 'user_ns'
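For context, the last frame of the traceback suggests what is going wrong: `%sh` launches a fresh Python subprocess that is not attached to the notebook's IPython kernel, so `IPython.get_ipython()` returns `None`, and the library's lookup of `dbutils` via `user_ns` then raises the `AttributeError`. A minimal, standalone sketch of that failure mode (not using mlflow-export-import itself):

```python
# Outside a live IPython/notebook kernel (e.g. inside a %sh subprocess),
# IPython.get_ipython() returns None, so any attribute access on the
# result -- such as .user_ns["dbutils"] in filesystem.py -- blows up.
try:
    import IPython
    ip = IPython.get_ipython()
except ImportError:  # also covers environments without IPython installed
    ip = None

if ip is None:
    # This is the situation the traceback shows: no kernel, no dbutils.
    print("No IPython kernel attached: dbutils is unavailable here")
else:
    # Only works inside an actual Databricks notebook Python cell.
    dbutils = ip.user_ns["dbutils"]
```

So running the CLI entry point from a notebook `%sh` cell hits this code path; the export would need to run in a context where the notebook's `dbutils` actually exists.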
Is this the right approach when running export/import on Databricks?
I am also getting the same issue. Is there any resolution?
Try using these Databricks notebooks: https://github.com/amesar/mlflow-export-import#databricks-notebooks