mlflow-docker

Jupyter notebook fails storing artifacts
Hi, artifact storing fails in the Jupyter notebook, but the test scripts work fine when run in the terminal. It says access denied. Do you know how to fix this? Thanks
- Did you use the environment variables?
- How do you run the app? Does the environment differ from when you run the test scripts? Are you running everything on the local machine or somewhere else?
- Please provide a Minimal, Reproducible Example
 
Thanks. I start the Jupyter server from a remote machine. The server is configured properly with environment variables. I start Jupyter with the terminal command 'nohup ssh -f user@ip "python3 -m notebook --user=user --no-browser --port=port --NotebookApp.token='' --NotebookApp.password=''"; port_forward'
For some reason it seems that it won't pick up the variables properly. However, if I start Jupyter by logging in to the server, everything works fine. I think this is not an issue with mlflow but something to do with Jupyter. Even if I load the env variables from Jupyter it does not work. The strange thing is that everything else updates to mlflow except the artifacts.
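For anyone hitting the same symptom: a likely cause (an assumption on my part, not confirmed in this thread) is that a non-interactive `ssh ... "command"` invocation skips the login shell's profile, so variables like `AWS_ACCESS_KEY_ID` are never exported to the notebook process. Setting them in `os.environ` can still work, but only if it happens before mlflow/boto3 creates its first S3 client; otherwise restart the kernel first. A minimal sketch with placeholder values:

```python
import os

# Export credentials before any mlflow/boto3 client is created.
# All values below are placeholders -- substitute your real ones.
os.environ["MLFLOW_TRACKING_URI"] = "http://tracking-server:5000"
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://minio:9000"
os.environ["AWS_ACCESS_KEY_ID"] = "example-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "example-secret"

# Verify the notebook kernel actually sees the variables.
for var in ("MLFLOW_TRACKING_URI", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"):
    assert os.environ.get(var), f"{var} is not set in this kernel"
print("environment OK")
```

If the assertions pass inside the notebook but artifact uploads still fail, the credentials were probably read earlier in the kernel's lifetime, so restarting the kernel and running this cell first is worth trying.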
@NuwanCW About 25 days ago, mlflow posted news about a feature that lets you stream artifacts through mlflow into S3. That would eliminate this problem. Let me know if this helps! If so, I could create a PR to simplify this project's architecture/configuration. Source thread: https://github.com/mlflow/mlflow/issues/629 Docs: https://mlflow.org/docs/latest/tracking.html#using-the-tracking-server-exclusively-for-proxied-artifact-access
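For reference, the proxied-artifact mode described in the linked docs is enabled with server flags roughly like the following (a sketch: the backend URI, bucket, and ports are placeholders, and exact flag availability depends on your mlflow version):

```shell
# Tracking server proxies artifact uploads/downloads itself,
# so notebook clients no longer need S3 credentials at all.
mlflow server \
  --host 0.0.0.0 --port 5000 \
  --backend-store-uri postgresql://user:pass@db:5432/mlflow \
  --serve-artifacts \
  --artifacts-destination s3://mlflow-artifacts/
```

With this setup, clients only need `MLFLOW_TRACKING_URI`, which would sidestep the missing-AWS-variables problem in the notebook entirely.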
Closing for now; no activity for ~6 months.