PermissionError: [Errno 13] Permission denied: '/home/triton-server'
Description
While running the Triton Inference Server using the k8s-onprem example, I get the following error:
PermissionError: [Errno 13] Permission denied: '/home/triton-server'
The error occurs while downloading the Hugging Face tokenizer.
Stack trace:
There was a problem when trying to write in your cache folder (/home/triton-server/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
There was a problem when trying to write in your cache folder (/home/triton-server/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
I0321 14:08:23.693707 1 python_be.cc:2362] TRITONBACKEND_ModelInstanceInitialize: preprocessing_0_0 (CPU device 0)
I0321 14:08:23.695166 1 python_be.cc:2362] TRITONBACKEND_ModelInstanceInitialize: postprocessing_0_0 (CPU device 0)
There was a problem when trying to write in your cache folder (/home/triton-server/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
There was a problem when trying to write in your cache folder (/home/triton-server/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
I0321 14:08:24.032595 1 pb_stub.cc:346] Failed to initialize Python stub: PermissionError: [Errno 13] Permission denied: '/home/triton-server'
I tried setting the environment variable in deployment.yaml, but the error remained the same.
Also, the /home/triton-server directory does not exist.
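For reference, the kind of change I tried looked roughly like this (a sketch only; the container name and manifest structure here are illustrative, not the exact contents of the chart's deployment.yaml):

```yaml
# Illustrative fragment of deployment.yaml: point the Hugging Face
# cache at a directory the container user can write to.
spec:
  template:
    spec:
      containers:
        - name: triton
          image: nvcr.io/nvidia/tritonserver:24.01-trtllm-python-py3
          env:
            - name: TRANSFORMERS_CACHE
              value: /tmp/huggingface
```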
Triton Information
I am using the nvcr.io/nvidia/tritonserver:24.01-trtllm-python-py3 Triton container.
To Reproduce
Run helm install example .
Hello @tapansstardog, thanks for reaching out.
- Can you confirm that you are following the https://github.com/triton-inference-server/server/blob/main/deploy/k8s-onprem/README.md tutorial?
- Can you create /home/triton-server with the appropriate permissions and try again?
Thanks
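One way to try the second suggestion without rebuilding the image (a sketch, assuming the pod spec used by the k8s-onprem chart; the container and volume names are illustrative) is to mount an emptyDir volume at the missing home directory, so it exists and is writable by the container user:

```yaml
# Illustrative fragment of deployment.yaml: give the container a
# writable /home/triton-server backed by an emptyDir volume.
spec:
  template:
    spec:
      volumes:
        - name: triton-home
          emptyDir: {}
      containers:
        - name: triton
          volumeMounts:
            - name: triton-home
              mountPath: /home/triton-server
```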
- Yes @indrajit96, I am following the link you mentioned.
- I found that the /home/triton-server directory does not exist. When I changed the TRANSFORMERS_CACHE environment variable to /tmp/, I was able to proceed. That means the user had access to /tmp.
Same issue for me. Changing TRANSFORMERS_CACHE still does not work in my case.