ollama-python
Unable to set environment variables - OLLAMA_NUM_PARALLEL, OLLAMA_NUM_GPU, and OLLAMA_NUM_THREAD
I have set os.environ["OLLAMA_NUM_PARALLEL"] = "4" in my Python script, but it is not running 4 LLM requests in parallel.
Please help me with this issue.
Could you show the snippet of code you're making requests with?
@harshachopra507 those OLLAMA_ environment variables are read by the server and are not passed through from the Python client. Set them using the appropriate technique for the way you start ollama serve - in the Dockerfile, the docker run command, or the systemd unit.
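For example (a rough sketch, not tied to any particular install - the model name, prompts, and service details below are placeholders): with the systemd install you would add Environment="OLLAMA_NUM_PARALLEL=4" under [Service] via systemctl edit ollama, then restart the service; with Docker you would pass -e OLLAMA_NUM_PARALLEL=4 to docker run. Once the server is started that way, the Python client only needs to issue its requests concurrently, e.g. with a thread pool:

```python
import concurrent.futures

import ollama

# Placeholder model and prompts; use whatever model you have pulled locally.
MODEL = "llama3"
PROMPTS = [
    "Why is the sky blue?",
    "Give me one haiku about the sea.",
    "Summarize the rules of chess in two sentences.",
    "What is the capital of Peru?",
]

def ask(prompt: str) -> str:
    # Each call is an independent HTTP request to the ollama server;
    # OLLAMA_NUM_PARALLEL on the *server* decides how many run at once.
    response = ollama.chat(model=MODEL, messages=[{"role": "user", "content": prompt}])
    # On newer client versions you can also use response.message.content
    return response["message"]["content"]

# Four threads submit four requests at the same time; with OLLAMA_NUM_PARALLEL=4
# the server can process them in parallel instead of queueing them one by one.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    for answer in pool.map(ask, PROMPTS):
        print(answer[:80])
```

Setting os.environ in this script has no effect on the server because the variable only exists inside the client process; it has to be present in the environment of the ollama serve process itself.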
Issue gardening. @harshachopra507: did it work? Can the issue be closed? (let's keep the issue list tidy to help maintainers)