VfBfoerst
Hi @aarnphm, we upgraded to version openllm-0.1.20. Currently the bug seems to be fixed; we get a status code 200 from the /readyz endpoint. We will further investigate the other...
I don't know why, but it stopped working again. Meanwhile, we tried other models, but we did not change any config at all. /readyz again says:
```
Runners are...
```
> can you walk me through how you run this again?
>
> are you just doing `openllm start opt`?

Yeah, it's a virtual environment and we run `openllm start...`
> What is the resource you are running on?

It's a virtual machine running on VMware ESXi without a GPU; it is CPU-only.
Hi, we found out it was the proxy that caused the issue. We needed to add `127.0.0.1,localhost` to the no_proxy variable, e.g. (temporarily): `export no_proxy=127.0.0.1,localhost`
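For reference, a small sketch (not from the thread itself) of how to check whether a host ends up bypassing the proxy once no_proxy is set; the hostnames below are just examples:

```python
# Sketch: check which hosts bypass the proxy according to no_proxy.
# The entries mirror the workaround above; adjust as needed.
import os
import urllib.request

os.environ["no_proxy"] = "127.0.0.1,localhost"

for host in ("127.0.0.1", "localhost", "example.com"):
    # On Linux, proxy_bypass() consults no_proxy and is truthy when the
    # host should be contacted directly instead of via the proxy.
    print(host, "->", bool(urllib.request.proxy_bypass(host)))
```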
> curl http://localhost:11434/api/generate -d '{"model": "MODELNAME", "keep_alive": 0}'

worked for me :)
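The same call can be made from Python with just the standard library; a rough equivalent of the curl above (MODELNAME stays a placeholder, and the default Ollama port 11434 is assumed):

```python
# Rough Python equivalent of the curl command above: POST to the Ollama
# /api/generate endpoint with keep_alive set to 0 to unload the model.
import json
import urllib.request

payload = json.dumps({"model": "MODELNAME", "keep_alive": 0}).encode()
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())
```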
> @VfBfoerst what is NO_PROXY? can you share more re: requested implementation?

@krrishdholakia The no_proxy environment variable on Linux is used to define exceptions (IPs, URLs) that should not go through the proxy...
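As a quick illustration (not from the original comment), this is how a Python process sees those variables; the values are whatever happens to be exported in the environment:

```python
# Sketch: list the proxy configuration Python derives from the environment.
# http_proxy / https_proxy end up as the 'http' / 'https' keys, and the
# no_proxy exception list shows up under the 'no' key.
import urllib.request

print(urllib.request.getproxies())
# e.g. {'http': 'http://proxy.example.com:3128', 'no': '127.0.0.1,localhost'}
```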
@krrishdholakia The environment variable does not seem to be passed through correctly. The no_proxy env variable is defined as follows: `no_proxy=123.123.123.123,123.123.123.123,123.123.123.123`, which leads to a ValueError:
```python
Traceback (most recent call...
```
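A sketch for narrowing this down (assumption: the failing code uses httpx with trust_env enabled, which is the default); the URL is only a placeholder:

```python
# Sketch: httpx reads http_proxy/https_proxy/no_proxy only when
# trust_env=True (the default). Turning it off bypasses that parsing
# entirely, which helps confirm whether the env vars trigger the error.
import httpx

client = httpx.Client(trust_env=False)
resp = client.get("http://127.0.0.1:4000/health")  # placeholder endpoint
print(resp.status_code)
```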
@krrishdholakia Can you reopen the issue please?
@krrishdholakia no_proxy is usually set like:
```bash
no_proxy=123.123.123.123,localhost,127.0.0.1,test.de,host.containers.internal
```
In the httpx documentation, no_proxy is set like "http://123.123.123.123", so maybe it will fix the issue if you set http:// in...
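For completeness, a sketch of configuring the proxy and its exceptions explicitly on the client instead of relying on no_proxy; the proxy URL is a placeholder, and depending on the httpx version the `proxies` mapping may have been replaced by `mounts`:

```python
# Sketch: configure the proxy and its exceptions on the client itself,
# sidestepping no_proxy parsing. This uses the `proxies` mapping accepted
# by older httpx releases (newer ones expose `mounts` instead).
import httpx

client = httpx.Client(
    proxies={
        "all://": "http://proxy.example.com:3128",  # placeholder proxy URL
        "all://localhost": None,                    # send localhost traffic directly
        "all://127.0.0.1": None,
    }
)
```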