Kevin Hu
**`docker-compose-gpu-CN-oc9`** is no longer maintained.
[FYI](https://docs.docker.com/compose/how-tos/gpu-support/)
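Per that page, GPU access is now requested directly in the service definition via `deploy.resources.reservations.devices`. A minimal sketch (the service name and image are placeholders):

```yaml
services:
  ragflow:
    image: your-image:latest   # placeholder image
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all          # or an integer to limit GPUs
              capabilities: [gpu]
```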
You could click the little lamp icon to see exactly what the question was changed to.
Clear this field.
Not supported yet.
That is an Ollama response error. Could you paste the Ollama error log here?
I did not see any errors from it. Weird!...
Weird! Did you set 16384 here? And how do you know the prompt was truncated?
You could search for `BAAI/bge-large-zh-v1.5` in the codebase and change it to the embedding model you want. But I suggest serving the embedding model independently via vLLM/Ollama/Xinference.
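A quick way to locate and swap the identifier from a Unix shell (the replacement model name is just a placeholder, and `sed -i` below is the GNU form):

```shell
# List every file referencing the default embedding model,
# then replace the identifier in place with your own model name.
# `xargs -r` skips sed entirely when grep finds no matches.
grep -rln "BAAI/bge-large-zh-v1.5" . \
  | xargs -r sed -i 's#BAAI/bge-large-zh-v1.5#your-embedding-model#g'
```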
What about signing up as a new user?