wwjCMP

202 comments by wwjCMP

```
2024-05-15 05:12:32 Traceback (most recent call last):
2024-05-15 05:12:32   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
2024-05-15 05:12:32     result = await app(  # type: ignore[func-returns-value]
2024-05-15 05:12:32              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-15 05:12:32   File...
```

How can I set brain_settings?

> To correctly set up and use your locally deployed Ollama model with the `brain_settings`, follow these steps:
>
> 1. **Identify the Configuration File**: Find the configuration file in...
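The quoted steps amount to pointing the backend at the local Ollama server. A minimal sketch of the relevant configuration, assuming a Docker-based deployment; `OLLAMA_API_BASE_URL` appears later in this thread, but the host address is an assumption:

```shell
# Hypothetical .env fragment. host.docker.internal is an assumption
# for reaching an Ollama server on the host from inside a container;
# use http://localhost:11434 when running without Docker.
OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```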

I have correctly set OLLAMA_API_BASE_URL, how do I specify the model for the conversation?

I have assigned some Ollama models to users, but users can only select models in the brain; there is no model selection in the dialogue interface. Also, no Ollama...

```
2024-05-15 10:55:50 INFO: 172.20.0.1:53992 - "GET /brains/0602d8e3-f73a-4f8d-b41f-4df00fd8d471/ HTTP/1.1" 200 OK
2024-05-15 10:55:55 INFO: 172.20.0.1:53998 - "OPTIONS /chat HTTP/1.1" 200 OK
2024-05-15 10:55:56 INFO: 172.20.0.1:53998 - "POST /chat HTTP/1.1" 200 OK
...
```

I can modify the default embedding model; what I want to know is how to modify the default dialogue model, and then how users can switch their dialogue models.
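With Ollama's public REST API, the dialogue model is chosen per request via the `model` field of the `POST /api/chat` body, so switching models is just a matter of changing that field. A minimal sketch; the base URL and model names are assumptions:

```python
import json

# Assumed local Ollama deployment; matches the OLLAMA_API_BASE_URL
# variable discussed in this thread.
OLLAMA_API_BASE_URL = "http://localhost:11434"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's POST /api/chat endpoint.

    The `model` field selects the dialogue model for this request;
    switching models only requires changing this value.
    """
    return {
        "model": model,  # e.g. "llama2", "mistral"
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

body = build_chat_request("mistral", "Hello")
print(json.dumps(body))
```

Sending this body to `OLLAMA_API_BASE_URL + "/api/chat"` with any HTTP client would run the conversation against the named model, provided that model has been pulled on the server.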

In fact, I have already assigned a custom model to the user, but llama2 is still the only model invoked during question answering.

> [Configure the model as described here](https://github.com/MuiseDestiny/zotero-gpt/releases/tag/0.8.4)

That is exactly how I configured it; or can only the embedding model shown in the image be used?