
[Bug] Cannot connect to local Ollama server

Open lucksufe opened this issue 4 months ago • 26 comments

Bug Description

Cannot connect to the local Ollama server. Ollama and NextChat are both on the latest version. I can get an Ollama response from a Python script, so the server itself is OK.
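For reference, a curl sketch of an equivalent direct request (the model name here is a placeholder); a call like this succeeds because no browser CORS preflight is involved:

curl http://localhost:11434/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"model":"llama2","messages":[{"role":"user","content":"hello"}],"stream":false}'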

Steps to Reproduce

[screenshot: 微信图片_20240305174858]

Expected Behavior

[screenshot: 微信截图_20240305174824]

Screenshots

No response

Deployment Method

  • [ ] Docker
  • [ ] Vercel
  • [ ] Server

Desktop OS

Windows 10

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

lucksufe avatar Mar 05 '24 09:03 lucksufe

[GIN] 2024/03/05 - 21:34:14 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:34:18 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/usage?start_date=2024-03-01&end_date=2024-03-06"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/subscription"

The Ollama log shows 403 for NextChat's requests.
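The rejected preflight can be reproduced by hand with a request along these lines (a sketch; the Origin value is an example, and a 403 here means OLLAMA_ORIGINS does not cover that origin):

curl -i -X OPTIONS http://localhost:11434/v1/chat/completions \
  -H 'Origin: http://localhost:3000' \
  -H 'Access-Control-Request-Method: POST' \
  -H 'Access-Control-Request-Headers: authorization,content-type'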

lucksufe avatar Mar 05 '24 15:03 lucksufe

Error connecting to the local Ollama server.

[screenshots]

Any fix, please?

Alias4D avatar Mar 05 '24 21:03 Alias4D

This is why Ollama support is still not stable or fully compatible with this repository, particularly for desktop use. The owner released it without comprehensive testing across various operating systems.

H0llyW00dzZ avatar Mar 06 '24 02:03 H0llyW00dzZ

Referrer Policy: strict-origin-when-cross-origin

Maybe it's caused by this policy, but I have already created my user variables (OLLAMA_ORIGINS=*://localhost and OLLAMA_HOST=0.0.0.0) following the instructions below.

Setting environment variables on Windows:

On Windows, Ollama inherits your user and system environment variables.

First, quit Ollama by clicking on it in the taskbar.

Edit system environment variables from the Control Panel.

Edit or create new variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc.

Click OK/Apply to save.

Run ollama from a new terminal window.
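Equivalently, from a Command Prompt (a sketch of the same steps; setx persists the values for newly started processes, so restart Ollama afterwards):

setx OLLAMA_HOST "0.0.0.0"
setx OLLAMA_ORIGINS "*://localhost"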

lucksufe avatar Mar 06 '24 03:03 lucksufe

(quoting lucksufe's comment above about Referrer Policy and the OLLAMA_ORIGINS/OLLAMA_HOST variables)

It still doesn't work now?

H0llyW00dzZ avatar Mar 06 '24 03:03 H0llyW00dzZ

Also, I don't think it's because of strict-origin-when-cross-origin.

The strict-origin-when-cross-origin policy strikes a balance between security/privacy and functionality. Here's how it works:

  • Same-origin requests: When a request is made to the same origin, the full URL of the document making the request is sent in the Referer header, just as under the legacy default policy no-referrer-when-downgrade.
  • Cross-origin requests: For requests to a different origin at the same security level, this policy sends only the origin (scheme, host, and port) in the Referer header, omitting the path and query string. This reduces the amount of potentially sensitive information shared across origins.
  • Downgrade navigation: If a website served over HTTPS makes a request to an HTTP resource, this policy sends no Referer header at all, since the security level is downgraded. A sketch of all three cases follows.
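Roughly, the Referer header per case looks like this (hypothetical URLs):

Same-origin:   https://app.example/page -> https://app.example/api
               Referer: https://app.example/page
Cross-origin:  https://app.example/page -> https://other.example/api
               Referer: https://app.example/
HTTPS -> HTTP: https://app.example/page -> http://plain.example/api
               (no Referer header sent)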

If you think it's because of strict-origin-when-cross-origin, then Ollama is really badly broken.

H0llyW00dzZ avatar Mar 06 '24 03:03 H0llyW00dzZ

(quoting the exchange above: "It still doesn't work now?")

Still 403 Forbidden. I also copied and pasted the POST contents and headers from ChatGPT-Next-Web into Python, and it works. The only difference I can see is "Referrer Policy: strict-origin-when-cross-origin" in the ChatGPT-Next-Web POST request.

I switched to llama.cpp to run a server and deleted Ollama.

lucksufe avatar Mar 06 '24 09:03 lucksufe

@lucksufe According to your Ollama logs, it seems NextChat's requests are being blocked by the CORS policy. It looks like the env you've set hasn't taken effect in your Ollama instance. A quick way to check is sketched below.
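One way to rule out the GUI-set variables not being picked up is to set them inline in a fresh terminal session and start the server there (Windows Command Prompt sketch, using the values from this thread):

set OLLAMA_HOST=0.0.0.0
set OLLAMA_ORIGINS=*://localhost
ollama serve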

fred-bf avatar Mar 07 '24 08:03 fred-bf

I have set the env OLLAMA_ORIGINS to *://localhost but still get 403. My system is Windows 10.

kaikanertan avatar Mar 07 '24 09:03 kaikanertan

The Ollama API may have been modified (my Ollama version is 0.1.28). I copied the request from the Chrome browser into a third-party tool in curl format, and Ollama returned 404:

curl 'http://localhost:11434/api/v1/chat/completions' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'DNT: 1' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H 'Referer;' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"messages":[{"role":"user","content":"你好呀"}],"stream":true,"model":"llava:latest","temperature":0.5,"presence_penalty":0,"frequency_penalty":0,"top_p":1}'

[screenshot: PixPin_2024-03-07_23-00-19]

Referring to another web UI (ollama-webui-lite), it uses the following APIs for communication:

http://localhost:11434/api/tags
http://localhost:11434/api/version

http://localhost:11434/api/chat
http://localhost:11434/api/generate
curl 'http://localhost:11434/api/chat' \
  -H 'Accept: */*' \
  -H 'Accept-Language: zh-CN,zh;q=0.9,en;q=0.8,en-GB;q=0.7,en-US;q=0.6' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: text/event-stream' \
  -H 'DNT: 1' \
  -H 'Origin: http://localhost:3001' \
  -H 'Referer: http://localhost:3001/' \
  -H 'Sec-Fetch-Dest: empty' \
  -H 'Sec-Fetch-Mode: cors' \
  -H 'Sec-Fetch-Site: same-site' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"你好呀"},{"role":"assistant","content":""}],"options":{}}'

Jackxwb avatar Mar 07 '24 15:03 Jackxwb

@Jackxwb Please ensure your Ollama version is greater than v0.1.24 (https://docs.nextchat.dev/models/ollama) and that the endpoint you configured is http://localhost:11434/; it seems you added an extra /api/ path. A corrected request is sketched below.
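For comparison, the earlier failing request without the extra /api/ segment would look roughly like this (same model and payload as above, non-essential headers dropped):

curl http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  --data-raw '{"messages":[{"role":"user","content":"你好呀"}],"stream":true,"model":"llava:latest"}'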

fred-bf avatar Mar 07 '24 15:03 fred-bf

(quoting fred-bf's suggestion above)

Thank you for the reminder. After the modification, I copied the requests from the Chrome browser and they now work in third-party debugging tools. But there is still an error in the browser. I applied this configuration to the Windows program, but it still cannot be used. [screenshot]

[screenshot] In the Windows program, I cannot see the error log.

-------- 2024-03-08 16:32 (UTC+8) --------
I am using the Edge browser, and after adding --disable-web-security to the shortcut, Ollama can be accessed in the browser, but the exe program still reports an error. Additionally, I found that images can be sent in the exe program, but there is no button for sending images on the web side.
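For reference, the --disable-web-security workaround is typically applied to the browser shortcut like this (debugging only; recent Chromium builds also want a throwaway profile directory, and the path here is a placeholder):

msedge.exe --disable-web-security --user-data-dir="C:\tmp\edge-nosec"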

-------- 2024-03-08 21:47 (UTC+8) --------
After adding OLLAMA_ORIGINS=* to the system environment and restarting the Ollama service, I can now access Ollama in both Edge and the exe on my computer. On my Android phone, some browsers can access it, while others still cannot. [screenshots]

Jackxwb avatar Mar 07 '24 15:03 Jackxwb

Set OLLAMA_ORIGINS first. If it still doesn't work after that, it may be a request-header problem. When you use a custom endpoint, if you only set the endpoint address and no API key, then every request passes the access code in the Authorization header (which may itself be a security issue?). You can work around this temporarily by clearing the access code while using Ollama, or wait for this PR: https://github.com/ollama/ollama/pull/2506. A sketch of what such a request looks like follows.
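When no API key is set, the request NextChat sends looks roughly like this, with the access code in the Authorization header (a sketch; the token value is a placeholder and the exact format is NextChat-internal):

curl http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer <access-code>' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"hi"}]}'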

z2n avatar Mar 14 '24 02:03 z2n


Ollama still can't be used. I tried other chatbox projects and they work, so it shouldn't be an Ollama configuration problem.

aaa930811 avatar Mar 14 '24 03:03 aaa930811


[screenshots]

aaa930811 avatar Mar 14 '24 03:03 aaa930811


I ran into this too. With LobeChat, the same address settings work.

mintisan avatar Mar 24 '24 04:03 mintisan


Problem solved for NextChat. Just set these variables:

  • OLLAMA_HOST to 0.0.0.0
  • OLLAMA_ORIGINS to *

Then set the OpenAI endpoint to 127.0.0.1:11434 and the model name to the same name shown by ollama list. (A consolidated sketch follows.)
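Putting the thread's working recipe together (a Windows sketch; restart Ollama after setting the variables):

setx OLLAMA_HOST "0.0.0.0"
setx OLLAMA_ORIGINS "*"
rem Then in NextChat: endpoint http://127.0.0.1:11434, access code cleared,
rem model set to a name from `ollama list`.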

Alias4D avatar Mar 25 '24 20:03 Alias4D

OLLAMA_ORIGINS=* works for me.

mcthesw avatar Apr 04 '24 04:04 mcthesw

None of your methods worked

1101728133 avatar Apr 26 '24 07:04 1101728133

(quoting z2n's suggestion above about setting OLLAMA_ORIGINS and clearing the access code)

Clearing NextChat's access code works; the model name is the one output by the ollama list command. With the access code set, it doesn't work. Is this a bug?

daiaji avatar May 04 '24 02:05 daiaji


I tried monitoring with Postman and compared POST with OPTIONS. The Ollama server only responds to POST and rejected the OPTIONS requests:

[GIN] 2024/05/10-10:16:25 | 200 | 8.5950196s | 127.0.0.1 | POST "/v1/chat/completion"
[GIN] 2024/05/10-10:16:02 | 404 | 0s | 127.0.0.1 | Options "/v1/chat/completion"

Can ChatGPT-Next-Web be configured to change the default access mode to POST?
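The OPTIONS call is the browser's automatic CORS preflight rather than an access mode NextChat picks, so it can't simply be switched to POST. The asymmetry is easy to confirm by hand (a sketch; the model name is as reported earlier in the thread, and the preflight fails when OLLAMA_ORIGINS does not cover the requesting origin):

curl -i -X POST http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"hi"}]}'

curl -i -X OPTIONS http://localhost:11434/v1/chat/completions \
  -H 'Origin: http://localhost:3000' \
  -H 'Access-Control-Request-Method: POST'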

playertk avatar May 10 '24 02:05 playertk