The README says that with local Ollama no API key is needed, so why do I still get this error: ValueError: ⚠️API_KEY is missing
Looking forward to a reply, thanks.
I just tried it and hit the same error, but I entered `ollama` in the key field; despite the error message it actually works.
The error doesn't affect usage; go by whether it actually runs.
+1, same problem as the OP.
Win10 Pro, Docker Desktop configured with WSL2.
Following the tip in reply #2 I put `ollama` in the key field, but it still reports an Ollama connection error. The error log is as follows:
2025-03-22 14:34:19 📝 Summarizing and extracting terminology ...
2025-03-22 14:34:19 Unexpected error occurred: Connection error.
2025-03-22 14:34:19 Retrying...
2025-03-22 14:34:19 Unexpected error occurred: Connection error.
2025-03-22 14:34:19 Retrying...
2025-03-22 14:34:19 ⚠️ Transcription results already exist, skipping transcription step.
2025-03-22 14:34:19 File 'sentence_splitbynlp.txt' already exists. Skipping split_by_spacy.
2025-03-22 14:34:19 ⏳ Loading NLP Spacy model: <en_core_web_md> ...
2025-03-22 14:34:21 ✅ NLP Spacy model loaded successfully!
2025-03-22 14:34:23 ✅ All sentences have been successfully split!
2025-03-22 14:34:31 2025-03-22 06:34:31.513 Uncaught app exception
2025-03-22 14:34:31 Traceback (most recent call last):
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
2025-03-22 14:34:31 yield
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 250, in handle_request
2025-03-22 14:34:31 resp = self._pool.handle_request(req)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
2025-03-22 14:34:31 raise exc from None
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
2025-03-22 14:34:31 response = connection.handle_request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 101, in handle_request
2025-03-22 14:34:31 raise exc
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 78, in handle_request
2025-03-22 14:34:31 stream = self._connect(request)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 124, in _connect
2025-03-22 14:34:31 stream = self._network_backend.connect_tcp(**kwargs)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
2025-03-22 14:34:31 with map_exceptions(exc_map):
2025-03-22 14:34:31 File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
2025-03-22 14:34:31 self.gen.throw(typ, value, traceback)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2025-03-22 14:34:31 raise to_exc(exc) from exc
2025-03-22 14:34:31 httpcore.ConnectError: [Errno 111] Connection refused
2025-03-22 14:34:31
2025-03-22 14:34:31 The above exception was the direct cause of the following exception:
2025-03-22 14:34:31
2025-03-22 14:34:31 Traceback (most recent call last):
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 993, in _request
2025-03-22 14:34:31 response = self._client.send(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 914, in send
2025-03-22 14:34:31 response = self._send_handling_auth(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 942, in _send_handling_auth
2025-03-22 14:34:31 response = self._send_handling_redirects(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 979, in _send_handling_redirects
2025-03-22 14:34:31 response = self._send_single_request(request)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 1014, in _send_single_request
2025-03-22 14:34:31 response = transport.handle_request(request)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 249, in handle_request
2025-03-22 14:34:31 with map_httpcore_exceptions():
2025-03-22 14:34:31 File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
2025-03-22 14:34:31 self.gen.throw(typ, value, traceback)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
2025-03-22 14:34:31 raise mapped_exc(message) from exc
2025-03-22 14:34:31 httpx.ConnectError: [Errno 111] Connection refused
2025-03-22 14:34:31
2025-03-22 14:34:31 The above exception was the direct cause of the following exception:
2025-03-22 14:34:31
2025-03-22 14:34:31 Traceback (most recent call last):
2025-03-22 14:34:31 File "/app/core/ask_gpt.py", line 73, in ask_gpt
2025-03-22 14:34:31 response = client.chat.completions.create(**completion_args)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py", line 275, in wrapper
2025-03-22 14:34:31 return func(*args, **kwargs)
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py", line 829, in create
2025-03-22 14:34:31 return self._post(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1280, in post
2025-03-22 14:34:31 return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 957, in request
2025-03-22 14:34:31 return self._request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1017, in _request
2025-03-22 14:34:31 return self._retry_request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1095, in _retry_request
2025-03-22 14:34:31 return self._request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1017, in _request
2025-03-22 14:34:31 return self._retry_request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1095, in _retry_request
2025-03-22 14:34:31 return self._request(
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1027, in _request
2025-03-22 14:34:31 raise APIConnectionError(request=request) from err
2025-03-22 14:34:31 openai.APIConnectionError: Connection error.
2025-03-22 14:34:31
2025-03-22 14:34:31 During handling of the above exception, another exception occurred:
2025-03-22 14:34:31
2025-03-22 14:34:31 Traceback (most recent call last):
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
2025-03-22 14:34:31 result = func()
2025-03-22 14:34:31 File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
2025-03-22 14:34:31 exec(code, module.__dict__)
2025-03-22 14:34:31 File "/app/st.py", line 124, in
Tested with a third-party client: the Ollama connection itself works fine, as shown below:
(base) PS C:\Windows\system32> netstat -ano | findstr :11434
  TCP    127.0.0.1:11434    0.0.0.0:0    LISTENING    49296
(base) PS C:\Windows\system32>
[Solved on my own]: After investigation, the errors above were caused by network isolation between Ollama (running on Windows) and the Docker container inside WSL2.
Solution steps:
- Verify the Ollama service status. Check that the service is running: make sure Ollama is running in the background on Windows and listening on port 11434.
netstat -ano | findstr :11434 # confirm the port is listening
Environment variable: set OLLAMA_HOST=0.0.0.0 (if Ollama runs inside Docker, map port 11434:11434).
- Fix the service bind address. Edit Ollama's config file or launch parameters to bind explicitly to 0.0.0.0; note that the netstat output above shows it listening on 127.0.0.1 only, which WSL2 cannot reach. If you deploy Ollama in Docker, add the parameters:
-p 11434:11434 # port mapping
--env OLLAMA_HOST=0.0.0.0
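If you run Ollama in Docker rather than natively on Windows, the full invocation might look like the sketch below (it assumes the official `ollama/ollama` image; the volume name is illustrative):

```shell
# Expose the API port to the host, bind Ollama to all interfaces inside the
# container, and persist downloaded models across restarts via a named volume.
docker run -d --name ollama \
  -p 11434:11434 \
  --env OLLAMA_HOST=0.0.0.0 \
  -v ollama_models:/root/.ollama \
  ollama/ollama
```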
- Configure a Windows Firewall rule to allow the Ollama port:
Open "Windows Defender Firewall" → "Advanced settings" → "Inbound Rules". Create a new rule: choose "Port" → TCP → specific port 11434 → Allow the connection → check all network profiles. Name the rule Allow WSL2 to Ollama. Optionally, allow only the WSL2 subnet:
# Get the WSL2 subnet (e.g. 172.28.0.0/20)
wsl -- ip addr show eth0 | grep inet
# Create a firewall rule allowing that subnet
New-NetFirewallRule -DisplayName "Allow WSL2" -Direction Inbound -LocalPort 11434 -Action Allow -Protocol TCP -RemoteAddress 172.28.0.0/20
- Fix the address used from WSL2. Use the host gateway address: inside WSL2, get the Windows host's gateway address with:
cat /etc/resolv.conf | grep nameserver | awk '{print $2}' # prints e.g. 172.28.0.1
Update the VideoLingo config: change Ollama's API address from localhost:11434 to the host gateway IP at port 11434 (e.g. http://172.28.0.1:11434).
- Verify connectivity from inside WSL2:
curl http://172.28.0.1:11434 # should return "Ollama is running"
telnet 172.28.0.1 11434 # check port reachability
- Handle the dynamic host IP (optional). Automation script: create /etc/profile.d/wsl_hosts.sh to keep /etc/hosts up to date (must run as root; the sed line removes the stale entry first so repeated logins don't accumulate duplicates):
#!/bin/bash
WIN_IP=$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}')
sed -i '/host\.docker\.internal/d' /etc/hosts
echo "$WIN_IP host.docker.internal" >> /etc/hosts
- Access from inside a Docker container: use the host.docker.internal hostname to reach the Windows host.
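Note that on Linux Docker engines `host.docker.internal` is not defined automatically; when starting the VideoLingo container you may need to map it to the host gateway yourself. A sketch (the image name is illustrative, and port 8501 assumes the default Streamlit port):

```shell
# `host-gateway` is a special value that Docker resolves to the host's gateway
# IP, which makes host.docker.internal resolvable inside the container.
docker run -d \
  --add-host host.docker.internal:host-gateway \
  -p 8501:8501 \
  videolingo
```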
Adjusting VideoLingo's Ollama connection settings:
# VideoLingo config file (e.g. config.yml)
ollama:
  api_base: "http://host.docker.internal:11434" # reach the host from inside the Docker container
  # or use the gateway IP directly
  # api_base: "http://172.28.0.1:11434"
WSL2 network optimization (alternative):
# %USERPROFILE%\.wslconfig
[wsl2]
networkingMode=mirrored # enable mirrored networking mode (requires Windows 11 22H2+; not available on Win10)
localhostForwarding=true
firewall=false # disables the Hyper-V firewall (weigh the security trade-off)