ollama
Search before asking
- [x] I had searched in the issues and found no similar issues.
Operating system information
Linux
What happened
KAG does not run with Ollama under WSL2 on Windows 10.
How to reproduce
?
Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
Global Configuration
Graph storage configuration:
What do I write?
Model Configuration
add model
Is it possible to provide the steps, from A to Z, to run KAG with an Ollama model?
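For reference, this is roughly the config shape I am trying to reach — a minimal sketch only, assuming a local Ollama on the default port 11434 and a `type:` selector key (mirroring how the repo's own test code builds clients with `LLMClient.from_config({"type": "ollama", ...})`). The exact section and key names vary between KAG versions, so they should be checked against the shipped kag/examples/*/kag_config.yaml:

```shell
# Minimal sketch of an Ollama-backed config. Assumptions: local Ollama on the
# default port; "type" as the component selector key, as in the repo's tests.
# Verify section/key names against kag/examples/*/kag_config.yaml.
cat > kag_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
llm:
  type: ollama
  model: qwen2.5:3b
  base_url: http://localhost:11434/api/generate
vectorizer:
  type: openai
  model: bge-m3:latest
  base_url: http://127.0.0.1:11434/v1
  api_key: EMPTY
  vector_dimensions: 1024
log:
  level: INFO
EOF
```

Is this shape correct for the current release?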
wsl2 win10
m@DESKTOP-MUB16F8:~$ source myenv/bin/activate
(myenv) m@DESKTOP-MUB16F8:~$ cd KAG
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path kag_config.cfg
2025-06-06 21:31:24 - INFO - root - Done init config from local file
Traceback (most recent call last):
File "/home/m/myenv/bin/knext", line 8, in <module>
2025-06-06 21:33:10 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:33:10 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ ollama list
NAME              ID            SIZE    MODIFIED
bge-m3:latest     790764642607  1.2 GB  6 hours ago
qwen2.5:3b        357c53fb659c  1.9 GB  7 hours ago
orca-mini:latest  2dbd9f439647  2.0 GB  12 days ago
llama3.2:1b       baf6a787fdff  1.3 GB  12 days ago
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path kag/examples/example_config.yaml
2025-06-06 21:37:42 - INFO - root - Done init config from local file
2025-06-06 21:37:48 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:37:48 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path kag/examples/example_config.yaml
2025-06-06 21:38:38 - INFO - root - Done init config from local file
2025-06-06 21:38:43 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:38:43 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$
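The recurring failure says a key literally called `name` was looked up and not found at the level the initializer expects. A small self-contained sketch (hypothetical — `lookup` only mirrors the error message; the real lookup happens inside LLMClient) of the failing vs. working dict shapes, in the same plain-dict style as the repo's test code further down:

```shell
python3 << 'EOF'
# Hypothetical reproduction of the error shape, not KAG's actual code path:
# LLMClient apparently requires a "name" entry in the llm section.
failing = {"client_type": "ollama", "model": "qwen2.5:3b"}
working = {"type": "ollama", "model": "qwen2.5:3b", "name": "qwen2.5:3b"}

def lookup(conf, key):
    # emulate a config lookup that raises when the key is missing
    if key not in conf:
        raise KeyError(f"No configuration setting found for key {key}")
    return conf[key]

print(lookup(working, "name"))  # prints: qwen2.5:3b
try:
    lookup(failing, "name")
except KeyError as e:
    print(e)  # same message as in the log above
EOF
```

So the question is which section of the YAML actually reaches LLMClient, and under which nesting.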
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path kag_config.cfg
2025-06-06 21:41:12 - INFO - root - Done init config from local file
Traceback (most recent call last):
File "/home/m/myenv/bin/knext", line 8, in <module>
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > my_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
vectorizer:
  vectorizer: kag.common.vectorizer.OpenAIVectorizer
  model: bge-m3:latest
  api_key: EMPTY
  base_url: http://127.0.0.1:11434/v1
  vector_dimensions: 1024
llm:
  client_type: ollama
  base_url: http://localhost:11434/api/generate
  model: qwen2.5:3b
  name: qwen2.5:3b
prompt:
  biz_scene: default
  language: en
log:
  level: INFO
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path my_config.yaml
2025-06-06 21:49:12 - INFO - root - Done init config from local file
2025-06-06 21:49:18 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:49:18 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ python3 -c "import yaml; print('YAML valid!' if yaml.safe_load(open('my_config.yaml')) else 'Invalid YAML')"
-bash: !': event not found
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > simple_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
llm:
  name: qwen2.5:3b
  client_type: ollama
  model: qwen2.5:3b
  base_url: http://localhost:11434/api/generate
vectorizer:
  model: bge-m3:latest
  base_url: http://127.0.0.1:11434/v1
  api_key: EMPTY
log:
  level: INFO
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path simple_config.yaml
2025-06-06 21:50:52 - INFO - root - Done init config from local file
2025-06-06 21:50:57 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:50:57 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cd \wsl.localhost\Ubuntu-24.04\home\m\KAG\kag\examples\baike
-bash: cd: \wsl.localhostUbuntu-24.04homemKAGkagexamplesbaike: No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cd kag\examples\baike
-bash: cd: kagexamplesbaike: No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ kag\examples\baike
kagexamplesbaike: command not found
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cd kag/examples/baike
(myenv) m@DESKTOP-MUB16F8:~/KAG/kag/examples/baike$ knext project create --config_path kag_config.cfg
2025-06-06 21:53:17 - INFO - root - found config file: /home/m/KAG/kag/examples/baike/kag_config.yaml
2025-06-06 21:53:17 - INFO - root - Done init config from local file
Traceback (most recent call last):
  File "/home/m/myenv/bin/knext", line 8, in <module>
    sys.exit(_main())
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/m/KAG/knext/command/exception.py", line 21, in invoke
    return super().invoke(ctx)
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/m/myenv/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/m/KAG/knext/command/sub_command/project.py", line 140, in create_project
    config = yaml.load(Path(config_path).read_text() or "{}")
  File "/home/m/myenv/lib/python3.12/site-packages/ruamel/yaml/main.py", line 453, in load
    return constructor.get_single_data()
  File "/home/m/myenv/lib/python3.12/site-packages/ruamel/yaml/constructor.py", line 117, in get_single_data
    node = self.composer.get_single_node()
  File "/home/m/myenv/lib/python3.12/site-packages/ruamel/yaml/composer.py", line 75, in get_single_node
    if not self.parser.check_event(StreamEndEvent):
  File "/home/m/myenv/lib/python3.12/site-packages/ruamel/yaml/parser.py", line 141, in check_event
    self.current_event = self.state()
  File "/home/m/myenv/lib/python3.12/site-packages/ruamel/yaml/parser.py", line 211, in parse_document_start
    raise ParserError(
ruamel.yaml.parser.ParserError: expected '<document start>', but found '<scalar>'
  in "<unicode string>", line 3, column 1:
    namespace = KagDemo
    ^ (line: 3)
(myenv) m@DESKTOP-MUB16F8:~/KAG/kag/examples/baike$ knext project restore --proj_path KagDemo --host_addr http://127.0.0.1:8887
2025-06-06 21:53:50 - INFO - root - found config file: /home/m/KAG/kag/examples/baike/kag_config.yaml
2025-06-06 21:53:50 - INFO - root - Done init config from local file
ERROR: No such directory: KagDemo
(myenv) m@DESKTOP-MUB16F8:~/KAG/kag/examples/baike$ cd ..
(myenv) m@DESKTOP-MUB16F8:~/KAG/kag/examples$ cd ..
(myenv) m@DESKTOP-MUB16F8:~/KAG/kag$ cd ..
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat kag/examples/example_config.yaml
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
vectorizer:
  vectorizer: kag.common.vectorizer.OpenAIVectorizer
  model: bge-m3:latest
  api_key: EMPTY
  base_url: http://127.0.0.1:11434/v1
  vector_dimensions: 1024
llm:
  client_type: ollama
  base_url: http://localhost:11434/api/generate
  model: qwen2.5:3b
  name: qwen2.5:3b # Add this missing 'name' field
prompt:
  biz_scene: default
  language: en
log:
  level: INFO
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > test_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
llm:
  client_type: "ollama"
  name: "qwen2.5:3b"
  model: "qwen2.5:3b"
  base_url: "http://localhost:11434/api/generate"
vectorizer:
  model: "bge-m3:latest"
  base_url: "http://127.0.0.1:11434/v1"
  api_key: "EMPTY"
log:
  level: "INFO"
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path test_config.yaml
2025-06-06 21:54:51 - INFO - root - Done init config from local file
2025-06-06 21:54:56 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:54:56 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > clean_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
vectorizer:
  vectorizer: kag.common.vectorizer.OpenAIVectorizer
  model: bge-m3:latest
  api_key: EMPTY
  base_url: http://127.0.0.1:11434/v1
  vector_dimensions: 1024
llm:
  client_type: ollama
  base_url: http://localhost:11434/api/generate
  model: qwen2.5:3b
  name: qwen2.5:3b
prompt:
  biz_scene: default
  language: en
log:
  level: INFO
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path clean_config.yaml
2025-06-06 21:55:17 - INFO - root - Done init config from local file
2025-06-06 21:55:19 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:55:19 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ # Save a backup copy
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cp kag/examples/example_config.yaml kag/examples/example_config.yaml.backup
(myenv) m@DESKTOP-MUB16F8:~/KAG$ # Create a clean new file
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > kag/examples/example_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
vectorizer:
  vectorizer: kag.common.vectorizer.OpenAIVectorizer
  model: bge-m3:latest
  api_key: EMPTY
  base_url: http://127.0.0.1:11434/v1
  vector_dimensions: 1024
llm:
  client_type: ollama
  base_url: http://localhost:11434/api/generate
  model: qwen2.5:3b
  name: qwen2.5:3b
prompt:
  biz_scene: default
  language: en
log:
  level: INFO
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path kag/examples/example_config.yaml
2025-06-06 21:55:31 - INFO - root - Done init config from local file
2025-06-06 21:55:33 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
2025-06-06 21:55:33 - INFO - root - Failed to initialize class <class 'kag.interface.common.llm_client.LLMClient'>, info: 'No configuration setting found for key name'
Error: 'No configuration setting found for key name'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ # Find the files that contain "name" in the LLMClient code
(myenv) m@DESKTOP-MUB16F8:~/KAG$ find . -name "*.py" -exec grep -l "No configuration setting found for key name" {} \;
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cat > openai_config.yaml << 'EOF'
project:
  namespace: KagDemo
  host_addr: http://localhost:8887
llm:
  name: qwen2.5:3b
  client_type: openai
  model: qwen2.5:3b
  base_url: http://localhost:11434/v1
  api_key: EMPTY
vectorizer:
  vectorizer: kag.common.vectorizer.OpenAIVectorizer
  model: bge-m3:latest
  api_key: EMPTY
  base_url: http://127.0.0.1:11434/v1
  vector_dimensions: 1024
prompt:
  biz_scene: default
  language: en
log:
  level: INFO
EOF
(myenv) m@DESKTOP-MUB16F8:~/KAG$ knext project create --config_path debug_config.yaml
2025-06-06 21:57:04 - INFO - root - Done init config from local file
Traceback (most recent call last):
  File "/home/m/myenv/bin/knext", line 8, in <module>
    sys.exit(_main())
  ...
  File "/home/m/KAG/knext/command/sub_command/project.py", line 140, in create_project
    config = yaml.load(Path(config_path).read_text() or "{}")
  File "/usr/lib/python3.12/pathlib.py", line 1029, in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
  File "/usr/lib/python3.12/pathlib.py", line 1015, in open
    return io.open(self, mode, buffering, encoding, errors, newline)
FileNotFoundError: [Errno 2] No such file or directory: 'debug_config.yaml'
(myenv) m@DESKTOP-MUB16F8:~/KAG$ find . -name "*.yaml" -o -name "*.yml" | xargs grep -l "ollama|llm" | head -5
./.github/ISSUE_TEMPLATE/bug-report.yml
./kag/examples/domain_kg/kag_config.yaml
./kag/examples/riskmining/kag_config.yaml
./kag/examples/supplychain/kag_config.yaml
./kag/examples/baidu_map_mcp/kag_config.yaml
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cd \wsl.localhost\Ubuntu-24.04\home\m\KAG\tests\unit\common\llm
-bash: cd: \wsl.localhostUbuntu-24.04homemKAGtestsunitcommonllm: No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ cd "tests\unit\common\llm"
-bash: cd: tests\unit\common\llm: No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ python3 ~/test_llm.py
python3: can't open file '/home/m/test_llm.py': [Errno 2] No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ python3 ./test_llm.py
python3: can't open file '/home/m/KAG/./test_llm.py': [Errno 2] No such file or directory
(myenv) m@DESKTOP-MUB16F8:~/KAG$ ollama list
NAME              ID            SIZE    MODIFIED
bge-m3:latest     790764642607  1.2 GB  6 hours ago
qwen2.5:3b        357c53fb659c  1.9 GB  7 hours ago
orca-mini:latest  2dbd9f439647  2.0 GB  12 days ago
llama3.2:1b       baf6a787fdff  1.3 GB  12 days ago
(myenv) m@DESKTOP-MUB16F8:~/KAG$ python3
Python 3.12.3 (main, Feb 4 2025, 14:48:35) [GCC 13.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
# -*- coding: utf-8 -*-
import pytest
import asyncio
from kag.interface import LLMClient


def get_openai_config():
    return {
        "type": "openai",
        "base_url": "https://api.siliconflow.cn/v1/",
        "api_key": "sk-",
        "model": "Qwen/Qwen2.5-7B-Instruct",
        "stream": False,
    }


def get_ollama_config():
    return {
        "type": "ollama",
        "model": "qwen2.5:3b",
        "stream": False,
    }


@pytest.mark.skip(reason="Missing API key")
def test_llm_client():
    print("stream = False")
    for conf in [get_openai_config(), get_ollama_config()]:
        client = LLMClient.from_config(conf)
        rsp = client("Who are you?")
        print(f"rsp = {rsp}")
    print("stream = True")
    for conf in [get_openai_config(), get_ollama_config()]:
        conf["stream"] = True
        client = LLMClient.from_config(conf)
        rsp = client("Who are you?")
        print(f"rsp = {rsp}")


async def call_llm_client_async():
    print("stream = False")
    tasks = []
    for conf in [get_openai_config(), get_ollama_config()]:
        client = LLMClient.from_config(conf)
        task = asyncio.create_task(client.acall("Who are you?"))
        tasks.append(task)
    result = await asyncio.gather(*tasks)
    for rsp in result:
        print(f"rsp = {rsp}")
    print("stream = True")
    tasks = []
    for conf in [get_openai_config(), get_ollama_config()]:
        conf["stream"] = True
        client = LLMClient.from_config(conf)
        task = asyncio.create_task(client.acall("Who are you?"))
        tasks.append(task)
    result = await asyncio.gather(*tasks)
    for rsp in result:
        print(f"rsp = {rsp}")
    return result


@pytest.mark.skip(reason="Missing API key")
def test_llm_client_async():
    res = asyncio.run(call_llm_client_async())
    return res


def test_mock_llm_client():
    conf = {"type": "mock"}
    client = LLMClient.from_config(conf)
    rsp = client.call_with_json_parse("who are you?")
    assert rsp == "I am an intelligent assistant"


def test_llm_client_with_func_call():
    for conf in [get_openai_config(), get_ollama_config()]:
        client = LLMClient.from_config(conf)
        subtract_two_numbers_tool = {
            "type": "function",
            "function": {
                "name": "subtract_two_numbers",
                "description": "Subtract two numbers",
                "parameters": {
                    "type": "object",
                    "required": ["a", "b"],
                    "properties": {
                        "a": {"type": "integer", "description": "The first number"},
                        "b": {"type": "integer", "description": "The second number"},
                    },
                },
            },
        }
        tool_calls = client(
            "What is three subtract one?", tools=[subtract_two_numbers_tool]
        )
        print(f"tool_calls = {tool_calls}")


async def call_llm_client_with_func_call_async():
    for conf in [get_openai_config(), get_ollama_config()]:
        client = LLMClient.from_config(conf)
        subtract_two_numbers_tool = {
            "type": "function",
            "function": {
                "name": "subtract_two_numbers",
                "description": "Subtract two numbers",
                "parameters": {
                    "type": "object",
                    "required": ["a", "b"],
                    "properties": {
                        "a": {"type": "integer", "description": "The first number"},
                        "b": {"type": "integer", "description": "The second number"},
                    },
                },
            },
        }
        tool_calls = await client.acall(
            "What is three subtract one? ",
            tools=[subtract_two_numbers_tool],
        )
        print(f"tool_calls = {tool_calls}")


def test_llm_client_with_func_call_async():
    res = asyncio.run(call_llm_client_with_func_call_async())
    return res
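Rather than pasting this file into the bare interactive interpreter, it should be saved to disk and run as a file — a blank line inside a pasted definition ends the block in the REPL and produces IndentationError. A minimal demonstration (file name and path are hypothetical):

```shell
# A blank line inside a pasted definition ends the block in the bare REPL;
# running the same code from a file works fine.
cat > /tmp/demo_llm_test.py << 'EOF'
def get_ollama_config():
    return {"type": "ollama", "model": "qwen2.5:3b", "stream": False}

print(get_ollama_config()["type"])
EOF
python3 /tmp/demo_llm_test.py  # prints: ollama
```

For the real suite, something like `python3 -m pytest tests/unit/common/llm` from the repo root (pytest assumed installed) should avoid the problem entirely.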
>>> import asyncio
>>> from kag.interface import LLMClient
2025-06-06 22:01:28 - INFO - root - Done init config from local file
>>> def get_openai_config():
...     return {
...         "type": "openai",
...         "base_url": "https://api.siliconflow.cn/v1/",
...         "api_key": "sk-",
...         "model": "Qwen/Qwen2.5-7B-Instruct",
...         "stream": False,
...     }
...
>>> def get_ollama_config():
...     return {
...         "type": "ollama",
...         "model": "qwen2.5:3b",
...         "stream": False,
...     }
...
>>> @pytest.mark.skip(reason="Missing API key")
... def test_llm_client():
  File "<stdin>", line 2
    ^
IndentationError: expected an indented block after function definition on line 1

Pasting the rest of the test file into the interactive interpreter kept failing the same way: each function body raised "IndentationError: expected an indented block after function definition on line 1", and the statements that followed raised "IndentationError: unexpected indent", because the REPL closes a block at the first blank line.
How do I run Ollama with KAG, step by step?
None of the video explanations display for me. If possible, please share a video or the exact operating commands; the current interface also looks different from the tutorials. Is there a difference in the latest version?
Please share working Ollama settings.
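For anyone retracing the session above, three of the failures were plain shell pitfalls rather than KAG problems. Working forms (the demo directory is hypothetical):

```shell
# 1. In an interactive bash, '!' inside double quotes triggers history
#    expansion ("-bash: !': event not found"); single quotes are literal.
python3 -c 'print("YAML valid!")'

# 2. Backslashes are escape characters in bash, so Windows-style paths
#    collapse ("cd kag\examples\baike" becomes "cd kagexamplesbaike").
#    Inside WSL, use forward slashes.
mkdir -p /tmp/kag_demo/kag/examples/baike
cd /tmp/kag_demo/kag/examples/baike && pwd

# 3. find -name takes a glob pattern: ".yaml" matches only a file literally
#    named ".yaml". Quote "*.yaml" and group -o alternatives with \( \).
cd /tmp/kag_demo && touch a.yaml b.yml c.txt
find . \( -name "*.yaml" -o -name "*.yml" \) | sort
```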
Caused by: pemja.core.PythonException: <class 'RuntimeError'>: invalid vectorizer config: ollama not in acceptable choices for type: ['bge_vectorize_model', 'bge', 'bge_m3', 'openai', 'azure_openai', 'mock']. You should make sure the class is correctly registerd.
	at /home/admin/miniconda3/lib/python3.10/site-packages/kag/bridge/spg_server_bridge.run_vectorizer_config_check(spg_server_bridge.py:123)
	at /home/admin/miniconda3/lib/python3.10/site-packages/kag/common/vectorize_model/vectorize_model_config_checker.check(vectorize_model_config_checker.py:47)
	at pemja.core.PythonInterpreter.invokeMethodOneArgString(Native Method)
	at pemja.core.PythonInterpreter.invokeMethodOneArg(PythonInterpreter.java:212)
	at pemja.core.PythonInterpreter.invokeMethod(PythonInterpreter.java:116)
	at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$null$0(PemjaUtils.java:63)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 common frames omitted
2025-06-08 10:05:55,728 [] [] [http-nio-8887-exec-6] ERROR c.a.o.s.a.h.s.HttpBizTemplate - execute http biz callback unknown error
java.lang.RuntimeException: PemjaUtils.invoke Exception uniqueKey:SPGServerBridge_659a452bdce05edf9a57582559ed1fc9
	at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$invoke$1(PemjaUtils.java:88)
	at com.github.rholder.retry.AttemptTimeLimiters$NoAttemptTimeLimit.call(AttemptTimeLimiters.java:78)
	at com.github.rholder.retry.Retryer.call(Retryer.java:160)
	at com.antgroup.openspg.common.util.pemja.PemjaUtils.invoke(PemjaUtils.java:55)
	at com.antgroup.openspgapp.core.reasoner.service.utils.Utils.checkVectorizer(Utils.java:39)
	at com.antgroup.openspgapp.api.http.server.config.ConfigController$3.check(ConfigController.java:151)
	at com.antgroup.openspg.server.api.http.server.HttpBizTemplate.execute2(HttpBizTemplate.java:77)
	at com.antgroup.openspgapp.api.http.server.config.ConfigController.update(ConfigController.java:120)
	... (Spring MVC / Tomcat filter-chain frames)
Caused by: java.util.concurrent.ExecutionException: pemja.core.PythonException: <class 'RuntimeError'>: invalid vectorizer config: ollama not in acceptable choices for type: ['bge_vectorize_model', 'bge', 'bge_m3', 'openai', 'azure_openai', 'mock']. You should make sure the class is correctly registerd.
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
	at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$invoke$1(PemjaUtils.java:65)
	... 66 common frames omitted
2025-06-08 10:05:55,734 [] [] [http-nio-8887-exec-6] INFO c.a.o.a.h.s.f.AclFilter - [PUT:http://127.0.0.1:8887/v1/configs/2] cost:19
2025-06-08 10:06:14,130 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:generateInstanceScheduleHandler
2025-06-08 10:06:14,137 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success generateInstanceScheduleHandler 172.18.0.5
2025-06-08 10:06:14,138 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result false generateInstanceScheduleHandler
2025-06-08 10:06:14,140 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
2025-06-08 10:06:14,193 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:executeInstanceScheduleHandler
2025-06-08 10:06:14,202 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success executeInstanceScheduleHandler 172.18.0.5
2025-06-08 10:06:14,202 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result false executeInstanceScheduleHandler
2025-06-08 10:06:14,205 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
2025-06-08 10:07:12,424 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:generateInstanceScheduleHandler
2025-06-08 10:07:12,425 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:executeInstanceScheduleHandler
2025-06-08 10:07:12,434 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success generateInstanceScheduleHandler 172.18.0.5
2025-06-08 10:07:12,435 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success executeInstanceScheduleHandler 172.18.0.5
2025-06-08 10:07:12,439 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result true generateInstanceScheduleHandler
2025-06-08 10:07:12,440 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.e.i.SchedulerExecuteServiceImpl - getAllPeriodJob successful size:0
2025-06-08 10:07:12,441 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.i.GenerateInstanceScheduleHandler - run GenerateInstanceScheduleHandler end time:2
2025-06-08 10:07:12,442 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result true executeInstanceScheduleHandler
2025-06-08 10:07:12,443 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.e.i.SchedulerExecuteServiceImpl - getAllNotFinishInstance successful size:0
2025-06-08 10:07:12,444 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.i.ExecuteInstanceScheduleHandler - run ExecuteInstanceScheduleHandler end time:2
2025-06-08 10:07:12,450 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
2025-06-08 10:07:12,462 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
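The root cause in the trace above is a type-registry check: the server-side vectorizer config checker only accepts the registered type names listed in the error message, and `ollama` is not one of them. A minimal Python sketch of that check (the helper name and structure here are hypothetical; the accepted type names are copied verbatim from the error):

```python
# Hypothetical sketch of the server-side vectorizer type check; the accepted
# names are taken verbatim from the RuntimeError in the log above.
ACCEPTED_VECTORIZER_TYPES = [
    "bge_vectorize_model", "bge", "bge_m3", "openai", "azure_openai", "mock",
]

def check_vectorizer_type(config: dict) -> None:
    """Raise RuntimeError if config['type'] is not a registered vectorizer type."""
    vec_type = config.get("type")
    if vec_type not in ACCEPTED_VECTORIZER_TYPES:
        raise RuntimeError(
            f"invalid vectorizer config: {vec_type} not in acceptable choices "
            f"for type: {ACCEPTED_VECTORIZER_TYPES}."
        )

# 'ollama' is rejected, but Ollama's OpenAI-compatible embeddings endpoint
# can be reached via type 'openai', which IS registered:
check_vectorizer_type({"type": "openai", "model": "bge-m3:latest"})  # passes
```

In other words, the fix is not to register a new class but to declare the embedding model with `type: openai` and point its `base_url` at Ollama's OpenAI-compatible endpoint.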
(kag-demo) C:\Users\TARGET STORE\Desktop\6\KAG\kag\examples\csqa>docker ps
CONTAINER ID   IMAGE                                                              COMMAND                  CREATED          STATUS          PORTS                                              NAMES
3fa3109016b3   spg-registry.us-west-1.cr.aliyuncs.com/spg/openspg-server:latest   "java -jar arks-sofa…"   59 minutes ago   Up 59 minutes   0.0.0.0:8887->8887/tcp                             release-openspg-server
92764c91e6d3   spg-registry.us-west-1.cr.aliyuncs.com/spg/openspg-mysql:latest    "docker-entrypoint.s…"   59 minutes ago   Up 59 minutes   0.0.0.0:3306->3306/tcp                             release-openspg-mysql
16fe97c62c13   spg-registry.us-west-1.cr.aliyuncs.com/spg/openspg-minio:latest    "/usr/bin/docker-ent…"   59 minutes ago   Up 59 minutes   0.0.0.0:9000-9001->9000-9001/tcp                   release-openspg-minio
0e80e354e690   spg-registry.us-west-1.cr.aliyuncs.com/spg/openspg-neo4j:latest    "tini -g -- /startup…"   59 minutes ago   Up 59 minutes   0.0.0.0:7474->7474/tcp, 7473/tcp, 0.0.0.0:7687->7687/tcp   release-openspg-neo4j
(kag-demo) C:\Users\TARGET STORE\Desktop\6\KAG\kag\examples\csqa>
https://dev.to/gaodalie_ai/kag-graph-multimodal-rag-llm-agents-powerful-ai-reasoning-57ko
KAG\kag\examples\csqa\kag_config.yaml
#------------project configuration start----------------#
openie_llm: &openie_llm
  base_url: http://localhost:11434/
  model: llama3.2:latest
  type: openai
[llm]
type = ollama
base_url = http://host.docker.internal:11434/v1
model = llama3.2:latest
[vectorizer]
vectorizer = kag.common.vectorizer.OpenAIVectorizer
model = bge-m3:latest
api_key = EMPTY
base_url = http://host.docker.internal:11434/v1
vector_dimensions = 1024
log:
  level: INFO
project:
  biz_scene: default
  host_addr: http://127.0.0.1:8887
  id: '8'
  language: en
  namespace: CsQa
#------------project configuration end----------------#
#------------kag-builder configuration start----------------#
kag_builder_pipeline:
  chain:
    type: unstructured_builder_chain # kag.builder.default_chain.DefaultUnstructuredBuilderChain
    extractor:
      type: schema_free_extractor # kag.builder.component.extractor.schema_free_extractor.SchemaFreeExtractor
      llm: *openie_llm
      ner_prompt:
        type: default_ner # kag.builder.prompt.default.ner.OpenIENERPrompt
      std_prompt:
        type: default_std # kag.builder.prompt.default.std.OpenIEEntitystandardizationdPrompt
      triple_prompt:
        type: default_triple # kag.builder.prompt.default.triple.OpenIETriplePrompt
    reader:
      type: md_reader # kag.builder.component.reader.markdown_reader.MarkDownReader
    post_processor:
      type: kag_post_processor # kag.builder.component.postprocessor.kag_postprocessor.KAGPostProcessor
    splitter:
      type: length_splitter # kag.builder.component.splitter.length_splitter.LengthSplitter
      split_length: 4950
      window_length: 100
    vectorizer:
      type: batch_vectorizer # kag.builder.component.vectorizer.batch_vectorizer.BatchVectorizer
      vectorize_model: *vectorize_model
    writer:
      type: kg_writer # kag.builder.component.writer.kg_writer.KGWriter
  num_threads_per_chain: 50
  num_chains: 16
  scanner:
    type: dir_file_scanner # kag.builder.component.scanner.directory_scanner.DirectoryScanner
    file_suffix: md
#------------kag-builder configuration end----------------#
#------------kag-solver configuration start----------------#
search_api: &search_api
  type: openspg_search_api # kag.solver.tools.search_api.impl.openspg_search_api.OpenSPGSearchAPI
graph_api: &graph_api
  type: openspg_graph_api # kag.solver.tools.graph_api.impl.openspg_graph_api.OpenSPGGraphApi
chain_vectorizer:
  type: batch
  vectorize_model: *vectorize_model
kg_cs:
  type: kg_cs_open_spg
  path_select:
    type: exact_one_hop_select
  entity_linking:
    type: entity_linking
    recognition_threshold: 0.9
    exclude_types:
      - Chunk
kg_fr:
  type: kg_fr_open_spg
  top_k: 20
  path_select:
    type: fuzzy_one_hop_select
    llm_client: *chat_llm
  ppr_chunk_retriever_tool:
    type: ppr_chunk_retriever
    llm_client: *openie_llm
  entity_linking:
    type: entity_linking
    recognition_threshold: 0.8
    exclude_types:
      - Chunk
rc:
  type: rc_open_spg
  vector_chunk_retriever:
    type: vector_chunk_retriever
    vectorize_model: *vectorize_model
  vectorize_model: *vectorize_model
  top_k: 20
kag_merger:
  type: kg_merger
  top_k: 20
  llm_module: *chat_llm
  summary_prompt:
    type: default_thought_then_answer
  vectorize_model: *vectorize_model
kag_hybrid_executor: &kag_hybrid_executor_conf
  type: kag_hybrid_executor
  lf_rewriter:
    type: kag_spo_lf
    llm_client: *chat_llm
    lf_trans_prompt:
      type: default_spo_retriever_decompose
    vectorize_model: *vectorize_model
  flow: |
    kg_cs->kg_fr->kag_merger;rc->kag_merger
kag_output_executor: &kag_output_executor_conf
  type: kag_output_executor
kag_deduce_executor: &kag_deduce_executor_conf
  type: kag_deduce_executor
py_code_based_math_executor: &py_code_based_math_executor_conf
  type: py_code_based_math_executor
  llm: *chat_llm
solver_pipeline:
  type: kag_static_pipeline
  planner:
    type: lf_kag_static_planner
    llm: *chat_llm
    plan_prompt:
      type: default_lf_static_planning
    rewrite_prompt:
      type: default_rewrite_sub_task_query
  executors:
    - *kag_hybrid_executor_conf
    - *py_code_based_math_executor_conf
    - *kag_deduce_executor_conf
    - *kag_output_executor_conf
  generator:
    type: llm_generator
    llm_client: *chat_llm
    generated_prompt:
      type: default_refer_generator_prompt
      enable_ref: true
#------------kag-solver configuration end----------------#
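Given the two checker errors in the server log (the `ollama` type rejected for the vectorizer, and the native ollama LLM client failing against a `/v1` URL), here is a config sketch that should satisfy both checkers. This is an assumption-laden sketch, not a verified config: the `*vectorize_model` and `*chat_llm` anchors referenced throughout the pasted YAML never appear in the paste, so they are reconstructed from the error messages and the patterns in the KAG example configs.

```yaml
# Hypothetical sketch: define the anchors the pasted config references.
vectorize_model: &vectorize_model
  type: openai                                    # 'ollama' is not a registered vectorizer type;
  base_url: http://host.docker.internal:11434/v1  # route through Ollama's OpenAI-compatible endpoint
  api_key: EMPTY
  model: bge-m3:latest
  vector_dimensions: 1024

chat_llm: &chat_llm
  type: ollama                                    # 'ollama' IS accepted for LLMs; use the server root,
  base_url: http://host.docker.internal:11434     # not /v1, or the native client returns 404
  model: llama3.2:latest
```

Both models here (`bge-m3:latest`, `llama3.2:latest`) are ones `ollama list` shows as installed, and `host.docker.internal` is used because the checker runs inside the openspg-server container.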
(kag-demo) C:\Users\TARGET STORE\Desktop\6\KAG\kag\examples\csqa>knext project create --config_path ./kag_config.yaml
Traceback (most recent call last):
File "C:\ProgramData\anaconda3\envs\kag-demo\Scripts\knext-script.py", line 33, in
(kag-demo) C:\Users\TARGET STORE\Desktop\6\KAG\kag\examples\csqa>
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:360)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:399)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:891)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1784)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191)
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: pemja.core.PythonException: <class 'RuntimeError'>: invalid llm config: {'__customParamKeys': [], 'creator': 'openspg', 'default': True, 'createTime': '2025-06-07 19:38:58', 'base_url': 'http://127.0.0.1:11434/', 'model': 'llama3.2:latest', 'type': 'ollama', 'llm_id': 'b3ac456e-06e9-43a1-9b26-53d504ebb0d7', 'desc': '2e2w3d'}, for details: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:206)
    at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$invoke$1(PemjaUtils.java:65)
    ... 66 common frames omitted
Caused by: pemja.core.PythonException: <class 'RuntimeError'>: invalid llm config: {'__customParamKeys': [], 'creator': 'openspg', 'default': True, 'createTime': '2025-06-07 19:38:58', 'base_url': 'http://127.0.0.1:11434/', 'model': 'llama3.2:latest', 'type': 'ollama', 'llm_id': 'b3ac456e-06e9-43a1-9b26-53d504ebb0d7', 'desc': '2e2w3d'}, for details: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
    at /home/admin/miniconda3/lib/python3.10/site-packages/kag/bridge/spg_server_bridge.run_llm_config_check(spg_server_bridge.py:116)
    at /home/admin/miniconda3/lib/python3.10/site-packages/kag/common/llm/llm_config_checker.check(llm_config_checker.py:42)
    at pemja.core.PythonInterpreter.invokeMethodOneArgString(Native Method)
    at pemja.core.PythonInterpreter.invokeMethodOneArg(PythonInterpreter.java:212)
    at pemja.core.PythonInterpreter.invokeMethod(PythonInterpreter.java:116)
    at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$null$0(PemjaUtils.java:63)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 common frames omitted
2025-06-08 10:38:58,428 [] [] [http-nio-8887-exec-9] INFO c.a.o.a.h.s.f.AclFilter - [PUT:http://127.0.0.1:8887/v1/configs/2] cost:453
2025-06-08 10:39:15,573 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:executeInstanceScheduleHandler
2025-06-08 10:39:15,576 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:generateInstanceScheduleHandler
2025-06-08 10:39:15,584 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success executeInstanceScheduleHandler 172.18.0.5
2025-06-08 10:39:15,584 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result false executeInstanceScheduleHandler
2025-06-08 10:39:15,585 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success generateInstanceScheduleHandler 172.18.0.5
2025-06-08 10:39:15,585 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result false generateInstanceScheduleHandler
2025-06-08 10:39:15,588 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
2025-06-08 10:39:15,590 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
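The "Failed to connect to Ollama" failure above is a networking issue, not a model issue: the config's `base_url` is `http://127.0.0.1:11434/`, but the checker runs inside the openspg-server container, where `127.0.0.1` is the container's own loopback, not the Windows host where Ollama is listening. A hypothetical helper illustrating the rewrite that is needed (pure string logic, not part of KAG):

```python
from urllib.parse import urlsplit, urlunsplit

LOOPBACK_HOSTS = {"127.0.0.1", "localhost"}

def rewrite_for_container(base_url: str, docker_host: str = "host.docker.internal") -> str:
    """Point a loopback base_url at the Docker host, so a server running
    inside a container can reach an Ollama instance on the host machine."""
    parts = urlsplit(base_url)
    if parts.hostname in LOOPBACK_HOSTS:
        netloc = docker_host + (f":{parts.port}" if parts.port else "")
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)

print(rewrite_for_container("http://127.0.0.1:11434/"))
# http://host.docker.internal:11434/
```

On Docker Desktop for Windows, `host.docker.internal` resolves to the host from inside containers, which is exactly what the note at the end of this thread recommends.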
Caused by: java.util.concurrent.ExecutionException: pemja.core.PythonException: <class 'RuntimeError'>: invalid llm config: {'__customParamKeys': [], 'creator': 'openspg', 'default': True, 'createTime': '2025-06-07 19:57:22', 'base_url': 'http://host.docker.internal:11434/v1', 'model': 'z:latest', 'type': 'ollama', 'llm_id': '2eae6cda-9497-4d2d-8488-54ac77c17277', 'desc': 'ollamaaaaaa'}, for details: 404 page not found (status code: 404)
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:206)
    at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$invoke$1(PemjaUtils.java:65)
    ... 66 common frames omitted
Caused by: pemja.core.PythonException: <class 'RuntimeError'>: invalid llm config: {'__customParamKeys': [], 'creator': 'openspg', 'default': True, 'createTime': '2025-06-07 19:57:22', 'base_url': 'http://host.docker.internal:11434/v1', 'model': 'z:latest', 'type': 'ollama', 'llm_id': '2eae6cda-9497-4d2d-8488-54ac77c17277', 'desc': 'ollamaaaaaa'}, for details: 404 page not found (status code: 404)
    at /home/admin/miniconda3/lib/python3.10/site-packages/kag/bridge/spg_server_bridge.run_llm_config_check(spg_server_bridge.py:116)
    at /home/admin/miniconda3/lib/python3.10/site-packages/kag/common/llm/llm_config_checker.check(llm_config_checker.py:42)
    at pemja.core.PythonInterpreter.invokeMethodOneArgString(Native Method)
    at pemja.core.PythonInterpreter.invokeMethodOneArg(PythonInterpreter.java:212)
    at pemja.core.PythonInterpreter.invokeMethod(PythonInterpreter.java:116)
    at com.antgroup.openspg.common.util.pemja.PemjaUtils.lambda$null$0(PemjaUtils.java:63)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 common frames omitted
2025-06-08 10:57:23,134 [] [] [http-nio-8887-exec-1] INFO c.a.o.a.h.s.f.AclFilter - [PUT:http://127.0.0.1:8887/v1/configs/2] cost:469
2025-06-08 10:57:42,743 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:executeInstanceScheduleHandler
2025-06-08 10:57:42,746 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Start DB SchedulerHandler:generateInstanceScheduleHandler
2025-06-08 10:57:42,756 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success executeInstanceScheduleHandler 172.18.0.5
2025-06-08 10:57:42,758 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - get scheduler lock success generateInstanceScheduleHandler 172.18.0.5
2025-06-08 10:57:42,761 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result true executeInstanceScheduleHandler
2025-06-08 10:57:42,763 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.e.i.SchedulerExecuteServiceImpl - getAllNotFinishInstance successful size:0
2025-06-08 10:57:42,763 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.i.ExecuteInstanceScheduleHandler - run ExecuteInstanceScheduleHandler end time:2
2025-06-08 10:57:42,765 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - can execute result true generateInstanceScheduleHandler
2025-06-08 10:57:42,767 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.e.i.SchedulerExecuteServiceImpl - getAllPeriodJob successful size:0
2025-06-08 10:57:42,767 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.i.GenerateInstanceScheduleHandler - run GenerateInstanceScheduleHandler end time:2
2025-06-08 10:57:42,774 [] [] [dbSchedule-73] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
2025-06-08 10:57:42,780 [] [] [dbSchedule-72] INFO c.a.o.s.c.s.s.h.c.d.SchedulerHandlerClient - Scheduler lock released successfully!
Note: when OpenSPG is installed inside Docker, use http://host.docker.internal:11434 to reach the local Ollama instance.
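The earlier "404 page not found" error is the complementary mistake to the connection failure: `type: ollama` selects the native Ollama client, which calls `/api/*` directly under `base_url`, so pointing it at the OpenAI-compatible `/v1` prefix yields a 404 (and the model name `z:latest` does not appear in `ollama list` either). A hypothetical normalizer illustrating the two URL styles (not a real KAG helper):

```python
def normalize_base_url(client_type: str, base_url: str) -> str:
    """Native 'ollama' clients call /api/* under the server root, while
    OpenAI-compatible clients expect the /v1 prefix; mixing the two styles
    produces 404s like the one in the log above. (Hypothetical helper.)"""
    url = base_url.rstrip("/")
    if client_type == "ollama":
        if url.endswith("/v1"):       # strip an accidental OpenAI-style suffix
            url = url[: -len("/v1")]
    elif client_type == "openai":
        if not url.endswith("/v1"):   # OpenAI-compatible endpoint lives under /v1
            url += "/v1"
    return url

print(normalize_base_url("ollama", "http://host.docker.internal:11434/v1"))
# http://host.docker.internal:11434
```

So the rule of thumb for this setup: `type: ollama` gets `http://host.docker.internal:11434`, `type: openai` gets `http://host.docker.internal:11434/v1`, and the model name must match an entry in `ollama list`.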