[Bug] [Module Name] Project creation fails
Search before asking
- [x] I had searched in the issues and found no similar issues.
Operating system information
Linux
What happened
I used the following command to create a project, but it failed with an error. How can I fix this?
knext project create --config_path ./example_config.yam
Traceback (most recent call last):
File "/home/hzc/miniconda3/envs/kag-demo/bin/knext", line 33, in <module>
How to reproduce
I have already verified that the service inside docker can reach the host's ollama at 172.17.0.1:11434, but I don't know what is causing the problem above. Below is my yaml file:
#------------project configuration start----------------#
openie_llm: &openie_llm
  type: ollama
  base_url: http://localhost:11434/
  api_key: EMPTY
  model: qwen3:32b
  temperature: 0.7
  extra_body: {"chat_template_kwargs": {"enable_thinking": False}}

chat_llm: &chat_llm
  type: ollama
  base_url: http://localhost:11434/
  api_key: EMPTY
  model: qwen3:32b
  max_tokens: 32768
  temperature: 0.7
  extra_body: {"chat_template_kwargs": {"enable_thinking": False}}

vectorize_model: &vectorize_model
  api_key: EMPTY
  base_url: http://172.17.0.1:11434/v1
  model: bge-m3:latest
  type: openai
  vector_dimensions: 1024
vectorizer: *vectorize_model

log:
  level: INFO

project:
  biz_scene: default
  host_addr: http://127.0.0.1:8887
  id: "1"
  language: en
  namespace: HotpotQATest
#------------project configuration end----------------#

#------------kag-builder configuration start----------------#
kag_builder_pipeline:
  chain:
    type: unstructured_builder_chain # kag.builder.default_chain.DefaultUnstructuredBuilderChain
    extractor:
      type: knowledge_unit_extractor
      llm: *openie_llm
      ner_prompt:
        type: knowledge_unit_ner
      triple_prompt:
        type: knowledge_unit_triple
      kn_prompt:
        type: knowledge_unit
    reader:
      type: dict_reader # kag.builder.component.reader.dict_reader.DictReader
    post_processor:
      type: kag_post_processor # kag.builder.component.postprocessor.kag_postprocessor.KAGPostProcessor
    splitter:
      type: length_splitter # kag.builder.component.splitter.length_splitter.LengthSplitter
      split_length: 100000
      window_length: 0
    vectorizer:
      type: batch_vectorizer # kag.builder.component.vectorizer.batch_vectorizer.BatchVectorizer
      vectorize_model: *vectorize_model
    writer:
      type: kg_writer # kag.builder.component.writer.kg_writer.KGWriter
  num_threads_per_chain: 1
  num_chains: 16
  scanner:
    type: hotpotqa_dataset_scanner # kag.builder.component.scanner.dataset_scanner.HotpotqaCorpusScanner
#------------kag-builder configuration end----------------#

#------------kag-solver configuration start----------------#
search_api: &search_api
  type: openspg_search_api # kag.solver.tools.search_api.impl.openspg_search_api.OpenSPGSearchAPI

graph_api: &graph_api
  type: openspg_graph_api # kag.solver.tools.graph_api.impl.openspg_graph_api.OpenSPGGraphApi

kg_cs: &kg_cs
  type: kg_cs_open_spg
  priority: 0
  path_select:
    type: exact_one_hop_select
    graph_api: *graph_api
    search_api: *search_api
  entity_linking:
    type: entity_linking
    graph_api: *graph_api
    search_api: *search_api
    recognition_threshold: 0.9
    exclude_types:
      - Chunk
      - AtomicQuery
      - KnowledgeUnit
      - Summary
      - Outline
      - Doc

kg_fr: &kg_fr
  type: kg_fr_knowledge_unit
  top_k: 20
  graph_api: *graph_api
  search_api: *search_api
  vectorize_model: *vectorize_model
  path_select:
    type: fuzzy_one_hop_select
    llm_client: *openie_llm
    graph_api: *graph_api
    search_api: *search_api
  ppr_chunk_retriever_tool:
    type: ppr_chunk_retriever
    llm_client: *chat_llm
    graph_api: *graph_api
    search_api: *search_api
  entity_linking:
    type: entity_linking
    graph_api: *graph_api
    search_api: *search_api
    recognition_threshold: 0.8
    exclude_types:
      - Chunk
      - AtomicQuery
      - KnowledgeUnit
      - Summary
      - Outline
      - Doc

rc: &rc
  type: rc_open_spg
  vector_chunk_retriever:
    type: vector_chunk_retriever
    vectorize_model: *vectorize_model
    score_threshold: 0.65
    search_api: *search_api
  graph_api: *graph_api
  search_api: *search_api
  vectorize_model: *vectorize_model
  top_k: 20

kag_hybrid_executor: &kag_hybrid_executor_conf
  type: kag_hybrid_retrieval_executor
  retrievers:
    - *kg_cs
    - *kg_fr
    - *rc
  merger:
    type: kag_merger
    enable_summary: true

kag_output_executor: &kag_output_executor_conf
  type: kag_output_executor
  llm_module: *chat_llm

kag_deduce_executor: &kag_deduce_executor_conf
  type: kag_deduce_executor
  llm_module: *chat_llm

py_code_based_math_executor: &py_code_based_math_executor_conf
  type: py_code_based_math_executor
  llm: *chat_llm

kag_solver_pipeline:
  type: kag_static_pipeline
  planner:
    type: lf_kag_static_planner
    llm: *chat_llm
    plan_prompt:
      type: default_lf_static_planning
    rewrite_prompt:
      type: default_rewrite_sub_task_query
  executors:
    - *kag_hybrid_executor_conf
    - *py_code_based_math_executor_conf
    - *kag_deduce_executor_conf
    - *kag_output_executor_conf
  generator:
    type: default_generator # kag.solver.implementation.default_generator.DefaultGenerator
    llm_client: *chat_llm
    generated_prompt:
      type: default_refer_generator_prompt
      enable_ref: true
#------------kag-solver configuration end----------------#
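A side observation on the config above (an assumption, not a confirmed root cause): the OpenSPG server runs inside docker, where `localhost` points at the container itself, so the two LLM entries using `http://localhost:11434/` may be unreachable from the server even though `vectorize_model` correctly uses the bridge address `172.17.0.1`. A minimal sketch of rewriting such URLs:

```python
# Sketch: rewrite a localhost base_url to the docker bridge IP so the server
# inside the container can reach Ollama on the host. The bridge address
# 172.17.0.1 is taken from the report above; verify yours with
# `docker network inspect bridge`.
from urllib.parse import urlparse, urlunparse

def rewrite_localhost(base_url: str, host_ip: str = "172.17.0.1") -> str:
    """Replace a localhost/127.0.0.1 host with the docker bridge IP."""
    parts = urlparse(base_url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"{host_ip}{port}")
    return urlunparse(parts)

print(rewrite_localhost("http://localhost:11434/"))  # http://172.17.0.1:11434/
```

URLs that already use the bridge IP pass through unchanged, so applying it to every `base_url` in the config is harmless.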
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
I also tried the API approach; it fails with the same error.
knext project create --config_path ./example_config1.yaml
Traceback (most recent call last):
File "/home/hzc/miniconda3/envs/kag-demo/bin/knext", line 33, in <module>
I ran into a similar problem. After configuring qwen-max, creating the project with knext project create --config_path ./example_config.yaml fails with: HTTP response body: "modelTypeMap is null"
HTTP response headers: HTTPHeaderDict({'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, PUT, GET, DELETE', 'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept, client_id, uuid, Authorization, credentials', 'Access-Control-Allow-Credentials': 'true', 'Access-Control-Max-Age': '3600', 'Vary': 'Origin, Access-Control-Request-Method, Access-Control-Request-Headers', 'TraceId': 'ac140005175125396331710651', 'Remote': '172.20.0.5', 'Content-Type': 'application/json;charset=UTF-8', 'Transfer-Encoding': 'chunked', 'Date': 'Mon, 30 Jun 2025 03:26:03 GMT', 'Connection': 'close'})
HTTP response body: "modelTypeMap is null"
Here is my exact configuration:
openie_llm: &openie_llm
type: maas
base_url: https://dashscope.aliyuncs.com/compatible-mode/v1
api_key: sk-
model: qwen-max
enable_check: false
chat_llm: &chat_llm
type: maas
base_url: https://dashscope.aliyuncs.com/compatible-mode/v1
api_key: sk-
model: qwen-max
enable_check: false
vectorize_model: &vectorize_model
api_key: sk-
base_url: https://api.siliconflow.cn/v1/
model: BAAI/bge-m3
type: openai
vector_dimensions: 1024
enable_check: false
vectorizer: *vectorize_model
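Before calling knext again, the presence of the sections the server expects can be sanity-checked locally. The helper below is a hypothetical pre-flight check, not KAG's actual validation logic; the key names come from the configs in this thread:

```python
# Minimal pre-flight check (a sketch, not KAG's real validation): make sure
# each model section exists and declares a type before the config is sent to
# the server. Key names are taken from the example configs above.
REQUIRED = ("openie_llm", "chat_llm", "vectorize_model")

def check_config(cfg: dict) -> list:
    problems = []
    for key in REQUIRED:
        entry = cfg.get(key)
        if not isinstance(entry, dict):
            problems.append(f"missing section: {key}")
        elif "type" not in entry:
            problems.append(f"{key} has no 'type'")
    return problems

cfg = {
    "openie_llm": {"type": "maas", "model": "qwen-max"},
    "chat_llm": {"type": "maas", "model": "qwen-max"},
    "vectorize_model": {"type": "openai", "model": "BAAI/bge-m3"},
}
print(check_config(cfg))  # []
```

An empty list means the basic shape is fine; it cannot, of course, rule out server-side issues like the "modelTypeMap is null" response above.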
I get this same error whether I call an API or use a local model. Your configs look fine; if the config itself were the problem, the project-creation command would instead report an error like Error: invalid llm config. I wonder whether it's just the two of us hitting this, or whether many people are.
The problem disappeared after I deleted the image and pulled it again (O_O)
Indeed, it was fixed after I deleted and re-pulled as well. Thanks a lot!
After hitting this problem I re-pulled the container and image, but it still isn't fixed.
Solved: after updating to the latest release 0.8, the api can be accessed normally.
I re-pulled it but still get the same error. Are there any other solutions?
You need to first delete the openspg images in docker and then pull again; re-pulling directly does not help. If it still fails after that, I don't know how to fix it either, but someone above said it worked after updating to the latest release 0.8.
Yes, the first time I used '.yam', and after hitting that error I switched to the correct '.yaml'. I pasted both runs together; if you look a few lines further down, you can see that I used '.yaml'.
------------------ Original Message ------------------
From: "OpenSPG/KAG"
Sent: Monday, August 11, 2025, 09:53 AM
Subject: Re: [OpenSPG/KAG] [Bug] [Module Name] Project creation fails (Issue #614)
thundax-lyp left a comment (OpenSPG/KAG#614)
knext project create --config_path ./example_config.yam
Traceback (most recent call last):
  File "/home/hzc/miniconda3/envs/kag-demo/bin/knext", line 33, in <module>
    sys.exit(load_entry_point('openspg-kag', 'console_scripts', 'knext')())
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/hzc/KAG/knext/command/exception.py", line 21, in invoke
    return super().invoke(ctx)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/hzc/KAG/knext/command/sub_command/project.py", line 145, in create_project
    config = yaml.load(Path(config_path).read_text() or "{}")
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/pathlib.py", line 1134, in read_text
    with self.open(mode='r', encoding=encoding, errors=errors) as f:
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/pathlib.py", line 1119, in open
    return self._accessor.open(self, mode, buffering, encoding, errors,
FileNotFoundError: [Errno 2] No such file or directory: 'example_config.yam'

(kag-demo) @.:~/KAG/kag/examples$ knext project create --config_path ./example_config.yaml
Traceback (most recent call last):
  File "/home/hzc/miniconda3/envs/kag-demo/bin/knext", line 33, in <module>
    sys.exit(load_entry_point('openspg-kag', 'console_scripts', 'knext')())
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/hzc/KAG/knext/command/exception.py", line 21, in invoke
    return super().invoke(ctx)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/hzc/miniconda3/envs/kag-demo/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/hzc/KAG/knext/command/sub_command/project.py", line 180, in create_project
    project = client.create(
  File "/home/hzc/KAG/knext/project/client.py", line 90, in create
    project = self.rest_client.project_create_post(
  File "/home/hzc/KAG/knext/project/rest/project_api.py", line 68, in project_create_post
    return self.project_create_post_with_http_info(**kwargs)  # noqa: E501
  File "/home/hzc/KAG/knext/project/rest/project_api.py", line 144, in project_create_post_with_http_info
    return self.api_client.call_api(
  File "/home/hzc/KAG/knext/common/rest/api_client.py", line 419, in call_api
    return self.__call_api(
  File "/home/hzc/KAG/knext/common/rest/api_client.py", line 219, in __call_api
    raise e
  File "/home/hzc/KAG/knext/common/rest/api_client.py", line 207, in __call_api
    response_data = self.request(
  File "/home/hzc/KAG/knext/common/rest/api_client.py", line 495, in request
    return self.rest_client.POST(
  File "/home/hzc/KAG/knext/common/rest/rest.py", line 345, in POST
    return self.request(
  File "/home/hzc/KAG/knext/common/rest/rest.py", line 257, in request
    raise ApiException(http_resp=r)
knext.common.rest.exceptions.ApiException: (400)
Reason:
HTTP response headers: HTTPHeaderDict({'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, PUT, GET, DELETE', 'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept, client_id, uuid, Authorization, credentials', 'Access-Control-Allow-Credentials': 'true', 'Access-Control-Max-Age': '3600', 'Vary': 'Origin, Access-Control-Request-Method, Access-Control-Request-Headers', 'Content-Type': 'application/json;charset=UTF-8', 'Transfer-Encoding': 'chunked', 'Date': 'Sun, 29 Jun 2025 09:08:07 GMT', 'Connection': 'close'})
HTTP response body: {"timestamp":"2025-06-29 17:08:07","status":400,"error":"Bad Request","path":"/public/v1/project"}
I found this exception in your log:
FileNotFoundError: [Errno 2] No such file or directory: 'example_config.yam'
Are you sure your file is really supposed to be a '.yam'?
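For the record, the FileNotFoundError in the first run means only that the path passed to --config_path does not exist (the '.yam' typo). A small illustrative guard (a hypothetical helper, not KAG code) that fails with a clearer message:

```python
# Sketch of the failing call in create_project, plus an early check that
# points out the likely '.yam' vs '.yaml' typo. Purely illustrative.
from pathlib import Path

def read_config_text(config_path: str) -> str:
    """Read a config file, hinting at the '.yaml' suffix if the path is missing."""
    p = Path(config_path)
    if not p.is_file():
        hint = p.with_suffix(".yaml")
        raise SystemExit(f"config file not found: {p} (did you mean {hint}?)")
    return p.read_text()
```

With such a guard, a mistyped suffix would fail immediately with the suggested correction instead of a bare traceback.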