
[Query Engine Streamlit App] An error occurred during research: Error code: 400 - {'error': {'message':

Open xangelwing opened this issue 2 months ago • 3 comments

Error message

An error occurred during research: Error code: 400 - {'error': {'message': 'Content Exists Risk', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}

Error details

Traceback (most recent call last):
  File "/app/SingleEngineApp/query_engine_streamlit_app.py", line 149, in execute_research
    agent._initial_search_and_summary(i)
  File "/app/SingleEngineApp/../QueryEngine/agent.py", line 301, in _initial_search_and_summary
    self.state = self.first_summary_node.mutate_state(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/SingleEngineApp/../QueryEngine/nodes/summary_node.py", line 199, in mutate_state
    raise e
  File "/app/SingleEngineApp/../QueryEngine/nodes/summary_node.py", line 185, in mutate_state
    summary = self.run(input_data, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/SingleEngineApp/../QueryEngine/nodes/summary_node.py", line 116, in run
    raise e
  File "/app/SingleEngineApp/../QueryEngine/nodes/summary_node.py", line 103, in run
    response = self.llm_client.stream_invoke_to_string(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/SingleEngineApp/../QueryEngine/llms/base.py", line 148, in stream_invoke_to_string
    for chunk in self.stream_invoke(system_prompt, user_prompt, **kwargs):
  File "/app/SingleEngineApp/../QueryEngine/llms/base.py", line 132, in stream_invoke
    raise e
  File "/app/SingleEngineApp/../QueryEngine/llms/base.py", line 118, in stream_invoke
    stream = self.client.chat.completions.create(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 1156, in create
    return self._post(
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Content Exists Risk', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}

Environment

  • App: Query Engine Streamlit App
  • Time: 2025-11-09 17:31:10

xangelwing avatar Nov 09 '25 17:11 xangelwing

Hi @xangelwing! I'm Dosu and I’m helping the BettaFish team.

This 400 - "Content Exists Risk" error is triggered by the LLM provider's content-moderation system (e.g. DeepSeek or some OpenAI-compatible APIs). It usually means the request contains sensitive terms (news, political topics, cryptocurrency, PDF titles, etc.) or was otherwise flagged as high risk, so the API refuses to respond. The error comes straight from the LLM API; BettaFish has no local pre-filtering or bypass mechanism, and retrying will not help, because the same content keeps tripping the filter.
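Since retries cannot fix a moderation rejection, it can be useful to distinguish this non-retryable 400 from transient failures before deciding whether to retry. The helper below is a hypothetical sketch (the function name and the list of marker strings are assumptions; only "Content Exists Risk" appears in the actual error above):

```python
def is_content_risk_error(error_message: str) -> bool:
    """Return True if a 400 error message looks like a provider
    content-moderation rejection (not worth retrying), rather than
    a transient failure such as a timeout or rate limit."""
    # "Content Exists Risk" is the marker seen in this issue; other
    # providers may use different strings, so extend as needed.
    markers = ("Content Exists Risk",)
    return any(m in error_message for m in markers)
```

A caller could then skip its retry loop and surface the error to the user whenever this returns True.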

The Query Engine Streamlit App currently has no automatic resume-from-checkpoint or progress-saving feature, so after this kind of error the only option is to rerun the whole pipeline. Suggestions:

  • Prefer switching to an LLM with looser content moderation (e.g. Qwen, Bocha, OpenAI): change the base_url, model, and API key in your .env or config file, then restart the service.
  • Avoid sensitive terms or repetitive content in prompts and inputs to reduce the chance of tripping the filter.
  • To limit lost work, you can manually save intermediate state files and restore progress with load_state after a failure (requires some code changes).
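The manual state-saving idea in the last bullet can be sketched as a small checkpoint helper. This is a generic pickle-based sketch, not BettaFish's actual load_state mechanism; the file name and function names are assumptions, and the real QueryEngine state object may need different serialization:

```python
import pickle
from pathlib import Path

# Hypothetical checkpoint location; adjust to your deployment.
CHECKPOINT = Path("query_engine_state.pkl")

def save_state(state) -> None:
    """Persist the agent's intermediate state so a moderation
    failure partway through does not lose earlier summaries."""
    CHECKPOINT.write_bytes(pickle.dumps(state))

def load_state():
    """Return the last saved state, or None on a fresh run."""
    if CHECKPOINT.exists():
        return pickle.loads(CHECKPOINT.read_bytes())
    return None
```

Calling save_state after each successful summary step, and checking load_state at startup, lets a rerun skip the work that already completed.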

If you are using DeepSeek, note that base_url must carry the /v1 suffix, otherwise requests will fail; that is a separate problem, though, and not the direct cause of "Content Exists Risk".
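For reference, a minimal .env sketch for the DeepSeek case might look like the following. The variable names here are assumptions and depend on how your BettaFish config reads its settings; only the /v1 requirement comes from the note above:

```env
# Hypothetical .env fragment — adapt variable names to your config
OPENAI_BASE_URL=https://api.deepseek.com/v1   # note the required /v1 suffix
OPENAI_MODEL=deepseek-chat
OPENAI_API_KEY=your-key-here
```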

To avoid this class of problem going forward, watch for future releases that may add automatic checkpoint resume or finer-grained content pre-screening.

To reply, just mention @dosu.



dosubot[bot] avatar Nov 09 '25 17:11 dosubot[bot]

Switching to Kimi works.

arsuili avatar Nov 10 '25 00:11 arsuili

It hit a sensitive-word filter, so switching models is the only fix. Would moving everything to Gemini work, even if it costs a bit more?

cgk100 avatar Nov 10 '25 00:11 cgk100

Switching models fixed it.

xangelwing avatar Nov 12 '25 16:11 xangelwing