byzerllm query fails with "Failed to unpickle serialized exception"
Running `byzerllm query --model ollama_qwen2_chat --query "who are you"` fails with the following error:

ERROR serialization.py:425 -- Failed to unpickle serialized exception
Traceback (most recent call last):
File "C:\Users\xxx\Miniconda3\envs\auto-coder\Lib\site-packages\ray\exceptions.py", line 50, in from_ray_exception
return pickle.loads(ray_exception.serialized_exception)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: APIStatusError.__init__() missing 2 required keyword-only arguments: 'response' and 'body'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\xxx\Miniconda3\envs\auto-coder\Lib\site-packages\ray_private\serialization.py", line 423, in deserialize_objects
obj = self._deserialize_object(data, metadata, object_ref)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\xxx\Miniconda3\envs\auto-coder\Lib\site-packages\ray_private\serialization.py", line 305, in _deserialize_object
return RayError.from_bytes(obj)
^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\xxx\Miniconda3\envs\auto-coder\Lib\site-packages\ray\exceptions.py", line 44, in from_bytes
return RayError.from_ray_exception(ray_exception)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\xxx\Miniconda3\envs\auto-coder\Lib\site-packages\ray\exceptions.py", line 53, in from_ray_exception
raise RuntimeError(msg) from e
RuntimeError: Failed to unpickle serialized exception
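Judging from the inner traceback, the unpickle failure is masking the real error: the worker raised an openai APIStatusError (or a subclass), and when Ray tried to ship that exception back to the driver, pickle could not reconstruct it because APIStatusError requires the keyword-only arguments 'response' and 'body'. A minimal, self-contained sketch of the same failure mode (not byzerllm or openai code):

```python
import pickle

# Any exception whose __init__ has required keyword-only arguments fails to
# round-trip through pickle: BaseException.__reduce__ only replays the
# positional self.args when the object is rebuilt on the other side.
class KeywordOnlyError(Exception):
    def __init__(self, message, *, response, body):
        super().__init__(message)
        self.response = response
        self.body = body

err = KeywordOnlyError("upstream API error", response="<response>", body="<body>")
try:
    pickle.loads(pickle.dumps(err))
except TypeError as exc:
    # Prints something like: "KeywordOnlyError.__init__() missing 2 required
    # keyword-only arguments: 'response' and 'body'"
    print(exc)
```

So the fixes reported below (correct API key, larger input length, smaller prompts) all address different upstream API errors that were hidden behind this same serialization failure; the actor logs usually show the real one.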
Same problem here. Has it been resolved?
In my case, I ran into this during coding because model_input_max_length was configured too small.
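If it helps, this is roughly where that knob lives in an auto-coder action YAML. Treat it as a sketch: the key name below just mirrors the comment above, and the exact spelling and a sensible value depend on your auto-coder version and the model's context window.

```yaml
# Hypothetical actions/<task>.yml snippet -- the key name mirrors the comment
# above and may be spelled differently in your auto-coder version.
model: deepseek_chat
# Per the comment above, setting this far too small can make requests fail
# and surface as the Ray "Failed to unpickle serialized exception" error.
model_input_max_length: 100000
```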
Same here: I hit this when I deploy the deepseek-ai/deepseek-v2-chat model, but the deepseek-chat model works fine.
I was using a wrong API key; my problem was solved after I switched to a correct API key. The misleading message is probably related to how the exception gets serialized, but I don't know the details.
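In case it saves someone else some trial and error: fixing the key means redeploying the SaaS model with the correct value. A hedged sketch of what that can look like (flag names and the undeploy step follow the byzerllm CLI as I understand it; verify with `byzerllm deploy --help`, and the base_url/api_key/model values are placeholders):

```bash
# Remove the broken deployment, then redeploy with the correct key.
# Shown with bash-style line continuations; put each command on one line on Windows.
byzerllm undeploy --model deepseek_chat
byzerllm deploy --pretrained_model_type saas/openai \
  --cpus_per_worker 0.001 --gpus_per_worker 0 --num_workers 1 \
  --infer_params saas.base_url="https://api.deepseek.com/v1" saas.api_key="${YOUR_API_KEY}" saas.model=deepseek-chat \
  --model deepseek_chat
```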
Following the guidance in the Feishu doc "使用auto_coder遇到的问题及解决方案汇总" (a summary of common auto_coder problems and solutions):
you can open the Ray dashboard at localhost:8265 and check the logs under the corresponding actor.
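To spell that out (assuming the default local Ray instance that byzerllm starts; adjust host/port if yours differs):

```bash
# Dashboard: http://localhost:8265 -> Actors -> the actor for your model -> Logs.
# If you manage Ray yourself, make sure it was started with the dashboard exposed:
ray start --head --dashboard-host 127.0.0.1 --dashboard-port 8265
# Raw log files are also on disk, e.g. /tmp/ray/session_latest/logs on Linux/macOS
# or %TEMP%\ray\session_latest\logs on Windows.
ray status
```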
In my environment, the error ultimately came down to the knowledge base exceeding the token limit: openai.BadRequestError: Error code: 400 - {'code': 20015, 'message': 'length of prompt_tokens (434052) must be less than max_seq_len (32768).', 'data': None}
After dealing with that, responses came back normally. Note that the combined size of multiple files still cannot exceed the model's context window; even with DeepSeek's 128K context, going over it produces the same error.
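One way to catch this before the request even reaches the server is to estimate the prompt size locally and trim the knowledge-base/file content to the model's window. A sketch using tiktoken as an approximation (DeepSeek uses its own tokenizer, so the count is only an estimate; max_seq_len mirrors the 32768 from the error above):

```python
import tiktoken  # pip install tiktoken -- only approximates non-OpenAI tokenizers

def approx_tokens(text: str) -> int:
    """Rough token count using cl100k_base (an estimate for DeepSeek models)."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text))

def check_prompt(prompt: str, max_seq_len: int = 32768, reply_budget: int = 1024) -> None:
    """Fail fast locally instead of hitting the server-side 400 / code 20015."""
    n = approx_tokens(prompt)
    if n + reply_budget > max_seq_len:
        raise ValueError(
            f"prompt is ~{n} tokens; with a {reply_budget}-token reply budget it "
            f"exceeds max_seq_len={max_seq_len} -- drop some files or split the query"
        )

check_prompt("who are you")  # fine; a 400k-token knowledge-base dump would raise
```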
Has this been resolved?