amstrongzyf

I think if you want the response to be in JSON format, you can always set `json_mode` to `True`. As you can see in `rdagent/oai/backend/base.py` (lines 548-549), `json_mode` is actually...
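For anyone who wants to see what that looks like in practice, here is a minimal sketch. The import path and the `build_messages_and_create_chat_completion` method name are from memory and may differ across rdagent versions, so treat them as assumptions and check `rdagent/oai/backend/base.py` for the real signature:

```python
# Sketch only: the import path and method name are assumptions and may
# differ in your rdagent version; see rdagent/oai/backend/base.py.
import json

from rdagent.oai.llm_utils import APIBackend

response = APIBackend().build_messages_and_create_chat_completion(
    system_prompt="Respond with a JSON object containing 'hypothesis' and 'reason'.",
    user_prompt="Propose one factor hypothesis for a momentum strategy.",
    json_mode=True,  # asks the backend to request JSON-formatted output
)
data = json.loads(response)  # should parse cleanly when json_mode is honored
```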
Maybe you can try our new version; as far as I know, quite a lot of people have succeeded in using the DeepSeek model.
What's your model? DeepSeek?
Maybe you can switch to the `qlib_ds` branch and try again. It seems that the hypothesis doesn't return a string, and I have enhanced that string-handling section in that branch because...
You can now pull our latest main branch and give it a try. If you have any questions, just comment here.
In fact, my test shows that OpenRouter can be called with the current structure. Include these lines in your `.env`:
```
BACKEND=rdagent.oai.backend.LiteLLMAPIBackend
OPENAI_API_KEY="your open-router api-key"
OPENAI_API_BASE="https://openrouter.ai/api/v1"
CHAT_MODEL="the full model-name on openrouter"
...
```
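As a quick sanity check before running rdagent, you can call OpenRouter through LiteLLM directly with those same environment variables. This is just a sketch; the model name is whatever full name you picked on openrouter.ai:

```python
# Standalone sanity check for the OpenRouter credentials, independent of rdagent.
import os

from litellm import completion

response = completion(
    model=os.environ["CHAT_MODEL"],  # the full model name from openrouter.ai
    api_key=os.environ["OPENAI_API_KEY"],
    api_base=os.environ["OPENAI_API_BASE"],
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```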
We have fixed this problem in `rdagent/oai/backend/base.py` via this function:
```python
def _fix_python_booleans(json_str: str) -> str:
    """Safely fix Python-style booleans to JSON standard format using tokenize"""
    replacements = {"True": "true",
...
```
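Since the snippet above is cut off, here is a self-contained sketch of the same idea (not the exact code in `base.py`): walk the tokens so that `True`/`False`/`None` inside string literals are left alone, and only bare names get rewritten. Because each replacement has the same length as the original keyword, token positions stay valid and `untokenize` reproduces the original spacing:

```python
# Minimal sketch of the tokenize-based approach; not the exact base.py code.
import io
import tokenize


def fix_python_booleans(json_str: str) -> str:
    """Rewrite Python literals (True/False/None) to JSON ones (true/false/null),
    skipping anything that appears inside string literals."""
    replacements = {"True": "true", "False": "false", "None": "null"}
    try:
        tokens = [
            tok._replace(string=replacements[tok.string])
            if tok.type == tokenize.NAME and tok.string in replacements
            else tok
            for tok in tokenize.generate_tokens(io.StringIO(json_str).readline)
        ]
        return tokenize.untokenize(tokens)
    except tokenize.TokenError:
        return json_str  # malformed input: leave it for the JSON parser to report


print(fix_python_booleans('{"flag": True, "note": "True story", "x": None}'))
# -> {"flag": true, "note": "True story", "x": null}
```

The point of going through `tokenize` rather than a plain text replace is exactly the `"True story"` case: a naive `str.replace` would corrupt string values, while the tokenizer only ever sees `True` as a bare name outside of strings.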
That's true, but since prompt formats might vary in the future, I think checking and fixing it at runtime is a more robust and elegant solution. Thanks for pointing out...