aideml
Direct any model (named without `claude-`) to the OpenAI backend.
The old logic used the OpenAI backend only for models whose names start with `gpt-` and fell back to the Claude backend for everything else, so any model served through an OpenAI-compatible server was wrongly routed to the Claude backend.
It is simpler to switch the default backend than to check whether the `OPENAI_BASE_URL` environment variable is set.
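A minimal sketch of the routing rule described above; the function and backend names here are hypothetical illustrations, not aideml's actual API:

```python
def choose_backend(model_name: str) -> str:
    """Hypothetical backend-selection rule: only `claude-` names go to the
    Claude backend; everything else defaults to the OpenAI backend, which
    also covers OpenAI-compatible servers (e.g. Ollama via OPENAI_BASE_URL)."""
    if model_name.startswith("claude-"):
        return "claude"
    # New default: OpenAI backend for all other names, including
    # non-`gpt-` models like "qwen2.5" served locally.
    return "openai"
```

Under the old rule, a name like `qwen2.5` would have matched neither `gpt-` nor `claude-` and been sent to the Claude backend by default.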
I confirmed the fix on the following run:
- Use Ollama:
  `export OPENAI_BASE_URL="http://localhost:11434/v1"`
- Set a dummy OpenAI key:
  `export OPENAI_API_KEY="ollama"`
- Run the README command with extra agent parameters:
  `aide agent.code.model="qwen2.5" agent.feedback.model="qwen2.5" data_dir="example_tasks/house_prices" goal="Predict the sales price for each house" eval="Use the RMSE metric between the logarithm of the predicted and observed values."`