
Error with model gpt-5-mini

Open cong91 opened this issue 5 months ago • 4 comments

Git provider

Other

System Info

model: gpt-5-mini

Bug details

Please update the max tokens for model gpt-5-mini. Thanks. I'm facing this error:

2025-08-08 06:14:32.896 | ERROR | pr_agent.tools.pr_code_suggestions:run:193 - Failed to generate code suggestions for PR, error: Failed to generate prediction with any model of ['gpt-5-mini', 'gpt-5-mini']

Traceback (most recent call last):
  File "/app/pr_agent/algo/pr_processing.py", line 331, in retry_with_fallback_models
    return await f(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 690, in prepare_prediction_main
    self.patches_diff_list_no_line_numbers = get_pr_multi_diffs(self.git_provider,
  File "/app/pr_agent/algo/pr_processing.py", line 415, in get_pr_multi_diffs
    if total_tokens + OUTPUT_BUFFER_TOKENS_SOFT_THRESHOLD < get_max_tokens(model):
  File "/app/pr_agent/algo/utils.py", line 969, in get_max_tokens
    raise Exception(f"Ensure {model} is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py or set a positive value for it in config.custom_model_max_tokens")
Exception: Ensure gpt-5-mini is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py or set a positive value for it in config.custom_model_max_tokens

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 124, in run
    data = await retry_with_fallback_models(self.prepare_prediction_main, model_type=ModelType.REGULAR)
  File "/app/pr_agent/algo/pr_processing.py", line 337, in retry_with_fallback_models
    raise Exception(f"Failed to generate prediction with any model of {all_models}")
Exception: Failed to generate prediction with any model of ['gpt-5-mini', 'gpt-5-mini']

Thanks

cong91 avatar Aug 08 '25 06:08 cong91

💡 Auto-generated documentation-based answer:

The error you're encountering is because the model gpt-5-mini is not recognized by PR-Agent. The error message specifically states:

"Ensure gpt-5-mini is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py or set a positive value for it in config.custom_model_max_tokens"
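Based on the traceback, the lookup that raises this exception appears to work roughly as follows. This is a minimal sketch, not the actual pr-agent source; the dictionary contents and the default argument are assumptions.

```python
# Hypothetical sketch of pr_agent/algo/utils.py::get_max_tokens,
# reconstructed from the error message in the traceback above.
MAX_TOKENS = {
    # Known models and their context limits (values here are assumptions).
    "o4-mini": 200000,
    "gpt-4o": 128000,
}

def get_max_tokens(model: str, custom_model_max_tokens: int = -1) -> int:
    """Return the context limit for `model`, or raise if it is unknown
    and no positive custom limit was configured."""
    if model in MAX_TOKENS:
        return MAX_TOKENS[model]
    if custom_model_max_tokens > 0:
        return custom_model_max_tokens
    raise Exception(
        f"Ensure {model} is defined in MAX_TOKENS in ./pr_agent/algo/__init__.py "
        f"or set a positive value for it in config.custom_model_max_tokens"
    )
```

Under this reading, either adding the model to the dictionary or setting `custom_model_max_tokens` to a positive value avoids the exception, which matches the two options below.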

Solution

You have two options to fix this:

Option 1: Use a supported model

The model name gpt-5-mini doesn't appear to be a valid OpenAI model. You likely meant to use o4-mini (which is a supported model). Update your configuration to use:

[config]
model = "o4-mini"
fallback_models = ["o4-mini"]

Option 2: Configure custom model max tokens

If gpt-5-mini is indeed a custom model you want to use, you need to set the maximum tokens for it in your configuration:

[config]
model = "gpt-5-mini"
fallback_models = ["gpt-5-mini"]
custom_model_max_tokens = 128000  # Set appropriate token limit for your model

Supported Models

For reference, some of the supported models include:

  • o4-mini
  • gpt-4o
  • claude-4-sonnet
  • gemini-2.5-pro

You can find the complete list of supported models and their configuration in the "Changing a model" documentation.

Relevant Sources:

  • https://qodo-merge-docs.qodo.ai//usage-guide/changing_a_model#custom-models
  • https://qodo-merge-docs.qodo.ai//usage-guide/qodo_merge_models
  • https://qodo-merge-docs.qodo.ai//usage-guide/changing_a_model#changing-a-model-in-pr-agent

Try again, support has been added:

https://github.com/qodo-ai/pr-agent?tab=readme-ov-file#aug-8-2025

mrT23 avatar Aug 08 '25 07:08 mrT23

Same issue here. I already used docker pull codiumai/pr-agent:latest to update the image, but it still shows the error below:

Error during LLM inference: litellm.AuthenticationError: AuthenticationError: OpenAIException - model is not supported.
2025-08-13 03:18:59.254 | WARNING  | pr_agent.algo.pr_processing:retry_with_fallback_models:333 - Failed to generate prediction with o4-mini
2025-08-13 03:18:59.254 | ERROR    | pr_agent.tools.pr_reviewer:run:183 - Failed to review PR: Failed to generate prediction with any model of ['gpt-5-2025-08-07', 'o4-mini']

Model "o4-mini-2025-04-16" works, but gpt-5 does not.

Here is the content of .pr_agent.toml:

[config]
model="gpt-5-mini"
fallback_models=["gpt-5-mini"]
model_reasoning="gpt-5-mini"
model_weak="gpt-5-mini"
custom_model_max_tokens=128000
custom_reasoning_model=true
log_level="DEBUG"
git_provider="gitlab"
response_language="zh-TW"

evaeva38 avatar Aug 13 '25 02:08 evaeva38

gpt-5-mini works for me:

[two screenshots attached]

mrT23 avatar Aug 14 '25 15:08 mrT23

@mrT23 could it be because gpt-5 models do not support temperature and need to be added here: https://github.com/qodo-ai/pr-agent/blob/main/pr_agent/algo/__init__.py#L185 ?

A bit more context on this: I am running pr-agent 0.3.0 with model gpt-5.1-codex-mini.

The "improve" tool was failing with the following error: Error during LLM inference: litellm.BadRequestError: OpenAIException - Unsupported value: 'temperature' does not support 0.2 with this model. Only the default (1) value is supported.

The following warning was still present even after I set config__temperature=1: 2025-11-20 12:58:29.977 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:378 - Error during LLM inference: litellm.BadRequestError: OpenAIException - Unsupported value: 'temperature' does not support 0.2 with this model. Only the default (1) value is supported.

I traced it to this list: https://github.com/qodo-ai/pr-agent/blob/main/pr_agent/algo/__init__.py#L185. If the model was added there, the "improve" tool would work.

PS: I think none of the gpt-5 series models support temperature anymore (gpt-5, gpt-5-mini, gpt-5.1-codex, gpt-5.1-codex-mini, and so on).
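The workaround described above, listing models that must not receive a temperature value, can be sketched like this. The set name and helper function are hypothetical, not pr-agent's actual code:

```python
# Models that reject any non-default temperature; membership here is an
# assumption based on the errors reported in this thread.
NO_TEMPERATURE_MODELS = {
    "gpt-5", "gpt-5-mini", "gpt-5.1-codex", "gpt-5.1-codex-mini",
}

def build_completion_kwargs(model: str, temperature: float = 0.2) -> dict:
    """Only pass `temperature` for models that accept a non-default value."""
    kwargs = {"model": model}
    if model not in NO_TEMPERATURE_MODELS:
        kwargs["temperature"] = temperature
    return kwargs
```

With a guard like this, the listed models fall back to the API's default temperature of 1 while other models keep the configured value, which would avoid the BadRequestError above.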

dzmitryashkinadze avatar Nov 21 '25 08:11 dzmitryashkinadze