[Bug]: Anthropic Claude models fail with temperature/top_p conflict
What did you do when it broke?
Bug Report: Anthropic Claude models fail with "temperature and top_p cannot both be specified" error
Environment
- Open Notebook Version: v1-latest-single (Docker)
- Docker Image Date: 2025-10-25
- Platform: Windows (Docker Desktop)
- Affected Models:
- claude-sonnet-4-5-20250929
- claude-3-5-sonnet-20241022
- claude-3-haiku-20240307
Description
All Anthropic Claude models fail when attempting to chat with the following error:
Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'temperature and top_p cannot both be specified for this model. Please use only one.'}, 'request_id': 'req_011CUXDKK6e1cU6UZcRyCQuj'}
Root Cause
Open Notebook is sending both temperature and top_p parameters to the Anthropic API. However, Anthropic's
API requires only one parameter to be set, as per their documentation.
Expected Behavior
- Claude models should work with either temperature OR top_p (not both)
- OpenAI models work fine (they accept both parameters)
Actual Behavior
- OpenAI models (gpt-4o): ✅ Works perfectly
- Anthropic models (all Claude models): ❌ Fails with 400 error
Steps to Reproduce
- Set up Open Notebook with the ANTHROPIC_API_KEY environment variable
- Add Claude model via web UI (Settings → Models → Add Model)
- Provider: Anthropic
- Model Name: claude-sonnet-4-5-20250929
- Create a notebook and upload a document
- Try to chat with the notebook
- Error occurs: "Failed send message"
Logs
2025-10-27 06:15:41.268 | ERROR | api.routers.chat:execute_chat:384 - Error executing chat: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'temperature and top_p cannot both be specified for this model. Please use only one.'}, 'request_id': 'req_011CUXDKK6e1cU6UZcRyCQuj'}
Proposed Solution
Modify the Anthropic provider implementation to send only temperature parameter (or make it configurable per
provider).
For Anthropic models:
- Send temperature only
- Do NOT send top_p
For OpenAI models:
- Can send both (current behavior is fine)
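A minimal sketch of that per-provider rule (a hypothetical helper, not the actual Open Notebook code):

```python
def sampling_params(provider, temperature=None, top_p=None):
    """Build sampling kwargs, dropping top_p for Anthropic when both are set."""
    params = {}
    if temperature is not None:
        params["temperature"] = temperature
    if top_p is not None:
        params["top_p"] = top_p
    if provider == "anthropic" and "temperature" in params and "top_p" in params:
        # Anthropic's API accepts only one of the two; prefer temperature.
        del params["top_p"]
    return params
```

With this, `sampling_params("anthropic", 0.7, 0.9)` returns only `{"temperature": 0.7}`, while OpenAI providers keep both parameters unchanged.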
Additional Context
- This issue has been reported in other projects using the Anthropic API (see the links under Additional Context)
Workaround
Currently using OpenAI models (gpt-4o) as primary LLM until this is fixed.
Environment Details
ANTHROPIC_API_KEY=sk-ant-***
OPENAI_API_KEY=sk-***
GOOGLE_API_KEY=AIza***
API_URL=http://localhost:5055
SURREAL_URL=ws://localhost:8000/rpc
How did it break?
.
Logs or Screenshots
📋 Complete GitHub Issue Content
Bug: Anthropic Claude models fail with "temperature and top_p cannot both be specified" error
Environment
- Open Notebook Version: v1-latest-single
- Docker Image: lfnovo/open_notebook:v1-latest-single
- Image SHA: sha256:08a018d217d564f24021edf07c6b6c0dd99c5272b1af9e6709d2c7081321edf0
- Image Created: 2025-10-24T21:34:15Z
- Platform: Windows 10/11 with Docker Desktop
- Python Version: 3.12.12
- SurrealDB Version: 2.3.10
Affected Models
All Anthropic Claude models fail:
- claude-sonnet-4-5-20250929 ❌
- claude-3-5-sonnet-20241022 ❌
- claude-3-haiku-20240307 ❌
Working Models (for comparison)
- gpt-4o (OpenAI) ✅
- All other OpenAI models ✅
Description
When attempting to chat using any Anthropic Claude model, the request fails with a 400 error from Anthropic's API.
The error indicates that both temperature and top_p parameters are being sent, which violates Anthropic's API
requirements.
Error Message: Error code: 400 - { 'type': 'error', 'error': { 'type': 'invalid_request_error', 'message': 'temperature and top_p cannot both be specified for this model. Please use only one.' }, 'request_id': 'req_011CUXDKK6e1cU6UZcRyCQuj' }
Steps to Reproduce
- Set up Open Notebook with Docker:

      docker run -d \
        --name open-notebook \
        -p 8502:8502 -p 5055:5055 \
        -v ./notebook_data:/app/data \
        -v ./surreal_data:/mydata \
        -e ANTHROPIC_API_KEY=sk-ant-*** \
        -e API_URL=http://localhost:5055 \
        -e SURREAL_URL=ws://localhost:8000/rpc \
        -e SURREAL_USER=root \
        -e SURREAL_PASSWORD=root \
        -e SURREAL_NAMESPACE=open_notebook \
        -e SURREAL_DATABASE=production \
        lfnovo/open_notebook:v1-latest-single

- Access web UI: http://localhost:8502
- Add Claude model via Settings → Models:
  - Click "Add Model" under "Language Models"
  - Provider: Anthropic
  - Model Name: claude-sonnet-4-5-20250929
  - Save
- Create a notebook and upload a document
- Try "Chat with Notebook" → Select Claude model
- Send any message → Error occurs: "Failed send message"
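For convenience, the same reproduction environment expressed as a docker-compose file (a direct, untested translation of the docker run command above):

```yaml
services:
  open-notebook:
    image: lfnovo/open_notebook:v1-latest-single
    ports:
      - "8502:8502"
      - "5055:5055"
    volumes:
      - ./notebook_data:/app/data
      - ./surreal_data:/mydata
    environment:
      ANTHROPIC_API_KEY: sk-ant-***
      API_URL: http://localhost:5055
      SURREAL_URL: ws://localhost:8000/rpc
      SURREAL_USER: root
      SURREAL_PASSWORD: root
      SURREAL_NAMESPACE: open_notebook
      SURREAL_DATABASE: production
```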
Logs
Full Error Log:
2025-10-27 06:15:41.268 | ERROR | api.routers.chat:execute_chat:384 -
Error executing chat: Error code: 400 - {
'type': 'error',
'error': {
'type': 'invalid_request_error',
'message': 'temperature and top_p cannot both be specified for this model. Please use only one.'
},
'request_id': 'req_011CUXDKK6e1cU6UZcRyCQuj'
}
INFO: 172.17.0.1:42572 - "POST /api/chat/execute HTTP/1.1" 500 Internal Server Error
Database Error Record:
UPDATE ONLY command:k10vtfr1yplb58q3wzfr MERGE {
error_message: '',
result: {
command_id: 'command:k10vtfr1yplb58q3wzfr',
embedded_chunks: 0,
error_message: "Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message':
'temperature and top_p cannot both be specified for this model. Please use only one.'}, 'request_id':
'req_011CUXDJ9d6hyZUgDThNfLaB'}",
execution_metadata: {
app_name: 'open_notebook',
command_name: 'process_source',
started_at: '2025-10-27T06:15:16.708505'
},
success: false
},
status: 'completed'
}
Root Cause
Open Notebook is sending both temperature and top_p parameters to the Anthropic API when making chat completions.
According to https://docs.anthropic.com/en/api/complete, only one of these parameters should be specified: "You should either alter temperature or top_p, but not both."
API Behavior Comparison:
| Provider | Accepts Both Parameters |
|---|---|
| OpenAI | ✅ Yes (both work) |
| Anthropic | ❌ No (only one allowed) |
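To make the failure mode concrete, here is a reconstruction of the request body shape, together with a check for the constraint. The field values are illustrative assumptions, not captured traffic:

```python
# Reconstructed shape of the request Open Notebook sends to Anthropic;
# the concrete values here are illustrative, not from the real payload.
failing_body = {
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Summarize this notebook."}],
    "temperature": 0.7,  # set by Open Notebook
    "top_p": 0.9,        # also set -> Anthropic responds with the 400 above
}

def violates_anthropic_rule(body):
    """True when both sampling knobs are present, which Anthropic rejects."""
    return "temperature" in body and "top_p" in body
```

Dropping either of the two keys yields a request Anthropic accepts.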
Expected Behavior
- Claude models should work successfully with either temperature OR top_p (not both)
- The application should conditionally send parameters based on the provider
Actual Behavior
- OpenAI models: ✅ Work perfectly
- Claude models: ❌ All fail with 400 error
- User sees: "Failed send message" in UI
Proposed Solution
Modify the LLM provider implementation to handle Anthropic differently:
Option 1: Provider-specific parameter handling

    if provider == "anthropic":
        # Only send temperature, exclude top_p
        params = {"temperature": temperature}
    elif provider == "openai":
        # Can send both
        params = {"temperature": temperature, "top_p": top_p}
Option 2: Configuration-based approach Allow users to specify which parameter to use per model in the UI settings.
Option 3: Use only temperature globally Since temperature is more commonly used, only send temperature to all providers (safest approach).
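Option 1 could also be generalized with a per-provider whitelist; the names below are illustrative, not Open Notebook's actual internals:

```python
# Hypothetical whitelist of sampling parameters each provider may receive.
ALLOWED_SAMPLING_PARAMS = {
    "openai": {"temperature", "top_p"},
    "anthropic": {"temperature"},  # only one of temperature/top_p is allowed
}

def filter_params(provider, **params):
    """Drop sampling kwargs the provider does not accept (and None values)."""
    allowed = ALLOWED_SAMPLING_PARAMS.get(provider, {"temperature"})
    return {k: v for k, v in params.items() if k in allowed and v is not None}
```

For example, `filter_params("anthropic", temperature=0.7, top_p=0.9)` yields `{"temperature": 0.7}`. Defaulting unknown providers to temperature-only also covers Option 3's "safest approach".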
Additional Context
This is a common issue across multiple AI integration projects:
- https://github.com/n8n-io/n8n/issues/18304 - Same error with Anthropic models
- https://github.com/danny-avila/LibreChat/issues/3374 - Weird default API parameters for Anthropic
- https://github.com/BerriAI/litellm/issues/15097 - claude-sonnet-4-5 temperature/top_p conflict
Workaround
Currently using OpenAI GPT-4o as primary LLM, which works perfectly. Waiting for fix to use Claude models.
Configuration Details
Environment Variables
ANTHROPIC_API_KEY=sk-ant-***
OPENAI_API_KEY=sk-***
GOOGLE_API_KEY=AIza***
API_URL=http://localhost:5055
SURREAL_URL=ws://localhost:8000/rpc
SURREAL_USER=root
SURREAL_PASSWORD=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=production
Screenshots
(I will attach screenshots showing:)
- Models settings page with Claude models added
- "Failed send message" error in chat UI
- Browser console errors (if any)
Open Notebook Version
v1-latest (Docker)
Environment
Additional Context
Linking back to some previous reporting and discussion around this.
Thank you all for following up on this. New version resolves it for good.