Error 400 when using OpenAI API in Automations
Discussed in https://github.com/Budibase/budibase/discussions/13592
Originally posted by fueledbyEmin on May 2, 2024:

I'm using the tutorial at https://docs.budibase.com/docs/openai to send a question to the OpenAI API.
But I get Error 400:
{ "success": false, "response": "Error: Request failed with status code 400" }
According to the OpenAI forums, the problem may be in the JSON payload, which should include the max_tokens and temperature params. What's more, max_tokens and temperature should be set to specific values, for example max_tokens=64 and temperature=0.5.
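For reference, here is a minimal sketch of a Chat Completions request that sets both params explicitly. The model name and values are only illustrative (not necessarily what Budibase sends internally); replicating the call outside Budibase surfaces the API's actual error message, which the generic "status code 400" from the automation hides.

```typescript
// Sketch of a Chat Completions request with max_tokens and temperature set.
// Model name and values are illustrative, not Budibase's internal request.
const apiKey = process.env.OPENAI_API_KEY;

async function askOpenAI(question: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // assumed model id; a 400/404 often points to a bad model name
      messages: [{ role: "user", content: question }],
      max_tokens: 64,         // values suggested in the OpenAI forum thread
      temperature: 0.5,
    }),
  });

  if (!res.ok) {
    // The response body explains *why* the request was rejected.
    throw new Error(`OpenAI error ${res.status}: ${await res.text()}`);
  }

  const data = await res.json();
  return data.choices[0].message.content;
}
```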
If Budibase correctly structures OpenAI API requests in the background, what could be the problem here?
I'm self-hosting using docker-compose.
CSE Team findings
- GPT-3.5 Turbo returns a 400 response.
- GPT-4 returns a 404 response.
Possibly the internal model name has been deprecated in the API and may need to be updated.
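One way to verify this is to ask the API which model ids the key can actually use. The sketch below lists them via the standard GET /v1/models endpoint; the filtering is just an example. A 404 for "gpt-4" would be consistent with the key not having access to that model, or with the id Budibase sends no longer resolving.

```typescript
// Sketch: list the model ids available to an API key, to check whether the
// name Budibase sends (e.g. "gpt-4") still resolves for this account.
const apiKey = process.env.OPENAI_API_KEY;

async function listGptModels(): Promise<string[]> {
  const res = await fetch("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`OpenAI error ${res.status}: ${await res.text()}`);
  }
  const data = await res.json();
  return data.data
    .map((m: { id: string }) => m.id)
    .filter((id: string) => id.startsWith("gpt"));
}

listGptModels().then((ids) => console.log(ids.join("\n")));
```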