Sek Davara
@Jaydipsinhv That would be a good idea if this feature gets added to this repo.
I am getting the same error with model `gpt-4-0314`, `max_token = 2048`, and `request_timeout = 240`, on both my local and live servers. Yesterday this was working fine.
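For context, here is a minimal sketch of the kind of call described above, assuming the pre-1.0 `openai` Python library; the prompt content is a placeholder and not from the original report:

```python
import openai

# Hypothetical request matching the parameters mentioned above:
# model gpt-4-0314, max_tokens = 2048, request_timeout = 240 seconds.
response = openai.ChatCompletion.create(
    model="gpt-4-0314",
    messages=[{"role": "user", "content": "Hello"}],  # placeholder prompt
    max_tokens=2048,
    request_timeout=240,  # client-side timeout in seconds
)

print(response["choices"][0]["message"]["content"])
```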
Today I am getting the same error every time with model `gpt-4-0314`. I also set the `request_timeout` to 240, and even after that I am still getting the same error every...
@Suprimepl which model are you using in your script?
Is there any update on this?
This much-needed feature is now available.