data-driven-characters
Maximum context length needs to be configurable
When generating characters using gpt-3.5-turbo instead of gpt-4, I get the following error:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4215 tokens. Please reduce the length of the messages.
After a lot of trial and error, I worked around the problem by switching to gpt-3.5-turbo-16k.
This clearly relates to #5 and the need to work around GPT-4 being hardcoded in parts of the codebase.
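One possible shape for the fix, sketched below: map each supported model name to its context window and trim the oldest non-system messages until the conversation fits, rather than hardcoding a single model. The `MODEL_CONTEXT_LIMITS` values and the rough 4-characters-per-token estimate are my assumptions for illustration, not code from this repo; a real implementation would count tokens exactly with tiktoken.

```python
# Assumed context windows per model (illustrative, not from the repo).
MODEL_CONTEXT_LIMITS = {
    "gpt-4": 8192,
    "gpt-3.5-turbo": 4097,
    "gpt-3.5-turbo-16k": 16385,
}


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token. A real implementation
    # would use tiktoken for exact counts.
    return max(1, len(text) // 4)


def trim_messages(messages, model="gpt-3.5-turbo", reserve=512):
    """Drop the oldest non-system messages until the estimated token
    count fits the model's context window, leaving `reserve` tokens
    for the completion."""
    limit = MODEL_CONTEXT_LIMITS.get(model, 4097) - reserve
    kept = list(messages)

    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while total(kept) > limit and len(kept) > 1:
        # Preserve the system prompt at index 0; drop the next oldest.
        drop_index = 1 if kept[0]["role"] == "system" else 0
        kept.pop(drop_index)
    return kept
```

With this approach the model name (and hence its limit) becomes a single configurable parameter instead of being baked into the generation code, so switching between gpt-4, gpt-3.5-turbo, and gpt-3.5-turbo-16k would not require edits in multiple places.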