Slowly-Grokking
Just sharing some general thoughts here. I'm working on something similar to address some of this, but it's too early to go into details. Whoever controls the most powerful 'AI'...
> `read issues_data.json` and after 5 minutes of the data it's reading scrolling up the screen, I get an error that reads: `openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens,`...
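For anyone hitting the same wall: the error means the file's contents were sent to the model in one request that exceeded its context window. A minimal sketch of one workaround, pre-splitting the text before each request, is below. Everything here is an assumption for illustration (the `chunk_text` helper and the chars-per-token ratio are hypothetical, not part of the project's code); the 8191-token figure comes from the error message above.

```python
# Hypothetical sketch: split a large text into pieces that should each
# fit under the model's context limit before sending them separately.
# MAX_TOKENS comes from the error message above; CHARS_PER_TOKEN is a
# rough assumption (~4 chars per token for English text), not an exact
# tokenizer count.
MAX_TOKENS = 8191
CHARS_PER_TOKEN = 4


def chunk_text(text: str,
               max_tokens: int = MAX_TOKENS,
               chars_per_token: int = CHARS_PER_TOKEN) -> list[str]:
    """Split `text` into chunks whose estimated token count fits the limit."""
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


# Each chunk could then be sent as its own request instead of one
# oversized one.
chunks = chunk_text("x" * 100_000)
```

A real implementation would count tokens with the model's actual tokenizer rather than a character heuristic, but the shape of the fix is the same: never hand the API more than the window holds.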
> Seems to be covered by #2801

This issue is described somewhat poorly, but the main problem the OP is reporting is not the chunking issue. It's the failure to...
> Thanks for correcting me @Slowly-Grokking

Oh, no worries. I really just wanted to make a note of this, even if only for myself to see later.