morpheuslord
> @morpheuslord Did you ever figure out what the issue was? I'm experiencing this with the docker build

Nah man, I am using an older release for my tests....
So I am looking into this issue. The thing is, OpenAI has a token limit, and that's common for all models. The only way to mitigate this is to make...
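For anyone hitting this before a fix lands, here is a minimal sketch of that kind of mitigation, assuming the tool is Python and using the `tiktoken` library; the model name, budgets, and function name below are illustrative, not the project's actual code:

```python
import tiktoken  # pip install tiktoken

MODEL = "gpt-3.5-turbo"   # assumption: the chat model in use
CONTEXT_LIMIT = 16385     # context window mentioned in the error below
RESPONSE_BUDGET = 1024    # tokens reserved for the model's answer (illustrative)

def chunk_by_tokens(text: str, max_tokens: int) -> list[str]:
    """Split text into pieces of at most max_tokens tokens each."""
    enc = tiktoken.encoding_for_model(MODEL)
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

# Usage: feed each chunk of a large Nmap report to the API separately,
# then merge the partial analyses afterwards.
nmap_output = "..."  # the raw scan text goes here
chunks = chunk_by_tokens(nmap_output, CONTEXT_LIMIT - RESPONSE_BUDGET)
```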
> Thank you for the feedback.
> Yes, I agree; due to the Nmap responses this can be expected for OpenAI.
> Just out of curiosity, even if you use...
> On my error I can see that OpenAI actually provides the limit: the maximum context length is 16385 tokens, but I requested 17216 tokens (14716 in your...
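That error spells out the arithmetic: prompt tokens plus the requested completion budget must stay within the context window. A rough sketch of a pre-flight check, again assuming Python and `tiktoken` (function names and defaults are illustrative, not from the project):

```python
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")  # assumed model

def fits_context(prompt: str, completion_budget: int, limit: int = 16385) -> bool:
    """True when prompt tokens plus the completion budget fit the window."""
    # The request above failed because the total (17216 tokens)
    # exceeded the 16385-token context length.
    return len(enc.encode(prompt)) + completion_budget <= limit

def trim_to_fit(prompt: str, completion_budget: int, limit: int = 16385) -> str:
    """Drop tokens from the end of the prompt until the request fits."""
    tokens = enc.encode(prompt)
    return enc.decode(tokens[:max(0, limit - completion_budget)])
```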
I was kinda busy with uni exams and stuff, so I was not able to work on this 😅. I will work on it when I get the time.
Can we compile this to work on any Windows machine?
Same issue. Did you find any updates?
what?