Results: 3 comments of hamperia4
Thank you for the feedback. Yes, I agree; given the nmap responses, this can be expected for OpenAI. Just out of curiosity, even if you use bigger models like...
In my error I can see that OpenAI actually reports the limit: the model's maximum context length is 16385 tokens, but I requested 17216 tokens (14716 in your prompt,...
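The arithmetic behind that error can be reproduced with a minimal sketch. The 16385-token limit and the 14716 prompt tokens come from the error message above; the 2500-token completion budget is inferred (17216 − 14716), since the message is truncated here.

```python
# Reproduce the context-length check behind the OpenAI error message.
MODEL_CONTEXT_LIMIT = 16385  # model's maximum context length, from the error
prompt_tokens = 14716        # prompt size, from the error
completion_budget = 2500     # inferred: 17216 total minus 14716 prompt

requested = prompt_tokens + completion_budget
if requested > MODEL_CONTEXT_LIMIT:
    overflow = requested - MODEL_CONTEXT_LIMIT
    print(f"Requested {requested} tokens exceeds the "
          f"{MODEL_CONTEXT_LIMIT}-token limit by {overflow}.")
```

The usual fixes are to shorten the prompt, lower the requested completion size, or switch to a model with a larger context window.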
Hello, as tested across multiple models this is still the same, so you are correct. Is there any update?