Calamari
But I think a nicer error message might be a developer-experience improvement :)
No worries about the timing. :-) We have a workaround in place right now, but of course I'd like to get rid of it again. Thanks for taking care of this.
Honestly, I no longer remember how I worked around this, since I don’t work on that project anymore :shrug:
Sure. If I can help, I will :-) But one thought on `onEntityAdded`: why is that a property on the Query? That seems dangerous, since different systems could set it,...
Not quite sure what you mean by the last part. What I like about ecsy, for example, is the way you define query additions and removals. It looks like this:...
I'm not quite sure about models other than OpenAI's, but wouldn't a relatively easy solution be to add a struct containing either the real response or the headers...
I was also thinking along the lines of the meta info of "how many tokens are left and when do they reset". At least for ChatGPT, they say they provide...
A callback sounds like a nice idea. Looking at the [API docs](https://platform.openai.com/docs/guides/rate-limits/rate-limits-in-headers), at least for OpenAI this information about how many tokens are left is returned in the headers as...
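To make the callback idea concrete, here is a small sketch. The header names are the ones OpenAI documents for rate limits; the `extractRateLimits` function and the callback shape are just my suggestion, not anything that exists in a library:

```typescript
// Sketch: read OpenAI's documented rate-limit headers from a response
// and pass them to a caller-supplied callback. Header names come from
// the OpenAI rate-limit docs; everything else here is hypothetical.
interface RateLimitInfo {
  remainingRequests: number;
  remainingTokens: number;
  resetTokens: string; // a duration string, e.g. "6m0s", per the docs
}

function extractRateLimits(headers: Record<string, string>): RateLimitInfo | null {
  const remainingRequests = headers["x-ratelimit-remaining-requests"];
  const remainingTokens = headers["x-ratelimit-remaining-tokens"];
  const resetTokens = headers["x-ratelimit-reset-tokens"];
  if (remainingRequests === undefined || remainingTokens === undefined || resetTokens === undefined) {
    return null; // other providers may not send these headers
  }
  return {
    remainingRequests: Number(remainingRequests),
    remainingTokens: Number(remainingTokens),
    resetTokens,
  };
}

// Hypothetical wiring: a client could invoke the callback after each request.
type RateLimitCallback = (info: RateLimitInfo) => void;

const onRateLimit: RateLimitCallback = (info) =>
  console.log(`tokens left: ${info.remainingTokens}, reset in ${info.resetTokens}`);

const info = extractRateLimits({
  "x-ratelimit-remaining-requests": "99",
  "x-ratelimit-remaining-tokens": "149950",
  "x-ratelimit-reset-tokens": "6m0s",
});
if (info) onRateLimit(info);
```

Returning `null` when the headers are absent keeps the struct-style "either the real response or the headers" idea workable for providers that don't expose this information.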
Yes, I'm currently interested in the time-based rate limits. There is currently no way to track them, so the server cannot throttle anything and doesn't know when to...
If I am reading this correctly, the canonical data checks whether the positions are on the board and throws errors if they aren't. So I would also...
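For reference, a minimal sketch of the kind of check the canonical data seems to imply, assuming an 8×8, zero-indexed board (the function name and bounds are my assumption, not taken from the canonical data):

```typescript
// Hypothetical on-board check, assuming an 8x8 board with
// zero-indexed rows and columns; adjust bounds for the actual exercise.
function assertOnBoard(row: number, column: number): void {
  if (row < 0 || row > 7) {
    throw new Error(`row ${row} is off the board`);
  }
  if (column < 0 || column > 7) {
    throw new Error(`column ${column} is off the board`);
  }
}

assertOnBoard(0, 7); // valid corner, no error
```

A call like `assertOnBoard(8, 0)` would throw, matching the error cases described in the canonical data.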