ZakFahey
Absolutely, go ahead. I was actually going to update the core library next weekend to fix some stuff, so maybe we can collaborate on that and hold off until...
By the way, I'm working on a new release of the base EasyCommands library and should have it out by the end of today. It will add things like...
What's the timeline you expect for integrating the plugin? Is this something you want to do before the full release of the 1.4-compatible version of TShock?
Yeah, I figured. I'm just deciding whether to obsolete the easycommands tshock repo now or give it one more version-tick upgrade. But yeah, you'd basically just want to swap out...
Oh apparently somebody else already has this in progress: https://github.com/Pryaxis/TShock/pull/1679. Did that venture not pan out for some reason? Edit: well it does look like it hasn't been updated since...
Call me biased, but I think it would generally be better to use the EasyCommands library rather than have TShock implement its own thing. It means that there's less code...
EasyCommands is reflection-based, but converting it to a codegen-based system could be an option down the road. That being said, I've used the library on my own servers and...
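For anyone curious what I mean by "reflection-based", here's a minimal sketch of the general idea. This is not EasyCommands' actual API; the `[Command]` attribute, handler signature, and dispatcher below are made up for illustration:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Hypothetical attribute for the example; the real library's attributes differ.
[AttributeUsage(AttributeTargets.Method)]
class CommandAttribute : Attribute
{
    public string Name { get; }
    public CommandAttribute(string name) => Name = name;
}

class ExampleCommands
{
    [Command("kick")]
    public static void Kick(string playerName) =>
        Console.WriteLine($"Kicking {playerName}");
}

class Dispatcher
{
    // Scan the assembly for [Command] methods, find the one matching the
    // command name, convert the string arguments to the parameter types,
    // and invoke it via reflection.
    public static void Run(string commandName, string[] args)
    {
        var method = Assembly.GetExecutingAssembly()
            .GetTypes()
            .SelectMany(t => t.GetMethods(BindingFlags.Public | BindingFlags.Static))
            .FirstOrDefault(m => m.GetCustomAttribute<CommandAttribute>()?.Name == commandName);

        if (method == null)
        {
            Console.WriteLine($"Unknown command: {commandName}");
            return;
        }

        var converted = method.GetParameters()
            .Select((p, i) => Convert.ChangeType(args[i], p.ParameterType))
            .ToArray();
        method.Invoke(null, converted);
    }
}

class Program
{
    static void Main() => Dispatcher.Run("kick", new[] { "SomePlayer" });
}
```

A codegen (source generator) version would build that same command lookup at compile time instead of scanning assemblies at runtime, which is where the potential perf win would come from.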
I see that this issue isn't particularly active, but I just want to put out there that if this is ever taken up, please for the sake of backwards compatibility...
Seems like https://github.com/OkGoDoIt/OpenAI-API-dotnet/pull/83 would cover this use case, so it would be nice to merge. If not, I guess I could depend on that fork for my project.
I've run into this limitation as well. As a workaround, you can always just get the number of tokens in your conversation using https://github.com/dluc/openai-tools and calculate it yourself. Prompt tokens...
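Roughly, the calculation is: encode each message, sum the token counts, and subtract that from the model's context limit to get the room left for the completion. A sketch below; I'm writing the `GPT3Tokenizer.Encode` call from memory, so check the openai-tools README for the exact namespace and signature, and note that the chat format adds a few overhead tokens per message that this doesn't account for:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using AI.Dev.OpenAI.GPT; // from dluc/openai-tools; namespace/class assumed, verify against its README

class TokenBudget
{
    // Rough prompt-token estimate: encode each message and sum the counts.
    // Per-message chat-format overhead isn't included, so leave some headroom.
    public static int CountTokens(IEnumerable<string> messages) =>
        messages.Sum(m => GPT3Tokenizer.Encode(m).Count);

    static void Main()
    {
        var conversation = new List<string>
        {
            "You are a helpful assistant.",
            "Summarize the plugin changelog for me."
        };

        int promptTokens = CountTokens(conversation);
        const int contextLimit = 4096; // model-dependent
        int maxCompletionTokens = contextLimit - promptTokens;

        Console.WriteLine($"Prompt: {promptTokens} tokens, leaving {maxCompletionTokens} for the completion.");
    }
}
```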