Jan Kaniewski
Needs context: which model, what you were doing when this call happened, etc.
You might need a newer llama.cpp release to match and support a newer model.
It's likely that 4.24 is compatible with 4.23. Try taking the 4.23 commit checkpoint, importing it into a 4.24 project, and hitting build. Unless there were big API changes it...
This does appear to be a bug. The original intent was to get a callback on the game thread when the server has finished; this callback doesn't get waited on, so it...
Disconnect your TCP before transitioning, or attach it to a game instance. It appears your callback returns data after its owner may have been released. Likely something like: https://github.com/getnamo/TCP-Unreal/blob/master/Source/TCPWrapper/Private/TCPClientComponent.cpp#L114
Nice catch on this error; if you make a pull request I'll merge it.