Lambda14
I have this code:

```python
sendMsg = MsgProto(EMsg.ClientFriendMsg)
sendMsg.body.steamid = 76561198864244185
sendMsg.body.chat_entry_type = 1
sendMsg.body.message = str.encode("HW!")
sendMsg.body.rtime32_server_timestamp = int(time.time())
client.send(sendMsg)
```

But it doesn't work; I get this error: `TypeError: Expected...`
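For reference, here is a minimal self-contained sketch of the snippet above, assuming the ValvePython `steam` package and an already logged-in `SteamClient`; the imports, the `cli_login()` call, and the use of the `EChatEntryType` enum are my additions for a runnable example, not part of the original post:

```python
import time

from steam.client import SteamClient
from steam.core.msg import MsgProto
from steam.enums import EChatEntryType
from steam.enums.emsg import EMsg

client = SteamClient()
client.cli_login()  # interactive login; assumed here so the sketch is runnable

# Build the ClientFriendMsg protobuf and fill in its fields.
sendMsg = MsgProto(EMsg.ClientFriendMsg)
sendMsg.body.steamid = 76561198864244185               # recipient's 64-bit SteamID
sendMsg.body.chat_entry_type = EChatEntryType.ChatMsg  # same numeric value as 1
sendMsg.body.message = "HW!".encode("utf-8")           # message field expects bytes
sendMsg.body.rtime32_server_timestamp = int(time.time())

client.send(sendMsg)
```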
@zwpaper Hello, thanks for the reply, but still, why does this error not occur when running the model via Docker?
OK, now I tried to run the model Qwen2.5-Coder-3B **without a chat model**. It runs successfully, but when I send a request, I get this error:

```
2025-04-19T11:35:24.909336Z WARN llama_cpp_server::supervisor: crates\llama-cpp-server\src\supervisor.rs:124: llama-server...
```
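For context, this is roughly how I send the request: a minimal sketch assuming a locally running Tabby server on the default port and its `/v1/completions` endpoint; the host, port, and payload contents are my assumptions, not taken from the original report:

```python
import requests

# Hypothetical reproduction of "send a request": a code-completion call
# against a local Tabby server (assumed default address http://localhost:8080).
payload = {
    "language": "python",
    "segments": {"prefix": "def fib(n):\n    ", "suffix": ""},
}

resp = requests.post("http://localhost:8080/v1/completions", json=payload, timeout=30)
print(resp.status_code)
print(resp.text)
```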
@zwpaper Hello, CPU: E5-2690 v3