Liangqi zhu — 6 comments

> Please provide the exact command showing how you run the inference server, plus all the configs and logs (see ~/.ktransformers/logs/) etc.
>
> [EDIT]: Also provide the info regarding your...

> [@Adadxz](https://github.com/Adadxz) can you try with --log_level DEBUG parameter?

Sure. Here is the log with the DEBUG parameter: [rpc_debug.log](https://github.com/user-attachments/files/21105446/rpc_debug.log)

No, I just thought the V2 model was relatively small and would load quickly, which would make debugging this issue faster. However, I'm experiencing the same problem with V3.

I just changed the parameter from `ktransformers` to `balance_serve`, and this issue occurs. If I use the `ktransformers` parameter, it works well.
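For context, a minimal sketch of how the two backends might be compared. This is an assumption based on the ktransformers server's documented `--backend_type` option; the exact script path, model paths, and port are placeholders and may differ in your setup:

```shell
# Hypothetical invocation sketch -- adjust paths/flags to your installation.
# Working case: the "ktransformers" backend.
python -m ktransformers.server.main \
  --model_path /path/to/DeepSeek-V3 \
  --backend_type ktransformers \
  --port 10002

# Failing case from this report: same command, only the backend switched,
# with DEBUG logging enabled to capture the error (logs under ~/.ktransformers/logs/).
python -m ktransformers.server.main \
  --model_path /path/to/DeepSeek-V3 \
  --backend_type balance_serve \
  --log_level DEBUG \
  --port 10002
```

Keeping every other flag identical between the two runs isolates the backend switch as the only variable, which is what makes the resulting DEBUG log useful for maintainers.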