JarvisZeng
It works now, but the traffic gap is quite large: the client shows 500M while the server only sees 5M....
Also, I'd like to test BIND, but UDP doesn't work with `./tcpcopy -x 53-192.168.1.78:53`. How can I fix this? Thanks @zengjice!

`./configure --enable-udp`
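The reply above points at rebuilding tcpcopy with UDP support. A minimal sketch of the full rebuild-and-replay sequence, assuming a standard autotools source tree (the install step and server IP are taken from the thread; run from the tcpcopy source directory):

```shell
# Rebuild tcpcopy with UDP support (flag from the reply above)
./configure --enable-udp
make
sudo make install

# Then replay DNS (UDP port 53) traffic to the test server, as in the question
./tcpcopy -x 53-192.168.1.78:53
```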
I had the same problem; here are my commands:

```bash
cmake -DLUA_INCLUDE_DIR=~/openresty/luajit/include/luajit-2.1 \
      -DLUA_LIBRARIES=~/openresty/luajit/lib \
      -DUSE_LUAJIT=ON -DUSE_LUA=OFF
make
```
> New model dead. I need someone to share me the plus model so I can implement it I can help you. How can I contact you?
@shihaobai Thank you, here is some more detail:

```bash
python -m lightllm.server.api_server --model_dir /data/models/qwen/Qwen-14B-Chat-Int4 --trust_remote_code --max_total_token_num 3000 --max_req_input_len 2048 --max_req_total_len 2100 --tokenizer_mode auto --disable_log_stats --tp 2 --mode ppl_int4weight
```

```bash
...
```
@shihaobai It's the same error; the run hasn't reached the code that distinguishes modes yet.
> Qwen-14B-Chat-Int4 weights has not been supported yet. 👌🏻