felixslu

12 comments of felixslu

@Barry-Delaney, I got an error when building w4a8_awq engines for llama-7b with the trtllm-build tool. My TRT-LLM version is v0.8.0. (By the way, the quantization stage works fine when using the quantize.py script.)...
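For context, the comment above refers to TensorRT-LLM's two-stage flow: quantize the checkpoint first, then build the engine. A minimal sketch of that flow is below; the model and output directory paths are hypothetical placeholders, and the exact flags should be checked against the examples shipped with the TRT-LLM v0.8.0 release.

```shell
# Stage 1 (reported as working): quantize the HF checkpoint to w4a8_awq.
# quantize.py lives under examples/quantization/ in the TRT-LLM repo.
python examples/quantization/quantize.py \
    --model_dir ./llama-7b-hf \
    --qformat w4a8_awq \
    --output_dir ./llama-7b-w4a8-ckpt

# Stage 2 (reported as failing): build the TensorRT engine
# from the quantized checkpoint.
trtllm-build \
    --checkpoint_dir ./llama-7b-w4a8-ckpt \
    --output_dir ./llama-7b-w4a8-engine
```

This is only a command sketch under the stated assumptions, not a reproduction of the reporter's exact invocation, which is not included in the comment.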

internlm2 hits this error as well, on v0.9.dev0 / mlc_ai_nightly_cu122-0.15.dev519-cp310-cp310-manylinux_2_28_x86_64.whl.