RiverZhou
Could you provide a download of the FP16-format weight files for llama.cpp? After quantizing to int4, there seems to be some loss of information.
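A minimal sketch (not qwen.cpp's or llama.cpp's actual quantization code) of block-wise int4 quantization of FP16 weights, showing the nonzero round-trip error that motivates keeping an FP16 copy of the weights:

```python
import numpy as np

def quantize_int4_blockwise(w: np.ndarray, block: int = 32) -> np.ndarray:
    """Quantize to signed 4-bit integers per block, then dequantize."""
    w = w.astype(np.float32).reshape(-1, block)
    # per-block scale maps the block's range onto [-8, 7]
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scale[scale == 0] = 1.0
    q = np.clip(np.round(w / scale), -8, 7)
    return (q * scale).reshape(-1)  # dequantized weights

rng = np.random.default_rng(0)
w16 = rng.standard_normal(32 * 1024).astype(np.float16)
w4 = quantize_int4_blockwise(w16)
rms = np.sqrt(np.mean((w16.astype(np.float32) - w4) ** 2))
print(f"RMS quantization error: {rms:.4f}")  # nonzero: int4 is lossy
```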
# Prerequisites

Please answer the following questions for yourself before submitting an issue.

- [x] I am running the latest code. Development is very rapid so there are no tagged...
After setting up the site following the README and opening http://localhost:3000, I cannot select a model.
The code supports ROCm now, but CMakeLists.txt has no ROCm section.
The official llama.cpp already supports ROCm; when will qwen.cpp support ROCm?