bitxsw93
@LHB116 The OpenCL driver on the NVIDIA server may not support fp16. Which backend was the mnn-stable-diffusion repo running on previously?
> > @LHB116 The OpenCL driver on the NVIDIA server may not support fp16. Which backend was the mnn-stable-diffusion repo running on previously?
>
> It was probably running on the CPU (`Can't Find type=2 backend, use 0 instead`). After I changed the backend to CPU (`MNN_FORWARD_CPU`), I got a Segmentation fault.
>
> model resource path: mnn_taiyi model type is stable diffusion taiyi chinese version output img_name:...
It works fine in our local tests, so we cannot reproduce this. Please build a debug version and check whether it reports any errors.
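For reference, forcing a specific backend in MNN is done through `ScheduleConfig` when creating a session. A minimal sketch of the CPU-backend setup discussed above; the model filename and thread count here are placeholders, not values from this thread:

```cpp
#include <MNN/Interpreter.hpp>
#include <memory>

int main() {
    // Load the model (placeholder path, for illustration only).
    std::shared_ptr<MNN::Interpreter> interpreter(
        MNN::Interpreter::createFromFile("model.mnn"));
    if (!interpreter) {
        return 1;
    }

    // Request the CPU backend explicitly. When an unavailable backend is
    // requested (e.g. OpenCL on a machine without a usable driver), MNN
    // falls back to CPU and logs a message like
    // "Can't Find type=... backend, use 0 instead".
    MNN::ScheduleConfig config;
    config.type      = MNN_FORWARD_CPU;
    config.numThread = 4;  // placeholder thread count

    auto session = interpreter->createSession(config);
    return session != nullptr ? 0 : 1;
}
```

Building MNN with `-DCMAKE_BUILD_TYPE=Debug` as suggested above should make the segfault backtrace easier to read under a debugger.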
How do I run qwen3-8b with eagle3 in this repo? How should I choose `--model-type`?
You can try enabling memory-saving mode in the settings, and try using a lowercase prompt, like "a lovely cat".