Morgan
I faced the same problem. After checking, it turned out to be just a network problem, so go to https://github.com/snakers4/silero-vad/tree/v4.0, download the zip file into /home/your_username/.cache/torch/hub, unzip it, and change the name...
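A minimal sketch of the workaround above, assuming the standard torch hub cache location. The directory name torch.hub expects after the rename is not given in the comment and depends on the repo/ref, so check an existing entry in your cache; the name used below is hypothetical.

```shell
# Prepare the torch hub cache directory (path follows the comment above).
HUB_DIR="$HOME/.cache/torch/hub"
mkdir -p "$HUB_DIR"

# Run these manually (they need network access):
#   wget -O "$HUB_DIR/v4.0.zip" https://github.com/snakers4/silero-vad/archive/refs/tags/v4.0.zip
#   unzip "$HUB_DIR/v4.0.zip" -d "$HUB_DIR"
#   mv "$HUB_DIR/silero-vad-4.0" "$HUB_DIR/snakers4_silero-vad_v4.0"   # assumed target name

echo "$HUB_DIR"
```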
Suggested approach (not actually verified):
- Use a diffusers pipeline to load the .safetensors file.
- Split out the four models: text_encoder, unet, vae_encoder, vae_decoder.
- Trace each one into a .pt model, or convert it to ONNX.
- Convert the resulting .pt or ONNX files to .rknn with rknn-toolkit2.
- Analyze accuracy and performance.
- Write the overall SD code; for the inference part, use rknn-toolkit2-lite for Python inference, or do the inference in C.
It seems the TPU is hanging. Could you check that your libbmrt.so path is correct?
I think you need to use `/data/LLM-TPU/support/lib_soc/libbmrt.so` or `/data/LLM-TPU/support/lib_pcie/libbmrt.so`, depending on what kind of device you use.
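A sketch of pointing the dynamic loader at the right build, assuming the runtime picks up libbmrt.so via LD_LIBRARY_PATH (if your setup loads it by an explicit path instead, adjust that path directly):

```shell
# Use lib_soc on an SoC board; swap in lib_pcie on a PCIe card.
export LD_LIBRARY_PATH=/data/LLM-TPU/support/lib_soc:$LD_LIBRARY_PATH
```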
It seems your input has reached the maximum input shape of the bmodel. Because the bmodel is a static model, the input shape is fixed at conversion time, so you can...
I think it is the bounding problem of tpu-mlir, which is not supported yet.
Thanks @TonyRoyRoy, this works for me on Ubuntu 22.04.
It should be a problem with the Python package environment.
Is it that the embedding model inexplicably goes offline while the LLM is still up?
Many thanks. Right after upgrading I found that emails could not be viewed, so I rolled back immediately.

> I got hit by this too, but it has been purged from the deepin repositories, so there is no other way. Download it manually and install it:
>
> wget -U 'Debian APT-HTTP/1.3 (2.9.17)' https://com-store-packages.uniontech.com/appstorev20/pool/appstore/c/com.qq.weixin.work.deepin/com.qq.weixin.work.deepin_4.1.32.6005deepin11_amd64.deb