StableDiffusionTelegram
Debian server
I installed PyTorch for CPU on a Debian server and got this error:
Pipelines loaded with torch_dtype=torch.float16 cannot run with cpu or mps device. It is not recommended to move them to cpu or mps as running them will fail. Please make sure to use a cuda device to run the pipeline in inference. due to the lack of support for float16 operations on those devices in PyTorch. Please remove the torch_dtype=torch.float16 argument, or use a cuda device to run inference.
Does anyone have any solution for this?
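The error means the pipeline's weights were loaded in half precision (`torch_dtype=torch.float16`), which diffusers refuses to run on CPU because many float16 operations are unsupported or unreliable there. The fix the message itself suggests is to drop the `torch_dtype=torch.float16` argument so the weights stay in `float32` (or to cast an already-loaded pipeline back with `.float()`). A minimal torch-only sketch of that same conversion, using a toy linear layer as a stand-in for the pipeline:

```python
import torch

# A tiny stand-in for a model whose weights were loaded in half precision.
layer = torch.nn.Linear(4, 4).half()          # weights are torch.float16
x = torch.randn(1, 4, dtype=torch.float16)

# The CPU-safe configuration: convert the weights (and inputs) to float32,
# which is equivalent to loading the pipeline without torch_dtype=torch.float16.
layer = layer.float()
y = layer(x.float())

print(y.dtype)  # torch.float32
```

For the bot itself, the change is to the `StableDiffusionPipeline.from_pretrained(...)` call: remove the `torch_dtype=torch.float16` keyword argument wherever the repo loads the model, then move the pipeline with `.to("cpu")`. Note that full-precision CPU inference will be slow compared to a CUDA GPU.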