ml-stable-diffusion
Error calling plan_submit in batch processing.
I guess I should have given up by now, but I have not been able to get it to run on an M1 iPad Pro or an M1 Mac. I just updated the Mac to Ventura and reinstalled conda, brew, everything, but it fails:
(coreml_stable_diffusion) swift run StableDiffusionSample "a photo of an astronaut riding a horse on mars" --resource-path mlpackages384/Resources --seed 93 --output-path output
Building for debugging...
[59/59] Linking StableDiffusionSample
Build complete! (10.72s)
Loading resources and creating pipeline
(Note: This can take a while the first time using these resources)
Sampling ...
inputLength: 77 inputShape: [1, 77]
inputLength: 77 inputShape: [1, 77]
2022-12-04 18:02:11.377 StableDiffusionSample[28003:235755] Error calling plan_submit in batch processing.
zsh: trace trap  swift run StableDiffusionSample --resource-path --seed 93 --output-path
Running into the same issue
While I appreciate the work that went into this project, it feels painful compared to most of the deep learning I do with PyTorch.
I dream of the day we can just use a few huggingface lines in Xcode.
While running
swift run StableDiffusionSample "a photo of an astronaut riding a horse on mars" --resource-path mlpackages384/Resources --seed 93 --output-path output
it throws the above error, but when running the pipeline from an Xcode project it works fine without any error (runs on an M1 8GB Mac Pro but crashes on an M1 iPad :( )
Any chance you could link your Xcode project setup? I'm having the opposite problem: it works from the command line, but I can't get a project to work.
I haven't pushed my project yet, but you can refer to https://github.com/ynagatomo/ImgGenSD2. @ynagatomo has put it together in a much better way.
I'm running into the same problem (M1 Max, 64 GB RAM).
Interestingly, it works fine with 1.4 (the first model I installed), but I get this plan_submit error on 1.5 and 2.
I'm running it from the command line:
swift run StableDiffusionSample --resource-path ../models/coreml-stable-diffusion-2-base_original_compiled --compute-units all "a photo of an astronaut riding a horse on mars" --output-path ../final_image
Result:
2022-12-10 12:48:39.928 StableDiffusionSample[8088:337787] Error calling plan_submit in batch processing.
zsh: trace trap swift run StableDiffusionSample --resource-path --compute-units all
Same here. The compiled models of v1.5 and v2 from Hugging Face didn't work for me.
The v2 model I converted myself, following Apple's instructions, works well.
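For reference, the conversion step looks roughly like this. This is a sketch based on the repo's README at the time; the exact flag set may differ between versions, and mlpackages-sd2 is just an example output directory:

```shell
# Convert Stable Diffusion v2 (base) to Core ML packages, bundling the
# compiled resources so the Swift CLI can load them directly.
python -m python_coreml_stable_diffusion.torch2coreml \
    --model-version stabilityai/stable-diffusion-2-base \
    --convert-unet \
    --convert-text-encoder \
    --convert-vae-decoder \
    --convert-safety-checker \
    --bundle-resources-for-swift-cli \
    -o mlpackages-sd2
```

You then point --resource-path at the Resources folder inside the output directory.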
I am still running into this issue. I have PyTorch 1.12.1 and converted v2 of SD as the Apple docs describe.
Appending --compute-units cpuAndGPU worked for me (mentioned here).
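Concretely, that means adding the flag to the sample invocation from earlier in the thread. This restricts execution to CPU and GPU rather than the default compute units, which (if I understand the workaround correctly) avoids the code path where the crash happens:

```shell
# Same sample command as above, but restricting compute units to CPU+GPU.
swift run StableDiffusionSample \
    "a photo of an astronaut riding a horse on mars" \
    --resource-path mlpackages384/Resources \
    --compute-units cpuAndGPU \
    --seed 93 \
    --output-path output
```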