Daniel J Walsh
Is this something we care about, or does llama.cpp need to get this functionality so we can take advantage of it?
Should stable-diffusion be considered a runtime then?
A PR to add stable-diffusion.cpp; one issue will be the command line and how much disk space this adds. It could be added as a separate image if it is too big. Would...
@jtligon are you interested in working on this?
@jtligon do you still consider this issue open?
Seems to be something to do with the first time you use python3 on a Mac. I don't recall seeing this on my Mac either, but I have had it...
Since we have not heard about this since then, I am closing. Reopen if you can recreate the issue with the latest ramalama.
Yes, vllm can only do serve at this point.
Not even sure how well that works, either.
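For context, a minimal sketch of what that looks like today, assuming the --runtime option accepts vllm and using a placeholder model name:

    # serve a model with the vllm runtime; serve is the only verb vllm handles at this point
    ramalama --runtime=vllm serve tinyllama

    # other subcommands (e.g. run) still go through the default llama.cpp runtime
    ramalama run tinyllama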
@dwrobel still have this issue?