ErroneousBosch

Results 43 comments of ErroneousBosch

Actually I do get a startup error on the 12.3.2:
```
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch 2.0.1+cu118 with CUDA 1108 (you have 2.1.0+cu121) Python...
```

Dug around some more and there are a couple of barriers: 1. the nvidia/cuda images only go up to Ubuntu 22.04 2. Ubuntu 22.04 only officially goes up to Python 3.10.12...

@mrbrdo Python 3.11 is needed to get xformers to work, since they need one sub-point version past what Ubuntu 22.04 has, and the PPA doesn't have Python 3.10.13+. I so...

Testing on my machine, stability seems pretty good, and if anything it seems to run faster than it did before. Edit: So it will still sometimes give me a crash...

Stability is really rough when changing models if you are going a bit rapid-fire. Seems like there is some kind of settling period needed between changes, specifically if you...

@mrbrdo I end up just tapping it before the error message disappears, which seems to reset its counter each time. Eventually the container restarts and it reconnects. I had this...

I tried it in debug mode, but all I get is that it was killed: `/content/entrypoint.sh: line 33: 22 Killed python launch.py $*` This is identical to what I get...

What might be happening is that Python sees the full amount of system RAM, tries to use more than is allowed, and then crashes when it hits the memory limit....

@mrbrdo Unless you have disabled it, swap should be on by default. I'd say set max memory to 32 GB on your system. Fooocus wants to use whatever is available, and...

The lack of Ollama (or even LiteLLM) integration is a real detriment, especially since it is already in competitors like Flowise and n8n. I am not sure how @kaovilai was...