Elias Hogstvedt
torch: 2.1.0.dev20230304+rocm5.4.2
XFX Radeon RX 6900 XT MERC 319 Black LTD, 16368 MiB VRAM

```
euler a, 20 samples:

--opt-sub-quad-attention:
1.5 512: 4.00 sec, 2976 MiB
2.1 512: 3.55 sec, ...
```
> > So it's a lot faster!
> >
> > But uses a lot more VRAM when resolution is increased compared to opt-sub-quad-attention.
> >
> > Why are you using opt-sub-quad-attention?...
If you have an AMD card you can't use xformers, and full precision will just run out of memory when doing 768x768, even though I have 16 GB of VRAM.
I can't find any usage of ATTN_PRECISION in the code at the commit hash mentioned above. Their latest commit (c12d960d1ee4f9134c2516862ef991ec52d3f59e) does have some code related to it, though. However, even after...
> Xformers is becoming mandatory from 2.1 onwards, I believe they said (or the no-half). They may change that but even my 1060 can do fp32 albeit the 6 gig...
I have the same issue and was looking for a way to do this in code. (I searched for `fallback` and found this PR.)
I updated and applied the changes again in order to compile:

https://gist.github.com/CapsAdmin/ce5971c7e7ecf4828af935acf840bceb

runtests.sh attempts to run test.lua, which I assume should be runtests.lua. There are errors in the telescope test as...
The idea with webaudio and the other Lua scripts in the same folder is that they're supposed to be independent from pac. You could just check if pac exists in its...
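Something like the following would serve as that check; this is a minimal sketch, assuming pac3 registers a global `pac` table and that `pac.Message` is its logging helper (treat both names as assumptions rather than confirmed API):

```lua
-- Minimal sketch: feature-detect pac3 so the script also runs standalone.
-- Assumes pac3 registers a global `pac` table when it is installed.
local has_pac = pac ~= nil

local function log(...)
    if has_pac and pac.Message then
        pac.Message(...) -- route through pac3's logger when present (assumed name)
    else
        print(...)       -- plain fallback when running independently
    end
end

log("webaudio loaded; pac3 available:", has_pac)
```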
Feel free to change the format too; it's not like anything relies on it.
pac might be rewriting the link to make sure it works:

https://github.com/CapsAdmin/pac3/blob/e00e16c30452b37f7a0146ce4cb6ff0ab4777043/lua/pac3/core/shared/http.lua#L68-L74

However, passing the first link through doesn't seem to modify it when testing the code locally with Lua. So...
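That kind of local test can be done in plain Lua without the game running; here is a minimal sketch of exercising a gsub-based rewrite, where the Dropbox pattern is an illustrative stand-in for whatever rule http.lua actually applies:

```lua
-- Standalone sketch of a gsub-based link rewrite, runnable in plain Lua.
-- The Dropbox pattern below is an illustrative assumption, not pac3's exact rule.
local function rewrite_link(url)
    -- turn a Dropbox share page into its direct-download host
    return (url:gsub("^https?://www%.dropbox%.com/", "https://dl.dropboxusercontent.com/"))
end

print(rewrite_link("https://www.dropbox.com/s/abc123/sound.ogg"))
--> https://dl.dropboxusercontent.com/s/abc123/sound.ogg
```

Feeding the real link in and printing the before/after strings is a quick way to confirm whether the rewrite step is what changes it.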