stable-diffusion-webui-depthmap-script
[Feature Request] Add PatchFusion Depth Estimation
Would be cool to have this added: https://github.com/zhyever/PatchFusion
I've been running it locally on its own. Would be nice to have it incorporated into this extension. https://huggingface.co/spaces/zhyever/PatchFusion
However, a couple of modifications may be needed to get it to work, the same ones I had to make. They are described here: https://github.com/Fannovel16/comfyui_controlnet_aux/issues/2#issuecomment-1763579485
To fix the error, in /comfyui_controlnet_aux/src/controlnet_aux/zoe/zoedepth/models/base_models/midas.py, replace line 176 with
return nn.functional.interpolate(x, (int(height), int(width)), mode='bilinear', align_corners=True)
In /comfyui_controlnet_aux/src/custom_midas_repo/midas/backbones/beit.py, replace line 47 with
new_sub_table = F.interpolate(old_sub_table, size=(int(new_height), int(new_width)), mode="bilinear")
The line numbers and the paths to the two files were different on my install, but the correct paths showed up in the error messages. Once I found this solution, it was simple to implement. Both edits apply the same int() cast, sketched below.
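For reference, both edits are the same fix: the computed height/width values can arrive as tensors or floats, while torch.nn.functional.interpolate expects plain Python ints in its size argument. A minimal sketch of the pattern, with made-up tensor sizes rather than PatchFusion's real ones:

import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 384, 384)                            # stand-in feature map
height, width = torch.tensor(512.0), torch.tensor(512.0)   # sizes sometimes arrive as tensors/floats

# Casting with int() before the call is exactly what both patched lines do;
# passing the raw tensors/floats is what triggers the type error in the linked issue.
y = F.interpolate(x, (int(height), int(width)), mode='bilinear', align_corners=True)
print(y.shape)  # torch.Size([1, 3, 512, 512])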
I too hope somebody can implement this :)
Do you have an estimate of your local memory usage? I was kind of hoping they would make some memory optimisations before we implemented it. I'm thinking about doing some cloud GPU stuff next week, so I might have some time to implement it.
I'm running it right now; Task Manager is showing Python using 1.7GB of RAM.
Can that be right while the algorithm is running? I'm pretty sure it should be at least 10GB.
It BRIEFLY got up to 2.4GB, but as you can see from this screenshot, my GPU is maxed out while the system RAM is at 1.7GB.
Oh no, that's the RAM; I meant the VRAM. I'll just try with a big GPU.
Sorry, stupid question I know, but how can I tell? I've got MSI Afterburner installed, but I'm not sure I can use it for this since there's no graphics API window. My GPU is a 4090 with 24GB of VRAM.
OK, the first image is without it running: my GPU VRAM is at 11GB. With it running it goes up to just over 15GB, and briefly at the end it jumps to just over 20GB. So you were correct; it seems to use around 10GB of VRAM.
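If you want a number that isolates the model rather than total GPU usage, PyTorch's built-in CUDA memory counters can report the peak allocation from inside the same Python process. A rough sketch, assuming PatchFusion is running on CUDA in that process:

import torch

torch.cuda.reset_peak_memory_stats()

# ... run the PatchFusion inference here ...

peak_gib = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak VRAM allocated by PyTorch: {peak_gib:.2f} GiB")

This only counts PyTorch's own allocations, so it will read lower than Task Manager or MSI Afterburner, which include everything else on the card.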
Nice thank you!
No problem! Would be very nice to be able to have this in this extension :)
I have the start of it here: https://github.com/graemeniedermayer/stable-diffusion-webui-normalmap-script/tree/patchfusion but there are a surprising number of compatibility issues.