diffusionbee-stable-diffusion-ui
Inpainting gives identical result images to input
Version 1.3.1 (0010): Every inpainting run returns an image that is identical to the original input.
After several failures using both default settings and adjusted settings, I tried a baseline test using part of the Wikipedia page example for "inpainting." I painted over the damaged areas of the photos, then ran inpainting with batch size 4 and seed 1.
After running for quite some time, all resulting images are simply identical copies of the original input image -- including the originally damaged sections in the example, which could not reasonably be inferred from the surrounding content. DiffusionBee doesn't seem to be doing inpainting at all.

I have the same issue, no matter what photos I try on an M1 Max. Someone else with an M1 Max had the same issue. What kind of computer are you running?
MacBook Pro (14-inch, 2021), Apple M1 Pro, MacOS 12.2.1 (21D62)
I've discovered that (I think) the area actually recorded as masked is only about 1 pixel under the mouse, while the brush appears to be something like 10 pixels (or whatever the actual value is) on screen. So you have to go back and forth over the area many times to make sure it is completely covered; otherwise the model is able to mostly reconstruct the image, and even more so if the original image was something it generated (because it can simply arrive at that result again).
The fix would be to make sure the painted region matches the size of the brush. I'm not sure that's what is happening in your case.
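For illustration, here's a minimal sketch of what I mean, assuming a canvas-style mask painter (the function and parameter names here are hypothetical, not DiffusionBee's actual code): each sampled mouse position should be stamped into the mask image at the full on-screen brush radius, not as a single pixel.

```python
from PIL import Image, ImageDraw

# Hypothetical sketch: record each mouse position into the mask
# using the full on-screen brush radius, not a 1-pixel dot.
def paint_mask_stroke(mask: Image.Image, points, brush_radius: int) -> None:
    """Paint filled circles of the brush radius at every sampled
    mouse position along the stroke (255 = region to inpaint)."""
    draw = ImageDraw.Draw(mask)
    for x, y in points:
        draw.ellipse(
            (x - brush_radius, y - brush_radius,
             x + brush_radius, y + brush_radius),
            fill=255,
        )

# Example: a 512x512 mask with one stroke over the damaged region
mask = Image.new("L", (512, 512), 0)  # 0 = keep original pixels
paint_mask_stroke(mask, [(100, 200), (110, 200), (120, 200)], brush_radius=10)
```

If the recorded stroke is thinner than the displayed brush, most of the "masked" area is actually still unmasked, which would explain why the output looks identical to the input.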
Looks like your theory is right. I did a huge paint stroke to make sure the feet in my input were fully covered, and it worked.