AbyszOne

74 comments by AbyszOne

> Thanks for your suggestions! But I'm a little confused about 3. ControlNet Bypass; could you share some examples to help me understand how to properly “blend with”? As we know, img2img organically...

> Is point 3 about some sort of feature recreated with the technique in this video: https://www.youtube.com/watch?v=_xHC3bT5GBU ? No. In this video a generated image is used, which is...

> I think a second pass with the lighting technique would be very welcome and would still require functionality for batching img2img and controlnet separately. This, imo, is blending light...

> Looks like point 3 is somehow implemented in Composer (#359), but it also requires some modification to img2imgalt.py. Thanks for the update. Very interesting paper. I have been researching...

Here is a full frame, not cherry-picked. SD 1.5. Promptless. ![hombremanoscabeza_000080](https://user-images.githubusercontent.com/112580728/221347153-57381c40-df65-4d41-8e4f-d5f49d9a8177.jpeg) ![04040-3463456346-](https://user-images.githubusercontent.com/112580728/221347190-534120ac-8b65-41cf-a3e8-0601ada174b4.jpg)

I don't know if I understood correctly, but the idea is that this reconstruction with img2alt becomes an alternative output in ControlNet and interacts with img2img in the same way....

If the question was whether CN influences that image, the answer is yes. A concurrent CN pass with img2alt has an effect at denoising 1, but not in ways worth mentioning....
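
To make the blending idea concrete, here is a minimal, purely illustrative sketch: a pixel-space weighted mix of the img2alt reconstruction with the CN-guided frame. A real "organic" mix would have to happen in latent space during sampling, and the file names here are placeholders, not outputs from this thread.

```python
# Pixel-space blend of an img2alt reconstruction with a ControlNet-guided frame.
# Illustration only; the file names are placeholders.
from PIL import Image

def blend_frames(recon_path: str, cn_path: str, alpha: float = 0.4) -> Image.Image:
    """Weighted blend; alpha is the weight given to the reconstruction."""
    recon = Image.open(recon_path).convert("RGB")
    cn_out = Image.open(cn_path).convert("RGB").resize(recon.size)
    return Image.blend(cn_out, recon, alpha)  # cn_out*(1-alpha) + recon*alpha

if __name__ == "__main__":
    blend_frames("img2alt_reconstruction.png", "controlnet_output.png").save("blended.png")
```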

Some quick tests with custom models show even better results, including hard tasks like avatar skin shapes. If this can really be organically mixed, many people will be ecstatic....

Points 1 and 2 are almost there. Quick question: is it difficult to make a control just another img2img pass, with CFG and denoising? Less powerful than img2alt, but it would add utility.
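
As a rough, hedged sketch of what "a control as just another img2img" could look like: two sequential calls to the webui's /sdapi/v1/img2img endpoint, each with its own denoising strength and CFG scale, with the second call adding a ControlNet unit through alwayson_scripts. This assumes the webui is running locally with --api and the ControlNet extension installed; the unit keys and model name below are placeholders and may differ between extension versions.

```python
# Two chained img2img passes over the webui API, the second guided by ControlNet.
# Assumes a local webui started with --api and the ControlNet extension installed.
import base64
import requests

API = "http://127.0.0.1:7860/sdapi/v1/img2img"

def encode(path: str) -> str:
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

def img2img(init_b64: str, denoising: float, cfg: float, use_cn: bool = False) -> str:
    payload = {
        "init_images": [init_b64],
        "denoising_strength": denoising,
        "cfg_scale": cfg,
        "prompt": "",
    }
    if use_cn:
        # ControlNet unit reusing the pass input as its control map.
        # Keys follow the extension's API payload; names and values are placeholders.
        payload["alwayson_scripts"] = {
            "controlnet": {
                "args": [{
                    "input_image": init_b64,
                    "module": "canny",
                    "model": "control_sd15_canny",
                    "weight": 1.0,
                }]
            }
        }
    r = requests.post(API, json=payload, timeout=600)
    r.raise_for_status()
    return r.json()["images"][0]

# First pass: plain img2img. Second pass: the same frame again, but with its own
# denoising/CFG and a ControlNet unit, which is roughly the idea discussed above.
frame = encode("input_frame.png")
first = img2img(frame, denoising=0.35, cfg=7.0)
second = img2img(first, denoising=0.5, cfg=7.0, use_cn=True)
```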