
function bounded_image_blend_with_mask is missing from WAS_Node_Suite.py

Open aqgasa opened this issue 2 years ago • 2 comments

File: WAS_Node_Suite.py

"class WAS_Bounded_Image_Blend_With_Mask" (line 11396) is missing the function "bounded_image_blend_with_mask".

Instead the class has a function "bounded_image_crop_with_mask" (line 11418)
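
For context, ComfyUI executes a node by calling the method named in the class's FUNCTION attribute, so this mismatch surfaces as an AttributeError the moment the node runs. A minimal sketch of the failure mode (the class body below is illustrative only, not copied from WAS_Node_Suite.py):

class WAS_Bounded_Image_Blend_With_Mask:
    FUNCTION = "bounded_image_blend_with_mask"  # method name ComfyUI will look up

    def bounded_image_crop_with_mask(self, *args, **kwargs):  # only method actually defined
        pass

node = WAS_Bounded_Image_Blend_With_Mask()
getattr(node, node.FUNCTION)  # AttributeError: no attribute 'bounded_image_blend_with_mask'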

aqgasa avatar Oct 13 '23 14:10 aqgasa

I was testing those classes in another custom node, trying to fix them, so they may not work. I did patch the name, though.

WASasquatch avatar Oct 13 '23 16:10 WASasquatch

I replaced the function with this earlier version, which seems to work (at least in the workflows I've tried):

def bounded_image_blend_with_mask(self, target, target_mask, target_bounds, source, blend_factor, feathering):
    # Convert PyTorch tensors to PIL images
    target_pil = Image.fromarray((target.squeeze(0).cpu().numpy() * 255).clip(0, 255).astype(np.uint8))
    target_mask_pil = Image.fromarray((target_mask.cpu().numpy() * 255).astype(np.uint8), mode='L')
    source_pil = Image.fromarray((source.squeeze(0).cpu().numpy() * 255).astype(np.uint8))
    
    # Extract the target bounds
    rmin, rmax, cmin, cmax = target_bounds
    
    # Create a blank image with the same size and mode as the target
    source_positioned = Image.new(target_pil.mode, target_pil.size)
    
    # Paste the source image onto the blank image using the target bounds
    source_positioned.paste(source_pil, (cmin, rmin))
    
    # Create a blend mask using the target mask and blend factor
    blend_mask = target_mask_pil.point(lambda p: p * blend_factor).convert('L')
    
    # Apply feathering (Gaussian blur) to the blend mask if feathering is greater than 0
    if feathering > 0:
        blend_mask = blend_mask.filter(ImageFilter.GaussianBlur(radius=feathering))

    # Blend the source and target images using the blend mask
    result = Image.composite(source_positioned, target_pil, blend_mask)
    
    # Convert the result back to a PyTorch tensor
    result_tensor = torch.from_numpy(np.array(result).astype(np.float32) / 255).unsqueeze(0)
    
    return (result_tensor,)
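
If anyone wants to sanity-check the patched method outside a full workflow, a quick standalone call like the one below works for me. The shapes and dummy data are assumptions about the node's expected inputs (ComfyUI IMAGE tensors are [batch, H, W, C] floats in 0..1); the method itself relies on the PIL, numpy, and torch imports already at the top of WAS_Node_Suite.py:

import torch

# Dummy 64x64 target, a 32x32 source, and a mask over the region to blend
target = torch.rand(1, 64, 64, 3)
source = torch.rand(1, 32, 32, 3)
target_mask = torch.zeros(64, 64)
target_mask[16:48, 16:48] = 1.0
target_bounds = (16, 48, 16, 48)  # rmin, rmax, cmin, cmax

node = WAS_Bounded_Image_Blend_With_Mask()  # assumes no constructor args, as is standard for ComfyUI nodes
(result,) = node.bounded_image_blend_with_mask(
    target, target_mask, target_bounds, source, blend_factor=1.0, feathering=4
)
print(result.shape)  # torch.Size([1, 64, 64, 3])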

aqgasa avatar Oct 16 '23 12:10 aqgasa