
The following `model_kwargs` are not used by the model: ['encoder_hidden_states', 'encoder_attention_mask']

osi1880vr opened this issue · 4 comments

I keep getting this error. I migrated from the original BLIP to the new repo here, but the error keeps following me. I removed all existing models so fresh copies would be downloaded, to rule out corrupted weights, but that did not help. I am running code very close to your example and it still fails. The funny thing is that it worked a few weeks ago and now keeps failing, so it might be an issue with my machine, but I cannot find it. Could you point me in the right direction?

This is what I am trying:

import torch
from PIL import Image
from lavis.processors import load_processor
# load_model_cache and generate_caption are the helpers from the LAVIS demo app

raw_img = Image.open("00000-0-1.png").convert("RGB")

# setup device to use
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# image preprocessor used for evaluation, matching the demo's 384px input size
vis_processor = load_processor("blip_image_eval").build(image_size=384)

model_type = "BLIP_large"

if model_type.startswith("BLIP"):
    blip_type = model_type.split("_")[1].lower()
    model = load_model_cache(
        "blip_caption",
        model_type=f"{blip_type}_coco",
        is_eval=True,
        device=device,
    )

use_beam = False  # tried True as well, same result
img = vis_processor(raw_img).unsqueeze(0).to(device)
captions = generate_caption(
    model=model, image=img, use_nucleus_sampling=not use_beam
)
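For comparison, here is the same flow through LAVIS's documented load_model_and_preprocess entry point, in case the demo-app helpers are part of the problem (a minimal sketch; the "large_coco" model_type is my assumption, mapped from BLIP_large the same way as above):

import torch
from PIL import Image
from lavis.models import load_model_and_preprocess

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
raw_img = Image.open("00000-0-1.png").convert("RGB")

# loads the BLIP captioning model together with its matching preprocessors
model, vis_processors, _ = load_model_and_preprocess(
    name="blip_caption", model_type="large_coco", is_eval=True, device=device
)
img = vis_processors["eval"](raw_img).unsqueeze(0).to(device)
captions = model.generate({"image": img}, use_nucleus_sampling=True)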

osi1880vr · Dec 16, 2022

I am having a similar error. Did you ever find a solution?

oeatekha · Jan 14, 2023

@dxli94 could you take a look at this issue?

LiJunnan1992 · Jan 16, 2023

Hi @osi1880vr , @oeatekha ,

This happens with specific versions of HF transformers. With transformers >=4.15.0,<4.22.0 or >=4.25.0, the issue does not occur.

If you have to stick with another version, please see the explanation and fixes outlined here: https://github.com/huggingface/transformers/issues/19290#issuecomment-1321247755
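For reference, a quick way to check whether your installed version falls in one of the unaffected ranges listed above (a minimal sketch using the packaging library):

from packaging import version
import transformers

v = version.parse(transformers.__version__)
unaffected = (
    version.parse("4.15.0") <= v < version.parse("4.22.0")
    or v >= version.parse("4.25.0")
)
print(f"transformers {v}: {'unaffected' if unaffected else 'affected'}")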

dxli94 · Feb 10, 2023

I'm on transformers 4.25.1 now and the issue is gone. Thanks a lot for the help!

osi1880vr · Feb 12, 2023