
BGE-VL: unsupported operand type(s) for //: 'int' and 'NoneType'

Open · Labmem009 opened this issue 9 months ago • 1 comment

When I use the demo of BGE-VL, without any change:

```python
import torch
from transformers import AutoModel
from PIL import Image

MODEL_NAME = "./BGE-VL-MLLM-S2"

model = AutoModel.from_pretrained(MODEL_NAME, trust_remote_code=True)
model.eval()
model.cuda()

with torch.no_grad():
    model.set_processor(MODEL_NAME)

    query_inputs = model.data_process(
        text="Make the background dark, as if the camera has taken the photo at night",
        images="./assets/cir_query.png",
        q_or_c="q",
        task_instruction="Retrieve the target image that best meets the combined criteria by using both the provided image and the image retrieval instructions: "
    )

    candidate_inputs = model.data_process(
        images=["./assets/cir_candi_1.png", "./assets/cir_candi_2.png"],
        q_or_c="c",
    )

    query_embs = model(**query_inputs, output_hidden_states=True)[:, -1, :]
    candi_embs = model(**candidate_inputs, output_hidden_states=True)[:, -1, :]

    query_embs = torch.nn.functional.normalize(query_embs, dim=-1)
    candi_embs = torch.nn.functional.normalize(candi_embs, dim=-1)

    scores = torch.matmul(query_embs, candi_embs.T)

    print(scores)
```

It outputs:

```
Traceback (most recent call last):
  File "/ssd4/demo/bgevl.py", line 14, in <module>
    query_inputs = model.data_process(
  File "/ssd4/.cache/huggingface/modules/transformers_modules/BGE-VL-MLLM-S2/modeling_llavanext_for_embedding.py", line 312, in data_process
    inputs = self.processor(images=images, text=text_input, return_tensors="pt", padding=True)
  File "/ssd4/anaconda3/lib/python3.10/site-packages/transformers/models/llava_next/processing_llava_next.py", line 162, in __call__
    num_image_tokens = self._get_number_of_features(orig_height, orig_width, height, width)
  File "/ssd4/anaconda3/lib/python3.10/site-packages/transformers/models/llava_next/processing_llava_next.py", line 181, in _get_number_of_features
    patches_height = height // self.patch_size
TypeError: unsupported operand type(s) for //: 'int' and 'NoneType'
```

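A quick way to confirm that the processor really ends up with no `patch_size` (a minimal check, assuming `AutoProcessor` resolves the same LLaVA-NeXT processing config that `model.set_processor` loads):

```python
from transformers import AutoProcessor

# If this prints None, the division in _get_number_of_features
# (patches_height = height // self.patch_size) has to fail.
proc = AutoProcessor.from_pretrained("./BGE-VL-MLLM-S2", trust_remote_code=True)
print(type(proc).__name__, proc.patch_size)
```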
Labmem009 avatar Mar 20 '25 06:03 Labmem009

Downgrading with `pip install transformers==4.47.0` fixes this.
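
That matches the traceback: the failing line is `patches_height = height // self.patch_size`, so the `LlavaNextProcessor` used by the checkpoint's remote code has no `patch_size` set, and transformers releases after 4.47 no longer fall back to the legacy image-token handling when it is missing, while 4.47.0 still does. If downgrading is not an option, setting the missing attributes right after `model.set_processor(MODEL_NAME)` may also work. This is an untested sketch: `model.processor` comes from the traceback, but the fallback values (14-pixel patches and "default" feature selection, as in the usual LLaVA-NeXT setup) are assumptions that should be checked against the checkpoint's `config.json`:

```python
# Untested alternative to downgrading: fill in the attributes that newer
# LlavaNextProcessor versions need to count image tokens.
model.set_processor(MODEL_NAME)

vision_cfg = getattr(model.config, "vision_config", None)
# patch_size: read from the config if present, otherwise assume 14 (CLIP ViT-L/14).
model.processor.patch_size = getattr(vision_cfg, "patch_size", 14)
# vision_feature_select_strategy: read from the config if present, otherwise assume "default".
model.processor.vision_feature_select_strategy = getattr(
    model.config, "vision_feature_select_strategy", "default"
)
```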

wlll123456 avatar Jun 03 '25 07:06 wlll123456