
[All-in-one benchmark] [GPT2-large] The size of tensor a (1024) must match the size of tensor b (1025) at non-singleton dimension 3

Open Kpeacef opened this issue 1 year ago • 1 comment

Hi, I am trying to benchmark GPT2-large and hit `RuntimeError: The size of tensor a (1024) must match the size of tensor b (1025) at non-singleton dimension 3`.

The model should be able to accept up to 1024 consecutive tokens. I have tried different in/out token lengths; the largest in/out pair that works for me is 512/512.

Any input length from 640 to 2048 raises the same `RuntimeError: The size of tensor a (1024) must match the size of tensor b (1025) at non-singleton dimension 3`.
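For context, GPT-2 family models (including gpt2-large) have a fixed context window of 1024 positions (`n_positions` in the model config), and during generation the prompt tokens and generated tokens share that window. A minimal arithmetic sketch (plain Python, `pair_fits` is a hypothetical helper, not part of the all-in-one benchmark) showing which in/out pairs fit, assuming the benchmark generates the out tokens after the in tokens:

```python
# GPT-2 models have a fixed context window of 1024 positions
# (n_positions = 1024 in the Hugging Face model config).
GPT2_CONTEXT_WINDOW = 1024

def pair_fits(in_tokens: int, out_tokens: int,
              context_window: int = GPT2_CONTEXT_WINDOW) -> bool:
    """Return True if prompt + generated tokens fit in the context window.

    Hypothetical helper for illustration: assumes the benchmark appends
    `out_tokens` newly generated tokens after the `in_tokens` prompt.
    """
    return in_tokens + out_tokens <= context_window

print(pair_fits(512, 512))   # True  -> the 512/512 pair that works
print(pair_fits(640, 640))   # False -> overflows the 1024-token window
print(pair_fits(1024, 1))    # False -> total of 1025 positions
```

Under this assumption, the `1025` in the error message would correspond to the first generated token pushing the total sequence length one past the 1024-position window.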

API used: transformer_int4_fp16_gpu & optimize_model_gpu

Model used: openai-community/gpt2-large

Versions: bigdl-core-xe-21 2.6.0b20240827

Thank you.

Kpeacef avatar Aug 28 '24 08:08 Kpeacef

Hi Kpeacef, we have looked into this issue. We tried running the GPT2-large model with native transformers only, and the same error is reported for the input sizes you mentioned. So we believe this issue is not introduced by ipex-llm; it is probably due to GPT2's incompatibility with the current version of transformers.
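One plausible mechanism for the exact error shape (a sketch, not a claim about the reporter's environment): the Hugging Face GPT-2 implementation precomputes its causal mask as a fixed `(1, 1, 1024, 1024)` buffer and slices it per step with `[..., :key_length]`. Python slicing silently truncates, so once the cached key length exceeds 1024 the mask stays at 1024 while the attention weights' last dimension grows to 1025, giving a size mismatch at dimension 3. A list-based stand-in for that slicing (`mask_slice_len` is a hypothetical illustration helper, shapes only):

```python
# Simplified sketch of GPT-2's causal-mask slicing; a Python list
# stands in for the registered (1, 1, 1024, 1024) `bias` buffer.
N_POSITIONS = 1024  # size of the precomputed causal mask

def mask_slice_len(key_length: int, n_positions: int = N_POSITIONS) -> int:
    # The model slices its mask as bias[..., :key_length]; slicing
    # truncates silently, so the result can be shorter than key_length.
    mask_row = [1] * n_positions
    return len(mask_row[:key_length])

print(mask_slice_len(1024))  # 1024 -> matches the attention weights
print(mask_slice_len(1025))  # 1024 -> weights have size 1025, mismatch at dim 3
```

If that is what is happening here, the fix on the benchmark side is simply to keep in + out tokens within GPT-2's 1024-position window.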

cranechu0131 avatar Sep 02 '24 01:09 cranechu0131