
[BUG] RuntimeError when running inference on ScienceQA

YuanLiuuuuuu opened this issue on May 17, 2023 · 1 comment

When did you clone our code?

I cloned the codebase after 5/1/23.

Describe the issue

Issue:

Command:

python -m llava.eval.model_vqa_science \
    --model-name /path/to/LLaVA-13b-v0-science_qa \
    --question-file /path/to/ScienceQA/data/scienceqa/llava_test.json \
    --image-folder /path/to/ScienceQA/data/scienceqa/images/test \
    --answers-file vqa/results/ScienceQA/test_llava-13b.jsonl \
    --answer-prompter \
    --conv-mode simple

Log:

RuntimeError: cuDNN error: CUDNN_STATUS_INTERNAL_ERROR

Screenshots: [screenshot of the error attached to the original issue]

YuanLiuuuuuu — May 17, 2023

Hi, you may try reinstalling PyTorch built against CUDA 11.7, using the commands a community member provided in https://github.com/haotian-liu/LLaVA/issues/123#issuecomment-1539434115.
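For reference, a typical pip-based reinstall from the official PyTorch cu117 wheel index looks like the sketch below; the exact commands in the linked comment may differ, so treat this as an assumption rather than the prescribed fix. The last line is a quick sanity check that the installed build reports CUDA 11.7, a working cuDNN, and a visible GPU.

# Remove the existing PyTorch packages, then install the CUDA 11.7 builds
pip uninstall -y torch torchvision torchaudio
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117

# Sanity check: print the torch version, its CUDA and cuDNN versions, and GPU visibility
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.backends.cudnn.version(), torch.cuda.is_available())"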

haotian-liu — May 17, 2023