
Error in inference.py when multiple GPUs are available. [BUG]

ZeenSong opened this issue 7 months ago · 4 comments

My server has 8 GPUs. When I run

python inference.py

it loads all the models, but as soon as I provide an image and a question it raises:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cuda:0! (when checking argument for argument weight in method wrapper_CUDA__native_layer_norm)

But when I restrict it to a single GPU with

CUDA_VISIBLE_DEVICES=0 python inference.py

it works fine.

Does this script only work with a single GPU?

ZeenSong · Jul 02 '24 10:07