FastSAM
Problem with GPU setting
I can only run the inference code on cuda:0; setting the device to another GPU via the "device" parameter has no effect. My code is as follows:
results = model(
    input,
    device="cuda:1",
    retina_masks=True,
    iou=iou_threshold,
    conf=conf_threshold,
    imgsz=input_size,
)
Hi, you can pass the GPU index instead, e.g. device="1", or a list of indices such as device="0,1,2,3".
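For reference, a minimal sketch of that fix applied to the snippet above, with the GPU index given as a plain string. The checkpoint path, image path, and the iou/conf/imgsz values are placeholders, and the setup assumes the FastSAM class from this repository:

from fastsam import FastSAM

# Placeholder checkpoint and image paths; substitute your own.
model = FastSAM("FastSAM-x.pt")

results = model(
    "images/example.jpg",
    device="1",          # GPU index as a string (cuda:1); use "0,1,2,3" for multiple GPUs
    retina_masks=True,
    iou=0.9,             # example IoU threshold
    conf=0.4,            # example confidence threshold
    imgsz=1024,          # example input size
)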
Yeah, it works! Thank you!
I'm having the same problem, and the solution suggested here doesn't work for me. How should I specify the device?