
Out of Memory Issue in Semantic Segmentation

Open code4indo opened this issue 1 year ago • 2 comments

Why is it that when working on semantic segmentation, I constantly encounter out of memory errors, even though I have two GPUs with 15GB each? Is it possible to distribute the model workload across the GPUs in parallel?

code4indo avatar Apr 17 '23 06:04 code4indo

SAM itself is not heavy, but semantic segment anything requires four large models, which consume a lot of memory. For now, simply set --semantic_segment_device to 'cpu' to run. We are working on making this model lightweight.

FingerRec avatar Apr 17 '23 06:04 FingerRec
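For the original question about splitting the workload across two GPUs: since the pipeline loads several independent models, one simple approach is to assign each model to a device in round-robin fashion. This is only a sketch of the idea, not the repo's actual API; the model names and the `assign_devices` helper are hypothetical.

```python
def assign_devices(models, gpus):
    """Round-robin assignment of models to GPUs; falls back to CPU
    when no GPUs are available. Returns a model-name -> device map
    that could then be passed to each model's .to(device) call."""
    if not gpus:
        return {m: "cpu" for m in models}
    return {m: gpus[i % len(gpus)] for i, m in enumerate(models)}

# Hypothetical model list for illustration only.
placement = assign_devices(
    ["sam", "captioner", "segmenter", "text_refiner"],
    ["cuda:0", "cuda:1"],
)
```

With two 15 GB GPUs this would put two models on each device, halving the per-GPU memory pressure, though a real split would also need to move intermediate tensors between devices.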

Hi, we have implemented a light version.

It can run on an 8 GB GPU in under 20 s.

FingerRec avatar Apr 17 '23 11:04 FingerRec