
Support for Dual Nvidia 3060 Setup in OLMOCR

mrpoet2 opened this issue Mar 25 '25 · 5 comments

🚀 The feature, motivation and pitch

I normally use a dual Nvidia 3060 setup for AI, and it works great for LLMs that require 20GB of VRAM. The workload is split across my GPUs, and everything runs smoothly. I also use a third GPU (not Nvidia) for my displays. Ollama handles this automatically for my LLMs.

However, when running olmOCR, I encounter the following error:

ERROR:olmocr.check:Torch was not able to find a GPU with at least 20 GB of RAM.

Is there a way to make olmOCR utilize both of my GPUs? Additionally, would a single 3090 (24 GB) work for olmOCR? I'm considering getting one if my dual 3060 setup isn't viable.
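
The error suggests the check inspects each visible device individually, so two 12 GB cards don't satisfy the 20 GB minimum even though they total 24 GB. A minimal diagnostic sketch using standard PyTorch calls (not olmOCR's own code) to see what Torch reports per device:

```python
import torch

# List every CUDA device PyTorch can see and its total VRAM.
# On a dual 3060 setup this should print two devices with ~12 GB each,
# which would explain why a per-device 20 GB check fails.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
```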

Alternatives

No response

Additional context

No response

mrpoet2 · Mar 25 '25 14:03

Please add support for dual graphics cards, and for AMD cards too. 20 GB of VRAM is more than the average card on the market has.

wcwong22000 · Mar 30 '25 07:03

Yes, a single RTX 3090 works for olmOCR, using about 20 GB of its 24 GB of VRAM.
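
If you want to verify the headroom yourself, a quick sketch using PyTorch's standard memory query (assuming the 3090 is device 0):

```python
import torch

# torch.cuda.mem_get_info returns (free, total) VRAM in bytes for a device.
free, total = torch.cuda.mem_get_info(0)
print(f"free: {free / 1024**3:.1f} GiB / total: {total / 1024**3:.1f} GiB")
```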

mrharrison007 · Apr 01 '25 16:04

A single 3090 will be much better than dual 3060s.

NOTE46 · Apr 21 '25 16:04

Just add support for dual GPUs, period. I have dual NVIDIA GeForce RTX 4060 Ti cards totaling 32 GB of VRAM and get ERROR:olmocr.check:Torch was not able to find a GPU with at least 20 GB of RAM.
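
For what it's worth, inference engines do support splitting a model's weights across GPUs via tensor parallelism. A sketch of what that looks like with vLLM directly; the tensor_parallel_size parameter is vLLM's own API, but whether olmOCR's pipeline exposes it is not confirmed here, and the model name is assumed to be the published olmOCR checkpoint on Hugging Face:

```python
from vllm import LLM

# Tensor parallelism shards the model across both GPUs, so two 16 GB
# cards can jointly host a model that needs ~20 GB on a single device.
llm = LLM(
    model="allenai/olmOCR-7B-0225-preview",  # assumed checkpoint name
    tensor_parallel_size=2,  # one shard per 4060 Ti
)
```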

Slodl · May 31 '25 01:05

It's unrealistic to get a 3090; most of them are well known to have been run 24/7 by miners and burnt out. All 30-series cards have been abused by miners. Buyer beware.

wcwong22000 · May 31 '25 03:05

I've run three 3090s for 1.5 years now since buying them used, and I've had no issues. They're an excellent choice for affordable AI. If you have $3-4K for a 5090 and can find one, buy it. Otherwise, the 3090 is the smart, affordable choice.

mrharrison007 · May 31 '25 13:05