Support for Dual NVIDIA 3060 Setup in olmOCR
🚀 The feature, motivation and pitch
I normally use a dual NVIDIA 3060 setup for AI, and it works great for LLMs that require 20 GB of VRAM. The workload is split across my GPUs, and everything runs smoothly. I also use a third GPU (not NVIDIA) for my displays. Ollama handles this automatically for my LLMs.
However, when running olmOCR, I encounter the following error:

`ERROR:olmocr.check:Torch was not able to find a GPU with at least 20 GB of RAM.`
Is there a way to make olmOCR utilize both of my GPUs? Additionally, would a single 3090 (24 GB) work for olmOCR? I'm considering getting one if my dual 3060 setup isn't viable.
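For context on why the check fails: PyTorch reports memory per device, so two 12 GB cards appear as two separate devices rather than one 24 GB pool. A minimal sketch (assuming `olmocr.check` inspects each device individually, as the error message suggests):

```python
import torch

# Enumerate CUDA devices the way a per-device VRAM check would see them.
# Two 12 GB 3060s show up as two separate ~12 GB devices, not one 24 GB
# pool, so a "single GPU with >= 20 GB" requirement fails on this setup.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    total_gib = props.total_memory / 1024**3
    print(f"GPU {i}: {props.name}, {total_gib:.1f} GiB total")
```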
Alternatives
No response
Additional context
No response
Please add support for dual graphics cards, and for AMD cards too. Cards with 20 GB of VRAM are above average in the market.
Yes, a single RTX 3090 works for olmOCR, using about 20 GB of the 24 GB of VRAM.
A single 3090 will be much better than a dual 3060 setup.
Just add support for dual GPUs, period. I have dual NVIDIA GeForce RTX 4060 Ti cards totaling 32 GB of VRAM and get `ERROR:olmocr.check:Torch was not able to find a GPU with at least 20 GB of RAM.`
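For what it's worth, if olmOCR's inference backend supports tensor parallelism (vLLM exposes this via `tensor_parallel_size`), splitting the model across two cards could look roughly like the sketch below. This is vLLM's own API, not an existing olmOCR option, and the checkpoint name is an assumption:

```python
from vllm import LLM

# Hypothetical sketch: tensor parallelism shards the model weights across
# GPUs, so two 16 GB cards can jointly hold a model too large for either
# card alone. olmOCR does not currently expose this, hence this issue.
llm = LLM(
    model="allenai/olmOCR-7B-0225-preview",  # assumed model checkpoint
    tensor_parallel_size=2,                  # shard across both GPUs
)
```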
It's unrealistic to get a 3090; most of them are well known to have been run 24/7 by miners. All 30-series cards have been abused by miners. Buyer beware.
I've run three used 3090s for 1.5 years now and have had no issues. They're an excellent choice for affordable AI. If you have $3-4K for a 5090 and can find one, buy it. Otherwise, the 3090 is the smart, affordable choice.