Ovis
Is it possible to run inference with Ovis 1.6 on a single 4090 GPU?
Could anyone please advise whether it is possible to run inference with Ovis 1.6 on a single 4090 GPU? After loading the model, it appears to consume approximately 20 GB of VRAM. When I attempted inference, the demo exited due to insufficient memory. Are there any solutions to this issue?
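For reference, a minimal sketch of the kind of loading call this involves, assuming the Hugging Face checkpoint and standard `transformers` usage (the model ID and arguments below are assumptions, not taken from the post):

```python
# Hypothetical sketch: loading Ovis 1.6 in bfloat16 so the weights fit in roughly
# 20 GB, leaving some headroom on a 24 GB RTX 4090. Not the official recipe.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "AIDC-AI/Ovis1.6-Gemma2-9B",  # assumed checkpoint name
    torch_dtype=torch.bfloat16,   # half-precision weights instead of fp32
    device_map="auto",            # let accelerate place layers on the single GPU
    trust_remote_code=True,       # Ovis ships custom modeling code
)
model.eval()

# During generation, keeping max_new_tokens and the visual context small limits
# the KV cache, which is what typically pushes a ~20 GB model past the 24 GB
# limit on top of the loaded weights.
```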