Mario Tuta


Following the calculation from https://www.substratus.ai/blog/calculating-gpu-memory-for-llm/ you would need:

- 94.2 GB for the 4-bit model
- 188.4 GB for the 8-bit model

I've just stumbled upon this article from VMware...
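
As a minimal sketch of the estimate in the linked substratus.ai post (memory = parameters × 4 bytes ÷ (32 / quantization bits) × 1.2 overhead), assuming a parameter count of roughly 157B back-calculated from the figures above:

```python
def gpu_memory_gb(params_billion: float, quant_bits: int, overhead: float = 1.2) -> float:
    """Estimate GPU memory (GB) needed to serve an LLM.

    M = (P * 4 bytes) / (32 / Q) * overhead,
    where P is the parameter count in billions and Q the quantization bit width.
    """
    bytes_per_param = 4 / (32 / quant_bits)  # e.g. 0.5 bytes at 4-bit, 1 byte at 8-bit
    return params_billion * bytes_per_param * overhead


# ~157B parameters is an assumption, implied by the 94.2 / 188.4 GB figures above.
params = 157
print(f"4-bit: {gpu_memory_gb(params, 4):.1f} GB")   # ~94.2 GB
print(f"8-bit: {gpu_memory_gb(params, 8):.1f} GB")   # ~188.4 GB
```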