Emu
Multi-GPU support
Can we support multi-GPU inference? The model is too large to fit on a single consumer GPU; even an RTX 4090 does not have enough memory. I think two 4090s would be enough.
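One possible approach, sketched below with plain PyTorch model parallelism: place the first half of the layers on one GPU and the second half on the other, moving activations across the device boundary in `forward`. This is only an illustration with a toy two-layer network, not Emu's actual architecture or loading code; the snippet falls back to CPU when two GPUs are not available so it runs anywhere.

```python
import torch
import torch.nn as nn

# Pick two devices if two GPUs exist; otherwise fall back to CPU
# so the sketch still runs on a machine without GPUs.
if torch.cuda.device_count() >= 2:
    dev0, dev1 = torch.device("cuda:0"), torch.device("cuda:1")
else:
    dev0 = dev1 = torch.device("cpu")

class ShardedNet(nn.Module):
    """Toy stand-in for a large model split across two devices."""
    def __init__(self):
        super().__init__()
        # First shard lives on dev0, second shard on dev1.
        self.part1 = nn.Sequential(nn.Linear(16, 32), nn.ReLU()).to(dev0)
        self.part2 = nn.Linear(32, 4).to(dev1)

    def forward(self, x):
        x = self.part1(x.to(dev0))
        # Move activations to the second device at the shard boundary.
        return self.part2(x.to(dev1))

model = ShardedNet()
out = model(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 4])
```

If the checkpoint loads through Hugging Face `transformers`, a simpler route may be `from_pretrained(..., device_map="auto")` with `accelerate` installed, which shards the weights across available GPUs automatically.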