chatdocs
Can't load big models across multiple GPUs
Is it possible to run models that require 48 GB of VRAM by combining two 24 GB GPUs? I've tried playing around with the chatdocs.yml file to no avail.
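For context, this is roughly what I've been experimenting with. Note that the `device_map` key here is my own guess, not a documented chatdocs option: transformers itself can shard a model across GPUs with `device_map: auto`, but I don't know whether chatdocs forwards such a setting to `from_pretrained`, and the model name is just a placeholder:

```yaml
# Sketch only, under unverified assumptions:
# - `device_map` is NOT a documented chatdocs.yml key; it would only work
#   if chatdocs passes it through to transformers' from_pretrained.
# - The model name below is a placeholder, not a specific recommendation.
llm: huggingface
huggingface:
  model: some-org/some-30b-model  # hypothetical ~48 GB model
  device_map: auto                # assumption: shard layers across both GPUs
```

If chatdocs doesn't expose this, is there another supported way to get multi-GPU loading?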