
possible to do one that can fit into 7GB vram?

sprappcom opened this issue · 2 comments

The 7B model uses more than 12 GB of RAM. Could you provide one with around 3B parameters, or a 7B variant quantized to Q4_0 GGUF or similar?

— sprappcom, Feb 04 '24 21:02

To better assist you, could you please clarify the context? For example, what's the hardware spec and the model you want to use?

In general, PowerInfer is designed to automatically offload model weights to VRAM to utilize the GPU as much as possible. If you're looking to further restrict VRAM usage, you can use the --vram-budget parameter to specify your VRAM limit. You can refer to our inference README for some examples.
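For a card with roughly 7 GiB of usable VRAM, the invocation might look like the sketch below. The binary path, model path, and prompt are placeholders; check the inference README for the exact model files and supported options on your build.

```shell
# Hypothetical invocation; adjust the binary and model paths to your setup.
./build/bin/main \
  -m ./path/to/model.powerinfer.gguf \
  -n 128 \
  -p "Once upon a time" \
  --vram-budget 7   # cap GPU memory usage at ~7 GiB, leaving headroom for the display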

— hodlen, Feb 05 '24 15:02

The speed-up isn't obvious, and the generation quality isn't ideal at this stage.

Maybe I'll wait for Mistral 7B support. Hope to see this go mainstream.

P.S.: I'm testing on a 4060 laptop with 7 GB of usable VRAM. It has 8 GB, but 1 GB seems reserved for display use.

— sprappcom, Feb 05 '24 16:02