Amal
Hey, I'm getting the same error for llama2:70b when trying to run it on a g5.2xlarge (32 GB RAM, 24 GB A10 GPU, 80 GB storage with 20 GB available). RAM and GPU didn't seem...
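As a rough sanity check (an illustrative sketch, not the reporter's actual diagnosis): a 70B-parameter model's weights alone, even at 4-bit quantization, already exceed both a 24 GB GPU and 32 GB of system RAM, before counting the KV cache and runtime overhead.

```python
# Rough memory estimate for loading an LLM's weights.
# Illustrative only; real runtimes add KV-cache and framework overhead.
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Return approximate weight memory in GiB for a model."""
    return n_params * bytes_per_param / 1024**3

# 70B parameters at ~0.5 bytes/param (4-bit quantization):
print(round(weight_memory_gb(70e9, 0.5), 1))  # ~32.6 GiB, above 24 GB VRAM
```

This is why 70B-class models generally need either a much larger GPU or heavy CPU offloading on this instance type.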
I agree on the latency part, but security is still a concern for certain deployments, so I was wondering how to enable a TURN server so that the...
Yes, BFloat16 is not supported on MPS. Adding more details about the error below:

```
python3 generate/base.py --prompt "Hello, my name is" --checkpoint_dir checkpoints/tiiuae/falcon-7b
Loading model 'checkpoints/tiiuae/falcon-7b/lit_model.pth' with {'block_size': 2048, 'vocab_size':...
```
Thanks, that worked