Divjyot

4 comments by Divjyot

@letien91 Unable to reproduce!

I know this works locally: `model = AutoModelForCausalLM.from_pretrained("mosaicml/mpt-7b-instruct", trust_remote_code=True)`, but I want to deploy it on an Endpoint. Is there an easy way to deploy it using the above `AutoModelForCausalLM` approach?
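For context, this is roughly the local-loading setup I mean, a minimal sketch using the standard `transformers` API (the dtype and generation settings here are just illustrative, not part of any deployment recipe):

```python
# Minimal local-loading sketch (illustrative settings, not a deployment recipe).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-7b-instruct")
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b-instruct",
    trust_remote_code=True,      # MPT ships custom modeling code in the repo
    torch_dtype=torch.bfloat16,  # assumption: lower-precision load to save memory
)

inputs = tokenizer("Write a short note about deployment.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The question is whether an Endpoint can wrap this same `AutoModelForCausalLM` + `trust_remote_code=True` path without a custom handler.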

> @llimllib ✅ Worked on Mac (2012), OS Catalina, x86_64 architecture, Intel chip
>
> 1. Clone the repo:
>
> ```
> pip uninstall pyllama
> git clone https://github.com/juncongmoo/pyllama
> pip install -e pyllama
> ```
>
> 2. `cd...

I am on a Mac M1 with a Rosetta-based kernel. I am also getting similar output; however, I can run inference on the model fine.

```
llama.cpp: can't use mmap because tensors are not...
```