
Results 2 comments of eugeneie

Confirming that the following works:

```python
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q2_K.gguf",
    model_type="mistral",
    gpu_layers=0)
```

But the same doesn't work for Mixtral-8x7B-v0.1-GGUF files, i.e., this fails: ...

Re. trying to set `model_type` to 'llama': same error:

```
RuntimeError: Failed to create LLM 'llama' from '../models--TheBloke--Mixtral-8x7B-v0.1-GGUF/blobs/27e3909257480e313a79ff63a1168df5ac7016917add8ad56b5dc489f9215f13'.
```

I checked the `LLM` class and understood that what actually matters is...