smoskal

1 comment by smoskal

Just updated from pip and I am getting the same issue. The fix is to use `llama_cpp.llama_model_default_params()` for the model parameters:

```python
self.lparams = llama_cpp.llama_context_default_params()
self.mparams = llama_cpp.llama_model_default_params()
self.model = llama_cpp.llama_load_model_from_file(model_path.encode('utf-8'), self.mparams)
self.ctx = llama_cpp.llama_new_context_with_model(self.model, self.lparams)
```
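
For context, a minimal self-contained sketch of that loading sequence might look like the following. The model path, error handling, and cleanup calls are illustrative additions, not part of the original comment; the key point is that model parameters and context parameters are now separate structs created by `llama_model_default_params()` and `llama_context_default_params()`.

```python
# Minimal sketch of the low-level llama_cpp loading sequence described above.
# The model path below is a placeholder; adjust parameters to your setup.
import llama_cpp

model_path = "/path/to/model.gguf"  # placeholder path (assumption)

# Model and context parameters are split into two separate default structs.
mparams = llama_cpp.llama_model_default_params()
lparams = llama_cpp.llama_context_default_params()

# Load the model weights, then create a context on top of the loaded model.
model = llama_cpp.llama_load_model_from_file(model_path.encode("utf-8"), mparams)
if not model:
    raise RuntimeError(f"failed to load model from {model_path}")

ctx = llama_cpp.llama_new_context_with_model(model, lparams)
if not ctx:
    llama_cpp.llama_free_model(model)
    raise RuntimeError("failed to create llama context")

# ... run inference with ctx here ...

# Free the context before the model it was created from.
llama_cpp.llama_free(ctx)
llama_cpp.llama_free_model(model)
```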