karthink
@g-simmons, @safijari Are you still facing this issue?
@g-simmons gentle ping -- please let me know if you're still facing this issue.
@eastbowl: I updated the llama.cpp instructions in the README. Basically you have to specify that no key is to be used for Llama (`:key nil`) when defining the backend. Please...
```elisp
(use-package gptel
  :if (is-mbp-p)
  :config
  (setq-default gptel-backend
                (gptel-make-openai       ;Not a typo, same API as OpenAI
                 "llama-cpp"             ;Any name
                 :stream t               ;Stream responses
                 :protocol "http"
                 :host "127.0.0.1:8080"  ;Llama.cpp server location,...
```
Hmm, I'm not able to reproduce this behavior. It works correctly once I set `:key` to `nil`.
While I try to figure this out, you could try using a dummy key: `:key "something"`. It doesn't matter since it's not used by Llama.
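For reference, the dummy-key workaround might look like this in a backend definition. This is a sketch, not a tested fix: the backend name, host, and key string are placeholders, and only `gptel-make-openai`'s documented keywords are used.

```elisp
;; Workaround sketch: give the llama.cpp backend a throwaway key so
;; gptel doesn't prompt for one. llama.cpp ignores the key entirely;
;; the name "llama-cpp", the host, and the key string are placeholders.
(setq-default gptel-backend
              (gptel-make-openai "llama-cpp"  ;same API shape as OpenAI
                :key "something"              ;unused, but silences the prompt
                :stream t
                :protocol "http"
                :host "127.0.0.1:8080"))      ;wherever your server listens
```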
Okay, should actually be fixed now. Please update, and don't hesitate to reopen this topic if the issue persists. I'm known to declare things fixed before they're tested!
Again, your configuration looks fine. You can delete the `:key nil` line now, BTW.

> I hate to say it, but it's still happening.

No worries, we'll figure it out....
On that note, I hope this issue isn't keeping you from using Llama. (You can enter anything (or nothing at all) when it asks you for an API key and...
> I can't use Llama with your package, if that's what you mean. After I enter a key (anything) if I submit a prompt, it will be rejected as being...