karthink
Closing this issue since, with the benefit of hindsight, there are many uses for gptel where it is desirable to have more than one request active in a buffer.
The LLM is sending this string:

`````markdown
```javascript
some code
```
`````

So gptel shows `#+begin_src javascript`. This is working as intended. If you anticipate this problem, you can use...
I need to make the markdown -> org pipeline user-customizable first. Then gptel can provide some utility functions for mini-fixes like this without having to resort to the post-response hook,...
This is fantastic -- and quite a small change too! Thanks for the PR, this looks quite promising. I'm not familiar with the OpenAI function-calling API. I'll take a look...
> This PR would be nice to have for my use case. Anything I can do to push it along? Function calling is planned to be part of the features...
I've now implemented function calling - details in #514. Looking for testers!
Closing this PR as function calling is now available in gptel for all supported models.
If the Perplexity API supports it, sure. I don't see an option for this in the [chat completions API](https://docs.perplexity.ai/reference/post_chat_completions). Kagi includes the links as part of the responses, for example,...
Yup, that makes sense. I don't have a Perplexity account; will you be able to test some code if I provide it here?
Okay, run the following code after adjusting the `:models` and `:key` fields:

```emacs-lisp
(defvar gptel--perplexity
  (gptel-make-openai "Perplexity"
    :host "api.perplexity.ai"
    :key "your-api-key"
    :endpoint "/chat/completions"
    :stream t
    :header (lambda () (when-let ...
```
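For readers following along, here is a minimal sketch of what a complete `gptel-make-openai` backend definition of this shape typically looks like. The `:header` body, the key lookup via `gptel--get-api-key`, and the `:models` entry are illustrative assumptions, not the exact code from the truncated comment above:

```emacs-lisp
;; A hypothetical, untested completion -- adjust :key and :models
;; for your account. The header lambda and model name are guesses.
(defvar gptel--perplexity
  (gptel-make-openai "Perplexity"
    :host "api.perplexity.ai"
    :key "your-api-key"                 ;string, or a function returning the key
    :endpoint "/chat/completions"
    :stream t
    :header (lambda ()
              ;; Send the key as a standard Bearer token
              (when-let ((key (gptel--get-api-key)))
                `(("Authorization" . ,(concat "Bearer " key)))))
    :models '("sonar")))                ;assumed model name
```

The `:header` function is called at request time, so the key is read fresh on each request rather than captured once at definition time.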