gptel
Ability to re-use the fabric patterns as system prompts for gptel
At the moment, gptel's system prompts are limited to those specified in the `gptel-directives` alist, or to the crowd-sourced prompts available by pressing the spacebar.
**Describe the solution you'd like**
I use the fabric patterns at the CLI, and it would be great to be able to use the same patterns from gptel, in a similar way to how the crowdsourced patterns can be used.
**Describe alternatives you've considered**
I can add the patterns I use most regularly to `gptel-directives`, but that means double entry, at least for the patterns I create myself.
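For reference, the double entry described above amounts to copying a pattern's system.md text into `gptel-directives` by hand. A minimal sketch (the pattern name and prompt text here are placeholders, not real pattern content):

```emacs-lisp
;; Manually mirroring a fabric pattern in `gptel-directives'.
;; Both the key and the prompt string are placeholders.
(add-to-list 'gptel-directives
             '(summarize . "You are an expert content summarizer. ..."))
```

Every pattern added this way has to be kept in sync with its fabric counterpart manually, which is the pain point this issue is about.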
We can replace the crowdsourced prompts option with the fabric patterns, as I think the crowdsourced prompts aren't getting much use. A PR for this is welcome.
> We can replace the crowdsourced prompts option with the fabric patterns, as I think the crowdsourced prompts aren't getting much use.
Here are some notes in case someone (including myself) wants to give this a try.
According to this, the descriptions of the "patterns" are available as nice JSON (pattern_descriptions.json), and the first 500 words of each pattern are also collected in a single JSON file (pattern_extracts.json). I'm not sure how the words are counted; it does not seem to match `count-words` in Emacs or `wc -l`. There is also no indication of whether an extract is truncated, so it's not entirely clear how to detect when the whole markdown file needs to be downloaded.
Using fabric from the CLI, all of the markdown files are downloaded and stored in a local directory. I'm not sure what the JSON descriptors are used for, but to use a pattern, the whole markdown file is sent as the system message.
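A rough sketch of what fetching one of those JSON files from Emacs could look like, modeled on how gptel fetches the crowdsourced prompts. The URL here is an assumption about where the file lives in the fabric repo and may need adjusting:

```emacs-lisp
;; Hypothetical sketch: fetch and parse pattern_descriptions.json.
;; The raw.githubusercontent.com path is an assumption.
(require 'url)
(require 'url-http)

(defun my/fabric-pattern-descriptions ()
  "Return fabric pattern descriptions parsed from JSON, as an alist."
  (let ((url (concat "https://raw.githubusercontent.com/"
                     "danielmiessler/fabric/main/pattern_descriptions.json")))
    (with-current-buffer (url-retrieve-synchronously url t)
      ;; Skip the HTTP headers before handing the buffer to the parser.
      (goto-char url-http-end-of-headers)
      (prog1 (json-parse-buffer :object-type 'alist)
        (kill-buffer)))))
```

The same approach would work for pattern_extracts.json, with the truncation caveat above still unresolved.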
> We can replace the crowdsourced prompts option with the fabric patterns, as I think the crowdsourced prompts aren't getting much use. A PR for this is welcome.
I wish I had the lisp-fu to do a PR, but I'm afraid I'm more of a user than a creator. Happy to do testing and give feedback for anyone else who feels up for the challenge, though.
The prompts, referred to as "patterns", appear to be intriguing and potentially more useful than the crowd-sourced options we currently utilize.
Would it be beneficial to integrate with the actual fabric program, or should we concentrate exclusively on utilizing the prompts themselves? I have not used fabric myself, but looking at the GitHub repo I don't see any major benefit for gptel, except that things like the YouTube support look somewhat interesting.
We need to determine the best strategy for obtaining the prompts. Ideally, we should avoid requiring users to install Fabric or clone the entire repository. Instead, we can download and cache the prompts directly from GitHub, similar to our current method for crowd-sourced prompts. However, the fabric prompts consist of multiple files, making the downloading and caching process a bit more complex than handling a single JSON file.
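One hedged sketch of handling the multi-file case (all names and URLs here are hypothetical) is to keep a local cache directory and only download a pattern's system.md the first time it is requested:

```emacs-lisp
;; Hypothetical sketch: lazily download and cache one pattern's prompt.
;; The base URL and cache location are assumptions.
(defvar my/fabric-cache-dir
  (expand-file-name "fabric-patterns/" user-emacs-directory)
  "Directory where downloaded fabric prompts are cached.")

(defun my/fabric-pattern-prompt (name)
  "Return the system prompt for pattern NAME, downloading it if needed."
  (let ((file (expand-file-name (concat name ".md") my/fabric-cache-dir))
        (url (format (concat "https://raw.githubusercontent.com/"
                             "danielmiessler/fabric/main/patterns/%s/system.md")
                     name)))
    (unless (file-exists-p file)
      (make-directory my/fabric-cache-dir t)
      ;; `url-copy-file' writes the response body straight to disk.
      (url-copy-file url file))
    (with-temp-buffer
      (insert-file-contents file)
      (buffer-string))))
```

This keeps network traffic to one request per pattern actually used, at the cost of needing some invalidation story when upstream patterns change.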
If nobody else steps up, I'm willing to give this a try at some point, but not promising anything.
> The prompts, referred to as "patterns", appear to be intriguing and potentially more useful than the crowd-sourced options we currently utilize.
Yes, that appears to be the case. Eventually I want to drop support for the crowd-sourced prompts, leaving some way for interested users to add it back in their configurations.
> Would it be beneficial to integrate with the actual fabric program, or should we concentrate exclusively on utilizing the prompts themselves? I have not used fabric myself, but looking at the GitHub repo I don't see any major benefit for gptel, except that things like the YouTube support look somewhat interesting.
I don't think it makes sense to integrate with the fabric CLI, which is built around composition via piping. Providing the prompts in gptel will work better in Emacs.
> We need to determine the best strategy for obtaining the prompts. Ideally, we should avoid requiring users to install Fabric or clone the entire repository. Instead, we can download and cache the prompts directly from GitHub, similar to our current method for crowd-sourced prompts. However, the fabric prompts consist of multiple files, making the downloading and caching process a bit more complex than handling a single JSON file.
Considering that requests for fabric integration are from users who already use fabric, the easiest way would be to allow users to point gptel to the directory where fabric's prompts are located. This avoids the downloading and caching issues.
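As a sketch of that directory-pointing approach (the variable and function names here are hypothetical, but the layout of one subdirectory per pattern containing a system.md is how fabric stores its prompts):

```emacs-lisp
;; Hypothetical sketch: load fabric patterns from a local checkout
;; into `gptel-directives', so they appear alongside the built-ins.
(defvar my/fabric-patterns-dir "~/.config/fabric/patterns/"
  "Directory containing fabric pattern subdirectories.")

(defun my/fabric-load-directives ()
  "Add each fabric pattern's system.md to `gptel-directives'."
  (dolist (dir (directory-files my/fabric-patterns-dir t "^[^.]"))
    (let ((system-file (expand-file-name "system.md" dir)))
      (when (file-exists-p system-file)
        (let ((name (intern (file-name-nondirectory dir)))
              (prompt (with-temp-buffer
                        (insert-file-contents system-file)
                        (buffer-string))))
          ;; Overwrite any existing entry with the same name.
          (setf (alist-get name gptel-directives) prompt))))))
```

With something like this, users who already run `fabric -U` get their patterns in gptel's directive picker with no extra downloading or caching machinery.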
After we implement this, we can think about fetching the prompts ourselves.
> If nobody else steps up, I'm willing to give this a try at some point, but not promising anything.
Cool! I've been meaning to try fabric, but I only use LLMs from inside Emacs so I haven't had much opportunity yet.
> We need to determine the best strategy for obtaining the prompts. Ideally, we should avoid requiring users to install Fabric or clone the entire repository. Instead, we can download and cache the prompts directly from GitHub, similar to our current method for crowd-sourced prompts. However, the fabric prompts consist of multiple files, making the downloading and caching process a bit more complex than handling a single JSON file. Considering that requests for fabric integration are from users who already use fabric, the easiest way would be to allow users to point gptel to the directory where fabric's prompts are located. This avoids the downloading and caching issues.
I would be happy with that: fabric has its own script for fetching and updating the prompts, accessed with `fabric -U`, which places the prompts in `~/.config/fabric/patterns/` on Mac/Linux.
I had the same use case: I wanted fabric patterns instead of the default crowd-sourced options, and ended up writing a small script to handle this: https://github.com/rajp152k/fabric-gpt.el
@karthink, rather than having to download the fabric executable, I'm sparse-cloning the fabric patterns into a directory of choice with a few git commands (the sparse clone picks up only the patterns, with minimal overhead, and a sync is then just a `git pull`). I think that functionality could be reused here.
After that I just pick up the prompts based on whether a system.md exists in the pattern subdirectories, and I have been using it successfully for the past two weeks.
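The sparse-clone idea could be sketched from Emacs roughly like this (the repo URL, destination handling, and function name are assumptions; fabric-gpt.el's actual implementation may differ):

```emacs-lisp
;; Hypothetical sketch: sparse-clone only the patterns/ tree of the
;; fabric repo, then sync with a plain pull on subsequent calls.
(defun my/fabric-sync-patterns (dest)
  "Sparse-clone or update the fabric patterns in directory DEST."
  (if (file-directory-p (expand-file-name ".git" dest))
      ;; Already cloned: a sync is just a pull.
      (call-process "git" nil nil nil "-C" dest "pull")
    ;; --filter=blob:none avoids downloading blobs outside the
    ;; sparse checkout; --sparse starts with a minimal worktree.
    (call-process "git" nil nil nil "clone" "--depth" "1"
                  "--filter=blob:none" "--sparse"
                  "https://github.com/danielmiessler/fabric" dest)
    (call-process "git" nil nil nil "-C" dest
                  "sparse-checkout" "set" "patterns")))
```

The same two-branch clone-or-pull shape works equally well as a shell script outside Emacs.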
https://www.youtube.com/watch?v=39OsH_OeWSA&t=3410s&ab_channel=%28Bit-Mage%29
I would like to help out if this seems useful.
Here's what I'm doing to integrate @rajp152k's script with gptel's crowdsourced prompts selector:
```emacs-lisp
(use-package fabric-gpt.el
  :straight (:type git :host github :repo "rajp152k/fabric-gpt.el")
  :after gptel
  :commands (fabric-gpt.el-sync-patterns)
  :autoload (fabric-gpt.el-populate-patterns)
  :init
  ;; Redefine gptel's crowdsourced-prompts loader to populate the hash
  ;; table from fabric-gpt.el's patterns instead.
  (defun gptel--crowdsourced-prompts ()
    (when (hash-table-p gptel--crowdsourced-prompts)
      (when (hash-table-empty-p gptel--crowdsourced-prompts)
        (when (not fabric-gpt.el--patterns)
          (fabric-gpt.el-sync-patterns))
        (dolist (pattern fabric-gpt.el--patterns)
          (let ((prompt (with-temp-buffer
                          (insert-file-contents
                           (format "%s%s/%s/system.md"
                                   fabric-gpt.el-root
                                   fabric-gpt.el--patterns-path
                                   pattern))
                          (buffer-string))))
            (puthash pattern prompt gptel--crowdsourced-prompts)))))
    gptel--crowdsourced-prompts)
  ;; Redefine the prompt picker to truncate each annotation so the
  ;; completion buffer stays responsive with long prompts.
  (defun gptel--read-crowdsourced-prompt ()
    "Pick a crowdsourced system prompt for gptel.
This uses the prompts in the variable
`gptel--crowdsourced-prompts', which see."
    (interactive)
    (if (not (hash-table-empty-p (gptel--crowdsourced-prompts)))
        (let ((choice
               (completing-read
                "Pick and edit prompt: "
                (lambda (str pred action)
                  (if (eq action 'metadata)
                      `(metadata
                        (affixation-function
                         .
                         (lambda (cands)
                           (mapcar
                            (lambda (c)
                              (list c ""
                                    (concat "\n"
                                            (propertize
                                             (s-truncate
                                              250
                                              (gethash c gptel--crowdsourced-prompts))
                                             'face 'completions-annotations))))
                            cands))))
                    (complete-with-action action gptel--crowdsourced-prompts str pred)))
                nil t)))
          (when-let* ((prompt (gethash choice gptel--crowdsourced-prompts)))
            (gptel--set-with-scope
             'gptel--system-message prompt gptel--set-buffer-locally)
            (gptel--edit-directive 'gptel--system-message)))
      (message "No prompts available.")))
  :config
  (setq fabric-gpt.el-root user-emacs-directory))
```
It's quite hacky: I'm redefining `gptel--crowdsourced-prompts` to read from the fabric prompts provided by fabric-gpt.el, and also redefining `gptel--read-crowdsourced-prompt` to truncate the prompts in the resulting list so the completion buffer doesn't slow to a crawl.
The output is fairly nice though - now in gptel-menu when I select a crowdsourced prompt, I get presented with a list of the fabric prompts:
Thought it would be helpful to comment here in case any of this ends up being useful for a real integration.
Sorry for taking so long to respond. First of all, @rajp152k, thank you for pointing me to your fabric-gpt.el. I started using it yesterday and it has been a good experience, though unless you know the fabric patterns really well it is a bit hard to know which one to pick from the list presented by `fabric-gpt.el-sent`.
I thought that I could use @jdormit's code to get the nice list of prompts shown in the screenshot through gptel-menu, but for some reason I am still being presented with the old crowdsourced prompts. I tried a bit of debugging, but I'm afraid I could not understand the code well enough.
If someone is able to suggest some troubleshooting steps, that would be great.