Rune Kaagaard
The main purpose is both. What the syntax looks like is pretty obvious, but you're right that it could be clearer why I've chosen the features I have. Here...
Hey, it would be GREAT if you'd like to contribute to the project! Regarding your comments, I'm not trying to be as close to PHP as possible,...
Ok, thanks for the references!
Yeah, that would be a really nice option for us danes :)
Yeah, thank you so much. I've been looking at this beautiful theme everyday for 10+ years, it's a historic emacs package! This is the magic incantation to make it work...
Hi, thanks for your thoughtful reply! Yeah, as a user I'd personally much prefer being able to switch models in one single app. I guess a nice...
I got something working here: https://github.com/runekaagaard/chatgpt-shell-plus-claude/blob/main/claude-shell.el I still haven't figured out how to get streaming responses to work, so `claude-shell-streaming` must be nil. https://docs.anthropic.com/en/api/messages-streaming.
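For reference, Anthropic's Messages streaming endpoint (docs linked above) delivers server-sent events, with the generated text arriving in `content_block_delta` events. A rough sketch of accumulating the streamed text from a couple of example SSE lines, in Python for illustration (the actual package is Emacs Lisp; event shapes are as documented, but this is a simplified parser, not the package's code):

```python
import json

# Two example SSE lines, shaped like Anthropic's Messages streaming events.
sse_lines = [
    "event: content_block_delta",
    'data: {"type": "content_block_delta", "index": 0,'
    ' "delta": {"type": "text_delta", "text": "Hi"}}',
]

text = ""
for line in sse_lines:
    # SSE payloads are on "data: " lines; event-name lines are skipped here.
    if line.startswith("data: "):
        event = json.loads(line[len("data: "):])
        if event.get("type") == "content_block_delta":
            text += event["delta"]["text"]

print(text)  # → Hi
```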
Ah didn't see your last reply, thank you for taking time to reply! Yes llm-shell would be all the common parts refactored out and then call the correct vendor specific...
Got streaming working, I think we have a Claude mode now :)  EDIT: I had to create a custom claude-shell-async-shell-command function. Hmmm. Surely some kind of LLM bridge that mirrors...
Of course MULTIPLE LLM bridges exist, I'm so stupid :)

```bash
$ pip install 'litellm[proxy]'
$ litellm --model claude-3-5-sonnet-20240620 --host localhost
INFO: Uvicorn running on http://localhost:4000 (Press CTRL+C to quit)
```
...
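With the proxy up, anything that speaks the OpenAI-compatible chat completions API can reach Claude through localhost:4000, which is exactly what makes it a drop-in bridge for chatgpt-shell. A minimal Python sketch (the `/v1/chat/completions` path and payload shape are assumed from litellm's OpenAI compatibility; the actual request is left commented out since it needs the proxy running):

```python
import json
import urllib.request

# OpenAI-compatible request body; the litellm proxy translates this
# into Anthropic's Messages API for the model started above.
payload = {
    "model": "claude-3-5-sonnet-20240620",
    "messages": [{"role": "user", "content": "Hello from Emacs!"}],
    "stream": False,  # set True for server-sent-event streaming
}

req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",  # assumed proxy endpoint
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the litellm proxy is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```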