
A package abstracting LLM capabilities for Emacs.

6 llm issues, sorted by most recently updated

Wondering how hard it would be to add support for the APIs used by GitHub Copilot. This would let us use a Copilot subscription with other Emacs LLM frontends. [chep/copilot-chat.el](https://github.com/chep/copilot-chat.el) does...
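
If Copilot's chat endpoint turns out to be OpenAI-compatible once a session token is in hand (roughly how copilot-chat.el treats it), a first experiment could reuse llm's OpenAI-compatible provider. This is only a sketch: the URL, token variable, and model name below are placeholders, and Copilot may require extra headers that would still call for a dedicated provider.

```elisp
;; Sketch only: assumes Copilot's chat endpoint speaks the OpenAI chat API
;; once a session token has been obtained elsewhere (e.g. the way
;; copilot-chat.el does it).  URL, token, and model name are placeholders.
(require 'llm)
(require 'llm-openai)

(defvar my/copilot-token "..."  ;; hypothetical: the exchanged session token
  "GitHub Copilot session token.")

(defvar my/copilot-provider
  (make-llm-openai-compatible
   :url "https://api.githubcopilot.com/"  ;; placeholder endpoint
   :key my/copilot-token
   :chat-model "gpt-4o"))                 ;; whatever the subscription exposes

;; Quick smoke test:
;; (llm-chat my/copilot-provider (llm-make-chat-prompt "Hello from Emacs"))
```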

gptel has recently been working on support for AWS Bedrock models (https://github.com/karthink/gptel/issues/379#issuecomment-2676029480). Would any llm maintainers be interested in porting it to this package? I don't currently have...

The new Ollama release 0.6.4 adds the API method `/api/show`, which can be useful for checking model capabilities: https://github.com/ollama/ollama/releases
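
Until that is wired into the provider itself, the endpoint is easy to poke at directly from Emacs with plz (which llm already depends on). The field names here ("model" in the request, "capabilities" in the response) are taken from the Ollama docs and release notes, so treat them as assumptions that may vary between versions.

```elisp
;; Sketch: query Ollama's /api/show for a model and inspect the result.
;; Assumes a local Ollama server on the default port.
(require 'plz)
(require 'json)

(defun my/ollama-show (model)
  "Return the parsed /api/show response for MODEL from a local Ollama server."
  (plz 'post "http://localhost:11434/api/show"
    :headers '(("Content-Type" . "application/json"))
    :body (json-encode `(("model" . ,model)))
    :as #'json-read))

;; e.g. (alist-get 'capabilities (my/ollama-show "llama3.2"))
```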

This is another feature I'd be keen to see included, and I'd be happy to help with it. I've noticed that `caching` is recorded as a capability, so I figured I'd check...
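
For reference, the capability check this is getting at would presumably be as simple as the following, with `my-provider` standing in for any configured provider object (`llm-capabilities` returns a list of symbols).

```elisp
;; Minimal sketch: see whether a provider advertises the `caching' capability.
(require 'llm)

(when (memq 'caching (llm-capabilities my-provider))  ;; `my-provider' is a placeholder
  (message "Provider advertises prompt caching"))
```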

Is there a way to do so-called legacy completions?

- https://platform.openai.com/docs/guides/text-generation/completions-api
- https://platform.openai.com/docs/api-reference/completions/create

```python
res = openrouter_client.completions.create(
    model="mistralai/mixtral-8x22b",
    prompt="""...""",
    stream=True,
    echo=False,  #: Echo back the prompt in addition to the completion
)
```

Hello! I noticed that one of the methods for the providers is `llm-count-tokens`, which currently uses a simple heuristic. I recently wrote a [port of tiktoken](https://github.com/zkry/tiktoken.el) that could add this...
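
Wiring such a port into llm would presumably mean overriding the `llm-count-tokens` generic for the relevant providers. A rough sketch follows; the tiktoken.el function names are guesses rather than its documented API, so check that package's README for the real ones.

```elisp
;; Sketch: replace the length-based heuristic with a real tokenizer for
;; OpenAI providers.  The tiktoken.el names below are hypothetical.
(require 'cl-lib)
(require 'llm-openai)
(require 'tiktoken)

(cl-defmethod llm-count-tokens ((_provider llm-openai) string)
  ;; Hypothetical tiktoken.el calls: build a cl100k encoder, then count.
  (tiktoken-count-tokens (tiktoken-cl100k-base) string))
```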