
add support for Mistral using TGI / vllm / candle

Open pabl-o-ce opened this issue 1 year ago • 4 comments

Hi guys, I love your project.

I was wondering if you could add support for Mistral via TGI / vllm / candle, so it can be used through those endpoints. They also actively maintain support for new LLM architectures such as Mistral.
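For context on the endpoint route: TGI exposes a simple REST API, so an llm-chain integration would mostly be an HTTP client around its `/generate` route. A minimal sketch, assuming a hypothetical local TGI instance (the port, model id, and prompt below are illustrative, not from this thread):

```shell
# Hypothetical setup: a local TGI container serving a Mistral model, e.g.
#   docker run -p 8080:80 ghcr.io/huggingface/text-generation-inference:latest \
#     --model-id mistralai/Mistral-7B-Instruct-v0.1
payload='{"inputs": "What is Rust?", "parameters": {"max_new_tokens": 64}}'
curl -s -X POST http://localhost:8080/generate \
  -H 'Content-Type: application/json' \
  -d "$payload" \
  || echo "no TGI server running at localhost:8080"
```

vllm offers a similar OpenAI-compatible HTTP server, so the same client-side approach would cover both backends.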

pabl-o-ce avatar Oct 21 '23 15:10 pabl-o-ce

Hey sounds like a very good idea :)

If anyone wants to add this it would be a most welcome contribution.

williamhogman avatar Oct 25 '23 15:10 williamhogman

Llama+Mistral+Zephyr and GPU acceleration in only ~450 lines using candle. https://github.com/huggingface/candle/blob/main/candle-examples/examples/quantized/main.rs

If Mistral support is added with candle, it could be fairly trivial to also support Llama and Zephyr.
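The linked example boils down to a short loading-and-generation loop. A rough sketch of its structure (not self-contained: it depends on the `candle-core` and `candle-transformers` crates, the model path is hypothetical, and API details may differ between candle versions):

```rust
// Sketch only, following the structure of the linked quantized/main.rs.
use candle_core::quantized::gguf_file;
use candle_transformers::models::quantized_llama::ModelWeights;

fn main() -> anyhow::Result<()> {
    // Mistral, Llama and Zephyr all use the llama-style architecture,
    // which is why one quantized loader can cover all three.
    let path = "mistral-7b-v0.1.Q4_K_M.gguf"; // hypothetical local file
    let mut file = std::fs::File::open(path)?;
    let content = gguf_file::Content::read(&mut file)?;
    let _model = ModelWeights::from_gguf(content, &mut file)?;

    // From here the linked example tokenizes a prompt, then repeatedly
    // calls the model's forward pass and samples the next token from
    // the logits -- see main.rs for the full generation loop.
    Ok(())
}
```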

andychenbruce avatar Nov 12 '23 22:11 andychenbruce

I have some experience with Rust, although my familiarity with LLMs is somewhat limited. I can take on this challenge, as it would be my first contribution to llm-chain.

01PrathamS avatar Nov 30 '23 11:11 01PrathamS

Sounds like a great idea :)

williamhogman avatar Nov 30 '23 19:11 williamhogman