shaltielshmid

Results: 7 issues by shaltielshmid

**Is your feature request related to a problem? Please describe.** I'm requesting this feature after trying to use the GPT2-style tokenizer I trained using HuggingFace in my .NET code. I...

enhancement
Deep Learning

I saw Flash Attention on the TODO list, so I wanted to bring to your attention the announcement [here](https://github.com/dotnet/TorchSharp/discussions/1231). Two packages were announced there: 1) Loading model weights saved using...

### Model description

This model was released by Mistral [here](https://mistral.ai/news/mistral-nemo/), and is available on HuggingFace [here](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407). The model is meant to be a drop-in replacement for Mistral-7B, but requires some...

new model

Removed in huggingface#187. The parameter was added in order to allow passing the config programmatically, without having to save it to a file first.

I see that `stop_reason` is only available with `VendorAnthropicChatResult`. Would it be possible to include it by default? It's included by default in vLLM: https://github.com/vllm-project/vllm/blob/main/vllm/outputs.py#L47-L48 It would be super helpful...

I just started a new project and installed the latest LlmTornado.Toolkit library, and saw that the ToolkitChat class has been removed: https://github.com/lofcz/LlmTornado/blob/master/src/LlmTornado.Toolkit/Memory/ToolkitChat.cs#L13 Curious to know why it was removed :)

I couldn't figure out how to do this with LlmTornado: I want to add multiple OpenAI-compatible endpoint providers to the same object. It seems to me like it won't...