Darío Muñoz Prudant

Results: 46 comments of Darío Muñoz Prudant

Will try! Thanks!

Thanks to you; you are doing a great job with TGI, and it is an honor to be able to contribute a little grain of sand. I have looked at the...

Yes, that part is a bit complex in use cases. My motivation was to append to the last user message in order to try not to confuse the LLM (
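The idea of appending the tool prompt to the last user message could be sketched like this (a minimal Python sketch; the function name and prompt wording are illustrative, not TGI's actual implementation):

```python
import json

def append_tools_to_last_user_message(messages, tools):
    """Append a tool-usage instruction to the last user message.

    `messages` is an OpenAI-style list of {"role", "content"} dicts and
    `tools` is a list of tool definitions. Returns a new list; the
    input messages are not mutated.
    """
    # Hypothetical instruction text; the real wording is a tuning choice.
    tools_prompt = (
        "\n\nYou have access to the following tools. "
        "Reply with a JSON object to call one:\n"
        + json.dumps(tools, indent=2)
    )
    out = [dict(m) for m in messages]
    # Walk backwards so only the *last* user message is modified.
    for m in reversed(out):
        if m["role"] == "user":
            m["content"] += tools_prompt
            break
    return out
```

Keeping the injection on the last user turn (rather than a new turn) preserves the alternating user/assistant structure many templates expect.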

PS: Additionally, my first experiment was to try adding a separate user-role message at the end of the conversation, so as not to have to modify the user's last message, but this...

Another option that I tested a few months ago is to build a wrapper for the official OpenAI client, which presets the chat template with the required stuff to...
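Such a wrapper could be sketched, hypothetically, as a thin class around any chat-completion callable (all names here are illustrative, not the official client's API):

```python
import json

class ToolPresetClient:
    """Wraps a chat-completion callable and injects a preset tool
    prompt into every request. `create_fn` stands in for the real
    client's completion method; this is a sketch, not the actual API.
    """

    def __init__(self, create_fn, tools):
        self._create = create_fn
        self._prompt = (
            "\n\nAvailable tools (reply with JSON to call one):\n"
            + json.dumps(tools)
        )

    def chat(self, messages, **kwargs):
        patched = [dict(m) for m in messages]
        # Append the preset tool prompt to the last user message.
        for m in reversed(patched):
            if m["role"] == "user":
                m["content"] += self._prompt
                break
        return self._create(messages=patched, **kwargs)
```

The caller keeps the familiar client interface while the tool preamble is applied consistently on every call.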

Great, yes, I've worked on something like that locally at the chat-template level. I'm at the gym now; later at night I can share my chat templates that have...

Continuing the refinement of the idea, let's review the key findings we have uncovered during the implementation of this feature: 1. Models are highly sensitive to the chat template, with...

PS: In my use case, what worked for me was modifying the ChatML template to support tools for LLMs not pretrained for tool-call understanding, plus appending the prompt_tools...
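A ChatML template extended with a tools block could look roughly like this (a Python sketch of the rendered output; in TGI the template itself is Jinja, and the exact system wording is an assumption):

```python
import json

def render_chatml_with_tools(messages, tools):
    """Render a ChatML-style prompt, prepending a system block that
    describes the available tools. Illustrative only; real deployments
    express this inside the model's Jinja chat template."""
    parts = []
    if tools:
        parts.append(
            "<|im_start|>system\nAvailable tools:\n"
            + json.dumps(tools)
            + "<|im_end|>"
        )
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Open the assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)
```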

I have solved the name issue with a const holding the tool name in the original JSON grammar. I'm not familiar with PRs on GitHub, but will try it in...
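The trick of pinning the tool name with a `const` in the JSON-schema grammar could be sketched like this (the exact grammar shape TGI uses may differ; the key names here are illustrative):

```python
def tool_grammar_with_const_name(tool):
    """Build a JSON-schema grammar in which the function name is fixed
    with a `const`, so constrained generation cannot emit a different
    name. `tool` is {"name": ..., "parameters": {...}} (assumed shape).
    """
    params = tool.get("parameters", {})
    props = dict(params.get("properties", {}))
    # `const` restricts the field to exactly one allowed value.
    props["name"] = {"const": tool["name"]}
    return {
        "type": "object",
        "properties": props,
        "required": ["name"] + list(params.get("required", [])),
    }
```

With the name constrained by the grammar itself, no post-hoc validation of the emitted name is needed.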

In server.rs, the extraction of the name and the tool parser looks like this:
```rust
let (tool_calls, output) = if tool_grammar.is_some() {
    // gen_text should be valid json
    let...
```
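The branch above, in Python pseudocode for clarity (a sketch of the same logic; the JSON shape of the generated call is an assumption, and the snippet is not the actual server code):

```python
import json

def extract_tool_call(gen_text, tool_grammar_active):
    """When a tool grammar was applied, gen_text should be valid JSON
    describing the call; otherwise it is plain text output."""
    if not tool_grammar_active:
        return None, gen_text
    call = json.loads(gen_text)  # gen_text should be valid json
    func = call.get("function", {})  # hypothetical shape
    return [{"name": func.get("name", ""), "arguments": func}], None
```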