Qualzz
This should be disabled by default, not an "opt-out" option that isn't mentioned anywhere in the main README of the repo. And doing it discreetly in a commit...
> The other issue #37 has a workaround: go back a version. Previous versions don't have LlamaCpp support :'(
> Yes, I know about the Matter integration, but it comes without power consumption :-( That's why I asked for integration with your solution... Did you find a workaround...
> This could also be related: [ggerganov/llama.cpp#7449](https://github.com/ggerganov/llama.cpp/pull/7449) Merged into master!
Bumping this as it's almost a freebie 😸
I'm confused, is this possible already? In the notebook it's written `" # Choose ANY! eg teknium/OpenHermes-2.5-Mistral-7B "`
@EricLBuehler How would that scenario work in practice? I have a backend that, upon receiving a request, makes a call to MistralRS. If I have two users making requests within...
This could be combined with onlyIncludeTags. For example: `text A` `text B` `text C` If I use onlyIncludeTags the result will be `Text A Text B Text C`, which is...
Is there anything new about this? Or is it not planned?
Confirming the issue on my side too with the latest update. Nvidia (4090), AMD Ryzen 7950X