Using other Ollama LLMs
I would like to see support added for using LLMs other than llama3, and for Ollama hosts other than localhost.
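Ollama's own REST API already supports both, so in principle the bot only needs to expose them. A minimal sketch of chatting with a non-default model on a remote host (the host address and model tag are assumptions; adjust them to your setup):

```python
import requests

# Host, port, and model tag below are assumptions; point them at your own setup.
OLLAMA_HOST = "http://192.168.1.50:11434"  # remote machine running `ollama serve`

payload = {
    "model": "deepseek-r1:8b",  # any model you have pulled, not just llama3
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,  # return a single JSON object instead of a stream
}
response = requests.post(f"{OLLAMA_HOST}/api/chat", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["message"]["content"])
```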
I'm using DeepSeek from a host running Ollama.
How did you do that? It does not work for me.
Can you tell me more about your setup?
The issue I see is that the <think> tags get included.
Jump onto the Discord for some help... but from my testing, DeepSeek is crap for this bot.
This is fine. DeepSeek R1 models are trained and built so that the output shows their thinking.
But that causes it to execute commands it's not supposed to.
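For anyone hitting this in the meantime, here is a minimal sketch of stripping the reasoning blocks before the reply is parsed (`strip_think` is a hypothetical helper, not part of the bot; it assumes the tags appear literally in the model output):

```python
import re

# DeepSeek R1 models emit their chain of thought inside <think>...</think>
# before the actual answer; strip those blocks so only the answer is parsed.
THINK_BLOCK = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_think(reply: str) -> str:
    """Return the model reply with any reasoning blocks removed."""
    return THINK_BLOCK.sub("", reply).strip()

print(strip_think("<think>The user asked me to...</think>!play song.mp3"))
# -> !play song.mp3
```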
In that case I recommend using regular (non-reasoning) models like DeepSeek-V3.
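If you go that route, you can first check which models your host actually has pulled before wiring one into the bot (same assumed host as in the sketch above):

```python
import requests

# /api/tags lists the models the Ollama host has pulled locally.
tags = requests.get("http://192.168.1.50:11434/api/tags", timeout=30).json()
print([m["name"] for m in tags.get("models", [])])
```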
Pull request #423 fixes this exact issue.
#423 has been merged.