DeepSeek
https://github.com/deepseek-ai/DeepSeek-V3
Are there plans to integrate DeepSeek into the LLPhant source?
From what I can see, you can already use the deepseek-r1 model via Ollama; the only difference from other models is that the reply includes the chain of thought between <think></think> tags.
Ideally Ollama would be updated to return the "think" section in a separate part of the message, but in the meantime LLPhant could start refactoring the "generate*Chat" methods so they can return more than just the main text response.
For now, we can just remove it from the response manually. Here's what I've done:
private function createMessage(string $response): Message
{
    $message = $response;
    $thoughts = '';
    $startOfThoughts = '<think>' . PHP_EOL;
    if (mb_strpos($message, $startOfThoughts) === 0) {
        $endOfThoughts = PHP_EOL . '</think>' . PHP_EOL;
        $parts = explode(
            $endOfThoughts,
            mb_substr($message, mb_strlen($startOfThoughts)),
            2
        );
        // Guard against a missing closing tag: explode() would then
        // return a single element and the destructuring would fail.
        if (count($parts) === 2) {
            [$thoughts, $message] = $parts;
        }
    }
    // I'm using a custom class to store the message, you can just return an array with the two texts
    return Message::createMessage(trim($message), ChatRole::Assistant, $thoughts);
}
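As a side note, if the exact newline placement around the tags turns out to vary between models, a regex-based variant is a bit more tolerant. This is just a sketch with a hypothetical helper name, not part of LLPhant:

```php
<?php

// Hypothetical helper: split a model reply into the chain of thought and
// the final message. Tolerates arbitrary whitespace around the tags.
function splitThoughts(string $response): array
{
    if (preg_match('/^<think>\s*(.*?)\s*<\/think>\s*(.*)$/s', $response, $m) === 1) {
        return ['thoughts' => $m[1], 'message' => trim($m[2])];
    }
    // No leading <think> block: treat the whole reply as the message.
    return ['thoughts' => '', 'message' => trim($response)];
}
```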
@killrawr the API seems to be compatible with OpenAI's. Do you want to make a pull request?
Something like: https://github.com/LLPhant/LLPhant/blob/main/src/Chat/MistralAIChat.php
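Since the DeepSeek API advertises OpenAI compatibility, it might even work today by pointing LLPhant's OpenAI chat class at DeepSeek's endpoint. This is an untested sketch: I'm assuming OpenAIConfig exposes a url property for overriding the base URI (check the current LLPhant source before relying on it), and the model name follows DeepSeek's docs:

```php
<?php

use LLPhant\Chat\OpenAIChat;
use LLPhant\OpenAIConfig;

$config = new OpenAIConfig();
$config->apiKey = getenv('DEEPSEEK_API_KEY');
// Assumption: OpenAIConfig lets you override the base URL this way.
$config->url = 'https://api.deepseek.com';
$config->model = 'deepseek-chat';

$chat = new OpenAIChat($config);
echo $chat->generateText('Hello!');
```

If that works, a dedicated DeepSeekChat class along the lines of MistralAIChat would mostly be a matter of defaults plus the <think> handling discussed above.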